Poe Bot Points
Scroll the table horizontally ↔ for more info.
Bots in red link to their Poe page.
Leave a comment here for any errors or updates.
Bot | pts/msg | pts/1k tokens (text input) | pts/1k tokens (bot message) | base pts |
---|---|---|---|---|
Cartesia | 1900 | |||
ChatGPT-4o-Latest-128k | 583 | 167 | 167 | 547 |
o1 | 8000 | 500 | 500 | 7954 |
Claude-3-Haiku | 19 | 9 | 9 | 17 |
Claude-3-Haiku-200k | 200 | |||
Claude-3-Opus | 2234 | 500 | 500 | 2120 |
Claude-3-Opus-200k | 2474 | 500 | 500 | 2360 |
Claude-3-Sonnet | 360 | |||
Claude-3-Sonnet-200k | 1390 | |||
Claude-3.5-Haiku-200k | 640 | |||
GPT-4o-Mini | 15 | |||
Claude-3.5-Sonnet-200k | 424 | 100 | 100 | 401 |
Claude-3.5-Sonnet-June | 380 | |||
Claude-3.5-Sonnet-June-200k | 1800 | |||
Code-Llama-34b | 20 | |||
Command-R | 170 | |||
Command-R-Plus | 1130 | |||
DALL-E-3 | 1500 | |||
Dream-Machine | 16000 | |||
FLUX-pro | 1250 | |||
FLUX-pro-1.1 | 1000 | |||
FLUX-pro-1.1-ultra | 1500 | |||
Gemini-1.0-Pro | 20 | |||
Gemini-1.5-Flash-1M | 1700 | |||
Gemini-1.5-Flash-128k | 300 | |||
Gemini-1.5-Flash-Search | 25 | |||
Gemini-1.5-Pro-2M | 35000 | |||
Gemini-1.5-Pro-128k | 1750 | |||
Gemini-1.5-Pro-Search | 175 | |||
Gemma-2-9b-T | 35 | |||
Gemma-2-27b-T | 90 | |||
GPT-3.5-Turbo | 17 | 17 | 17 | 13 |
GPT-3.5-Turbo-16k | 55 | |||
Claude-3.5-Sonnet | 343 | 100 | 100 | 320 |
GPT-4-Classic | 2250 | |||
GPT-4-Turbo | 483 | 334 | 334 | 409 |
GPT-4o-128k | 341 | 84 | 84 | 327 |
GPT-4o-Aug | 300 | |||
GPT-4o-Aug-128k | 650 | |||
GPT-4o-Mini-128k | 75 | |||
Grok-beta-128k | 2725 | |||
Hailuo-AI | 12500 | |||
Haiper2.0 | 5000 | |||
Ideogram | 1500 | |||
Ideogram-v2 | 1900 | |||
Imagen3-Fast | 500 | |||
Kling-Pro-v1.5 | 12500 | |||
Llama-3-8B-T | 15 | |||
Llama-3-70b-Groq | 75 | |||
Llama-3-70b-Inst-FW | 75 | |||
Llama-3-70B-T | 75 | |||
Llama-3.1-8B-FW-128k | 50 | |||
Llama-3.1-8B-FW-128k | 254 | |||
Llama-3.1-8B-T-128k | 100 | |||
Llama-3.1-70B-FW-128k | 400 | |||
Llama-3.1-405B-FP16 | 2070 | |||
Llama-3.1-405B-FW-128k | 1500 | |||
Llama-3.2-11B | 115 | |||
Llama-3.2-90B-FW-131k | 475 | |||
Llama-3.3-70B | 235 | |||
Llama-3.3-70B-FP16 | 110 | |||
Llama-3.3-70B-FW | 240 | |||
Mistral-Large-2 | 1000 | |||
Mistral-Large-2-128k | 1600 | |||
Mixtral-8x7B-Chat | 20 | |||
Mixtral8x22b-Inst-FW | 120 | |||
Mochi-preview | 8000 | |||
Pika-1.0 | 3750 | |||
Playground-v3 | 200 | |||
Qwen-2.5-7B-T | 75 | |||
Qwen-2.5-72B-T | 300 | |||
Qwen-QwQ-32b-preview | 240 | |||
Qwen2.5-Coder-32B | 50 | |||
QwQ-32B-Preview-T | 320 | |||
RekaCore | 1250 | |||
RekaFlash | 40 | |||
SD3-Medium | 870 | |||
QwQ-32B-Preview | 0 | |||
Solar-Pro | 1 | |||
StableDiffusion3-2B | 250 | |||
StableDiffusion3.5-L | 1625 | |||
ChatGPT-4o-Latest | 507 | 167 | 167 | 471 |
Claude-3.5-Haiku | 95 | |||
o1-mini | 1800 | |||
o1-preview | 8125 | 500 | | 8018 |
GPT-4o | 291 | 84 | | 273 |
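Reading the columns together, a message's cost appears to be roughly the base fee plus the per-1k-token rates applied to the input and output text. That formula is an interpretation of the table, not Poe's official calculation, and the example rates and token counts below (taken from the Claude-3.5-Sonnet row plus a made-up prompt/reply length) are only illustrative:

```python
# Hedged sketch: estimate one exchange's point cost, ASSUMING
# total ~= base pts + (tokens / 1000) * per-1k-token rate for input and output.
# The formula is a reading of the table above, not Poe's documented pricing.

def estimate_points(base_pts: int, in_rate_per_1k: int, out_rate_per_1k: int,
                    input_tokens: int, output_tokens: int) -> float:
    """Rough per-message estimate under the assumed base + token-rate model."""
    return (base_pts
            + input_tokens / 1000 * in_rate_per_1k
            + output_tokens / 1000 * out_rate_per_1k)

# Example: Claude-3.5-Sonnet (base 320, 100 pts/1k tokens each way),
# with a hypothetical 150-token prompt and 400-token reply.
print(estimate_points(320, 100, 100, input_tokens=150, output_tokens=400))  # -> 375.0
```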
This page was last updated 15 January 2025 @ 5:37 pm