If the FBI owns every endpoint, why is there still CSAM? Why don't they remove all of it?
What a week! GLM-5, MiniMax-2.5, and now Qwen-3.5. Let's see.
"Anna's Archive is a non-profit project with two goals:
- Preservation: Backing up all knowledge and culture of humanity.
- Access: Making this knowledge and culture available to anyone in the world."
Thanks to everyone who contributed to this great project.
I don't think there is a mutually beneficial relationship with AI companies, and copyright's future is not bright.
The real reason is that LLMs still use the same architecture and there has been no breakthrough. At the end of the day, their intelligence will converge; when that happens, providers will have to cut prices to compete with open-weight models. And even at current prices they don't generate revenue, so instead of just scaling, they will have to focus on optimization and innovation.
Free Software is essential
Filterlist maintainers are the GOAT
Besides, clothing without logos is more expensive than clothing with them.
OpenRouter is basically the place for third-party AI stuff: tools, research, benchmarks...
Sure, it won't tell you what regular users are doing, but it shows where professionals actually spend money. And since it's pricey (seriously, once you're used to free ChatGPT or Gemini, paying hurts :D), it reveals which models are actually worth it.
General ranking (weekly) (highlighted models are open-weight)
- Kimi K2.5 - 1.45T tokens
- Gemini 3 Flash Preview - 737B tokens
- DeepSeek V3.2 - 711B tokens
- Claude Sonnet 4.5 - 678B tokens
- MiniMax M2.1 - 454B tokens
- Gemini 2.5 Flash - 449B tokens
- Grok 4.1 Fast - 421B tokens
- Trinity Large Preview - 388B tokens
- Gemini 2.5 Flash Lite - 358B tokens
- Claude Opus 4.5 - 345B tokens
- Grok Code Fast 1 - 314B tokens
- Claude Opus 4.6 - 275B tokens
- gpt-oss-120b - 266B tokens
- GPT-5 Nano - 265B tokens
- Gemini 2.0 Flash - 175B tokens
- GLM 4.7 - 171B tokens
- Gemini 3 Pro Preview - 169B tokens
- Pony Alpha (GLM-5) - 147B tokens
- GPT-5.2 - 145B tokens
- Claude Haiku 4.5 - 132B tokens
Wow, 46% of tokens are now going through open-weight models. That's amazing.
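The share above is just weighted arithmetic: tokens through open-weight models divided by total tokens. A minimal sketch, using a hypothetical subset of the chart with assumed open-weight flags (the real flags come from OpenRouter's highlighting, not from this code):

```python
# Hypothetical (model, tokens-in-billions, is_open_weight) rows,
# NOT the full weekly chart above.
usage = [
    ("Model A", 1450, True),
    ("Model B", 737, False),
    ("Model C", 711, True),
    ("Model D", 678, False),
]

# Weighted share: open-weight tokens over total tokens.
open_tokens = sum(tokens for _, tokens, is_open in usage if is_open)
total_tokens = sum(tokens for _, tokens, _ in usage)
share = 100 * open_tokens / total_tokens
print(f"{share:.0f}% of tokens via open-weight models")  # → 60% for this toy data
```

Counting models instead of tokens would give a very different number, since usage is heavily concentrated in a few models.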

Tor is used in many countries, by both individual users and governments. Part of the point of onion services is that they aren't searchable: you need the exact, password-like .onion URL to reach, for example, a login page. That makes it much harder for another country to spy on or even locate those services.
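The "password-like URL" part follows from how v3 onion addresses are constructed: the address is derived from the service's ed25519 public key, so there is nothing to crawl or enumerate. A minimal sketch of the derivation as described in Tor's rend-spec-v3 (the random key here stands in for a real service key):

```python
import base64
import hashlib
import os

def onion_v3_address(pubkey: bytes) -> str:
    """Build a v3 .onion address from a 32-byte ed25519 public key.

    Per Tor rend-spec-v3:
      address = base32(pubkey || checksum || version) + ".onion"
      checksum = SHA3-256(".onion checksum" || pubkey || version)[:2]
    """
    assert len(pubkey) == 32
    version = b"\x03"
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    return base64.b32encode(pubkey + checksum + version).decode().lower() + ".onion"

pubkey = os.urandom(32)  # stand-in for a service's real public key
print(onion_v3_address(pubkey))  # 56 base32 chars + ".onion"
```

Since the 56-character label encodes the key itself, guessing a valid address is as hard as guessing the key; clients also verify the checksum, so mistyped addresses simply fail.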