1-bit LLMs Could Solve AI’s Energy Demands
(spectrum.ieee.org)
But since a 1-bit model takes roughly 10% of the space (VRAM, etc.), it sounds like they could just start with a larger model and still come out ahead.
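A rough sketch of the arithmetic behind that comment (the parameter counts below are illustrative assumptions, not figures from the article): at 1.58 bits per ternary weight (as in BitNet b1.58) versus 16 bits for fp16, the weights alone come out to roughly 10% of the memory, so a model about 10x larger fits in the same VRAM budget.

```python
# Back-of-envelope check of the "10% of the space" claim: weight-memory
# footprint at fp16 vs 1.58-bit (ternary) precision. Parameter counts are
# illustrative assumptions, not taken from the article.

def weight_memory_gb(num_params: float, bits_per_weight: float) -> float:
    """Memory needed to store the weights alone, in gigabytes."""
    return num_params * bits_per_weight / 8 / 1e9

for params_b in (7, 13, 70):
    n = params_b * 1e9
    fp16 = weight_memory_gb(n, 16)
    ternary = weight_memory_gb(n, 1.58)  # log2(3) bits per ternary weight
    print(f"{params_b}B params: fp16 ~ {fp16:.1f} GB, 1.58-bit ~ {ternary:.1f} GB "
          f"({ternary / fp16:.0%} of fp16)")
```

Note this only counts the weights; activations and the KV cache are typically kept at higher precision, so the real-world savings (and the headroom for a larger model) are somewhat smaller than the 10x the raw bit ratio suggests.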