AI models fed AI-generated data quickly spew nonsense
(www.nature.com)
2024-11-11
No, not really. The improvement gets less noticeable as it approaches the limit, but I'd say the speed at which it improves is still the same, especially for smaller models and context window size. There are now models comparable to ChatGPT or maybe even GPT-4 (I don't remember, one or the other) with a 128k-token context window that you can run on a GPU with 16 GB of VRAM. 128k tokens is around 90k words, I think. That's more than 4 Bee Movie scripts. It can "comprehend" all of that at once.
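
For anyone curious about the back-of-the-envelope math behind that, here's a rough sketch. The ~0.75 words-per-token ratio and the ~20k-word Bee Movie script length are assumptions for illustration, not exact figures:

```python
# Rough context-window math for a 128k-token model.
# Assumptions (not exact figures): ~0.75 English words per token,
# and roughly 20k words for the Bee Movie script.

CONTEXT_TOKENS = 128_000
WORDS_PER_TOKEN = 0.75            # assumed average for English text
BEE_MOVIE_SCRIPT_WORDS = 20_000   # assumed rough script length

words_in_context = CONTEXT_TOKENS * WORDS_PER_TOKEN
scripts_in_context = words_in_context / BEE_MOVIE_SCRIPT_WORDS

print(f"~{words_in_context:,.0f} words fit in a 128k-token context")
print(f"that's about {scripts_in_context:.1f} Bee Movie scripts")
```

With those assumed numbers you get roughly 96k words, i.e. somewhere between 4 and 5 scripts, which lines up with the "around 90k words / more than 4 Bee Movie scripts" estimate above.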