The big AI models are running out of training data (and it turns out most of the training data was produced by fools and the intentionally obtuse), so this might mark the end of rapid model advancement

[-] frauddogg@lemmygrad.ml 55 points 5 months ago* (last edited 5 months ago)

While synthetic data is a thing, you've really gotta wonder how often you can train a model on basically empty calories before the hallucination rate starts going up.

I, for one, hope the theftbots die.
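The worry above — that models trained on their own output degrade generation over generation — is what researchers call "model collapse." A minimal sketch of the mechanism, using a Gaussian as a stand-in for the data distribution and tail truncation as a stand-in for top-p/low-temperature sampling (both are my illustrative assumptions, not anyone's actual training setup):

```python
import statistics

def fit(samples):
    # "training": estimate the data distribution (mean and stdev)
    return statistics.NormalDist(statistics.mean(samples),
                                 statistics.pstdev(samples))

def generate(model, n=1000, top_p=0.90):
    # "sampling": like top-p / low-temperature decoding, the tails of
    # the distribution are never emitted -- only the central top_p mass,
    # taken here as evenly spaced quantiles to keep the demo deterministic
    lo = (1 - top_p) / 2
    return [model.inv_cdf(lo + top_p * (i + 0.5) / n) for i in range(n)]

model = statistics.NormalDist(0.0, 1.0)   # the "real" data distribution
data = generate(model)
widths = []
for generation in range(5):
    model = fit(data)        # train on whatever data is available
    widths.append(model.stdev)
    data = generate(model)   # next generation trains only on model output

# each generation's fitted distribution is strictly narrower than the
# last: the model progressively forgets the tails of the real data
```

Because each generation only ever sees the central mass of the previous model's output, the fitted spread shrinks by a constant factor every round; rare-but-real cases vanish and never come back. The "empty calories" framing is apt: the synthetic data looks plausible while carrying less information each pass.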

[-] KnilAdlez@hexbear.net 24 points 5 months ago

I was reading an article about how ChatGPT will sometimes go on existential rants, and I figure it's probably because so much of the training data is now generated by LLMs and posted on the internet. Probably a glut of people posting "I asked ChatGPT what it was like to be a robot" and things of that nature.

[-] SacredExcrement@hexbear.net 7 points 5 months ago

Hopefully they die off before the entire net is just an all-consuming ouroboros of this LLM-generated garbage

this post was submitted on 11 Jun 2024
94 points (100.0% liked)

technology
