AI hallucinations are getting worse – and they're here to stay
(www.newscientist.com)
Wonder if we're already starting to see the impact of AI being trained on AI-generated content.
Absolutely.
AI-generated content was always going to leak into the training data unless they literally stopped training as soon as it started being used to generate content, around 2022.
And once it's in, it's like cancer. There's no getting it out without completely wiping the training data and starting over. And it's a feedback loop. It will only get worse with time.
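To make the feedback loop concrete, here's a toy sketch (Python, with a fitted Gaussian standing in for the "model") of what it looks like when each generation is trained only on the previous generation's output; the setup and numbers are illustrative, not a claim about how any real training pipeline works:

```python
# Toy sketch of the feedback loop: the "model" is just a Gaussian fitted to
# its training data, and every new generation trains only on samples drawn
# from the previous generation's model. All numbers are illustrative.
import random
import statistics

random.seed(0)
N = 200  # training examples per generation (kept small on purpose)

# Generation 0 trains on genuine "human" data: samples from N(0, 1).
data = [random.gauss(0.0, 1.0) for _ in range(N)]

for gen in range(30):
    # "Train" the model: fit mean and standard deviation to the current data.
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    if gen % 5 == 0:
        print(f"gen {gen:2d}: mean={mu:+.3f}  stdev={sigma:.3f}")
    # No fresh human data ever comes in: the next generation sees only the
    # previous model's output, so estimation error accumulates instead of
    # averaging out.
    data = [random.gauss(mu, sigma) for _ in range(N)]
```

Because each generation only ever sees the previous model's samples, the errors never wash out; they compound, which is the "no getting it out" problem in miniature.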
The models could have been great, but they rushed the release and made them available too early.
If 60% of the posts on Reddit are bots, which may be a number I made up but I feel like I read that somewhere, then we can safely assume that roughly half the data these models are being trained on is now AI generated.
Rejoice friends, soon the slop will render them useless.
I can't wait for my phone's autocomplete to have better results than AI.
This is the one I have a few days ago I was going to get a subscription to the house that was made by the same place as the first time in the past week but it would have to go through him and Sue. When is the next day of the month and the next time we get the same thing as we want to do the UI will freeze while I'm on the plane. But ok, I'll let her know if you need anything else
I thought I just had a stroke. But it is not the first thing that I hear people do when I see a picture like that in a movie and it makes my brain go crazy because of that picture and it is the most accurate representation of what happened in my life that makes me think that it is a real person that has been in the past.
You fools! You absolute buffoons and I will never be in the same place as the only thing is that you are a good person and I don't know what to do with it but I can be the first to do the first one of those who have been in the same place as the other day.
The Amelia is a good idea for the kids grow up to be democratic in their opinion and the kids grow up in their hearts to see what their message more than days will happen and they have an opinion about that as well and we were allegations a bit if you want a chat
I was tiny detour downtown though with them even just because I'm still gonna be back to open another day instead of your farts in you an interactive shell without running Of on if want air passing through or not that cold ride years on that tune is original from really cold like using bottle to capitalize
Not before they render the remainder of the internet useless.
In the case of reasoning models, definitely. Reasoning datasets weren't even a thing a year ago, and from what we know about how the larger models are trained, most task-specific training data is artificial (oftentimes a small amount is human-generated and then synthetically augmented).
However, I think it's safe to assume that this has been the case for regular chat models as well; the Self-Instruct and Orca papers are quite old already.
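For context, the Self-Instruct-style pipeline those papers describe is roughly: start from a small pool of human-written tasks, prompt a model to write new ones using a few existing examples as context, filter out near-duplicates, and add the survivors back to the pool. Here's a rough sketch; `call_model` is a hypothetical stand-in (a toy word-shuffler so the script actually runs), the seed tasks are made up, and the similarity filter is far cruder than what the papers use:

```python
# Rough sketch of a Self-Instruct-style augmentation loop. `call_model` is a
# placeholder: a real pipeline would query an LLM and filter much more
# carefully.
import random

rng = random.Random(0)

def call_model(prompt: str) -> str:
    """Hypothetical stand-in for an LLM call: recombines words taken from the
    examples in the prompt so the sketch runs end to end."""
    words = [w for line in prompt.splitlines()
             if line.startswith("- ") for w in line[2:].split()]
    rng.shuffle(words)
    return " ".join(words[: max(4, len(words) // 2)])

def too_similar(a: str, b: str, threshold: float = 0.7) -> bool:
    """Crude near-duplicate check via word overlap (Jaccard similarity)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1) > threshold

def augment(seed_tasks: list[str], target_size: int, max_tries: int = 1000) -> list[str]:
    pool = list(seed_tasks)  # the human-written starting point
    for _ in range(max_tries):
        if len(pool) >= target_size:
            break
        # Show the model a few existing examples and ask for a new task.
        examples = rng.sample(pool, k=min(3, len(pool)))
        prompt = "Write a new task like these:\n" + "\n".join(f"- {e}" for e in examples)
        candidate = call_model(prompt).strip()
        # Keep the candidate only if it isn't a near-duplicate of the pool.
        if candidate and not any(too_similar(candidate, p) for p in pool):
            pool.append(candidate)
    return pool

seeds = [
    "Summarize the following article in two sentences.",
    "Translate this paragraph into French.",
    "Explain why the sky is blue to a ten year old.",
]
for task in augment(seeds, target_size=8):
    print(task)
```

The relevant point for the thread: a large share of instruction-tuning data is model-generated by design, before any contamination from scraped AI slop even enters the picture.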