submitted 1 year ago by 0x815@feddit.de to c/technology@beehaw.org

Using model-generated content in training causes irreversible defects, a team of researchers says. "The tails of the original content distribution disappear," writes co-author Ross Anderson from the University of Cambridge in a blog post. "Within a few generations, text becomes garbage, as Gaussian distributions converge and may even become delta functions."

Here's the study: http://web.archive.org/web/20230614184632/https://arxiv.org/abs/2305.17493
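The collapse mechanism the quote describes can be illustrated with a toy simulation (my own sketch, not the paper's code): fit a Gaussian to a finite sample, draw the next "generation" of data only from the fitted model, and repeat. Estimation error compounds multiplicatively, so the variance drifts toward zero and the tails vanish — the "delta function" end state. The sample size and generation count below are arbitrary choices to make the effect visible quickly.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50             # samples per generation (assumed; small n speeds up collapse)
generations = 2000 # number of fit-and-resample rounds (assumed)

data = rng.normal(0.0, 1.0, n)  # generation 0: real data, std = 1
stds = []
for _ in range(generations):
    mu, sigma = data.mean(), data.std()  # fit a Gaussian to current data
    data = rng.normal(mu, sigma, n)      # next generation trains only on model output
    stds.append(sigma)

# The fitted std follows a multiplicative random walk with negative log-drift,
# so it converges toward 0: the distribution narrows to a near-delta function.
print(f"std after generation 1: {stds[0]:.3f}, after {generations}: {stds[-1]:.3g}")
```

Running this shows the fitted standard deviation shrinking by many orders of magnitude, which is the tail-loss effect in miniature.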

[-] FaceDeer@kbin.social 10 points 1 year ago

So the only Reddit data that's really valuable to AI companies is the stuff that's already been archived, and the stuff that will be gated behind their paid API is less useful now that bots are becoming prevalent?

Good move, Reddit.

this post was submitted on 20 Jun 2023
50 points (100.0% liked)
