50 points (100.0% liked) · submitted 20 Jun 2023 by 0x815@feddit.de to c/technology@beehaw.org

Using model-generated content in training causes irreversible defects, a team of researchers says. "The tails of the original content distribution disappear," writes co-author Ross Anderson from the University of Cambridge in a blog post. "Within a few generations, text becomes garbage, as Gaussian distributions converge and may even become delta functions."

Here's the study: http://web.archive.org/web/20230614184632/https://arxiv.org/abs/2305.17493
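
To make the Gaussian remark concrete, here's a minimal sketch (my own illustration, not code from the paper): repeatedly fit a Gaussian to a finite sample drawn from the previous generation's fit. Sampling error compounds generation over generation, the fitted variance drifts toward zero, and the distribution collapses toward a delta function:

```python
import numpy as np

# Hypothetical toy setup, not the paper's experiment: each "generation"
# trains (fits mean/std) on data generated by the previous generation.
rng = np.random.default_rng(0)
mu, sigma, n = 0.0, 1.0, 20  # start from a standard normal, 20 samples/gen

for gen in range(1, 101):
    samples = rng.normal(mu, sigma, n)         # generate from current model
    mu, sigma = samples.mean(), samples.std()  # fit next model on that output
    if gen % 20 == 0:
        print(f"generation {gen:3d}: mu={mu:+.4f} sigma={sigma:.4f}")
```

With this seed, sigma falls far below its starting value of 1 within a hundred generations, and shrinking n makes the collapse faster; that's the tails disappearing.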

[-] fragmentcity@lemm.ee 6 points 1 year ago
[-] LoreleiSankTheShip@lemmy.ml 2 points 1 year ago

That was trippy af

That was really cool, thank you!

[-] Roberto@toast.ooo 1 point 1 year ago

That was interesting as duck
