submitted 1 month ago by furycd001@lemmy.ml to c/memes@lemmy.ml
[-] saigot@lemmy.ca 4 points 1 month ago

If it was done with enough regularity to be a problem, one could just put an LLM like this in between to preprocess the data.

[-] Azzu@lemm.ee 3 points 1 month ago

That doesn't work; you can't train models on another model's output without degrading the quality. At least not currently.

[-] FooBarrington@lemmy.world 1 point 1 month ago

No, that's not true. All current models use output from previous models as part of their training data. You can't solely rely on it, but that's not strictly necessary.

[-] Vashtea@sh.itjust.works 1 point 1 month ago* (last edited 1 month ago)

I don't think he was suggesting training on another model's output, just using AI to filter the training data before it is used.
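
For anyone wondering what "filtering the training data before it is used" could look like in practice, here's a minimal sketch. The `score_sample` function is a made-up placeholder, not any real API; in an actual pipeline it would be a call to whatever LLM or classifier is doing the quality / AI-likeness check.

```python
# Sketch of the "filter before training" idea: score each candidate sample
# and keep only the ones that pass, instead of training on raw scraped text.
from typing import Iterable, List


def score_sample(text: str) -> float:
    """Placeholder scorer returning a 0..1 'keep' score.

    A real implementation would ask an LLM or a trained classifier whether
    the text looks low-quality or machine-generated.
    """
    # Trivial stand-in heuristic: penalize very short or very repetitive text.
    words = text.split()
    if len(words) < 5:
        return 0.0
    return len(set(words)) / len(words)


def filter_training_data(samples: Iterable[str], threshold: float = 0.5) -> List[str]:
    """Keep only samples whose score clears the threshold."""
    return [s for s in samples if score_sample(s) >= threshold]


if __name__ == "__main__":
    raw = [
        "buy buy buy buy buy buy",  # repetitive spam, gets filtered out
        "A reasonably varied sentence with enough distinct words to keep.",
    ]
    print(filter_training_data(raw))
```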
