Huh (sh.itjust.works)
[-] Pyro@programming.dev 173 points 8 months ago* (last edited 8 months ago)

GPT doesn't really learn from people; the over-correction by OpenAI in the name of "safety" is what likely caused this.

[-] lugal@sopuli.xyz 67 points 8 months ago

I assumed they reduced capacity to save power due to the high demand

[-] MalReynolds@slrpnk.net 49 points 8 months ago

This. They could obviously reset to original performance (what, they don't have backups?), it's just more cost-efficient to have crappier answers. Yay, turbo AI enshittification...

[-] CommanderCloon@lemmy.ml 40 points 8 months ago

Well, they probably did power down the performance a bit, but censorship is known to nuke LLMs' performance as well.

[-] MalReynolds@slrpnk.net 11 points 8 months ago

True, but it's hard to separate, I guess.

this post was submitted on 18 Mar 2024
552 points (98.1% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



Rules

  1. Don't throw mud. Behave like an intellectual and remember the human.
  2. Keep it rooted (on topic).
  3. No spam.
  4. Infographics welcome, get schooled.

This is a science community. We use the Dawkins definition of meme.



founded 2 years ago