submitted 1 year ago* (last edited 1 year ago) by preasket@lemy.lol to c/showerthoughts@lemmy.world

I'm sure there are some AI peeps here. Neural networks scale with size because the number of combinations of parameter values that work for a given task grows exponentially (or even factorially, if that's a word?) with the network size. How can such a network be properly aligned when even humans, the most advanced natural neural nets, are not aligned? What can we realistically hope for?
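
To put the "factorially" bit in slightly more concrete terms (a back-of-the-envelope sketch on my part, not a rigorous claim): the hidden units within a layer of a plain MLP are interchangeable, so permuting them together with their weights gives a different parameter vector that computes exactly the same function. Every working solution therefore drags along a family of equivalent settings whose size grows factorially with layer width:

```python
# Back-of-the-envelope count of equivalent parameter settings (my own rough
# argument): permuting the hidden units of a layer, together with their
# incoming and outgoing weights, leaves an MLP's function unchanged, so
# every working solution comes with at least prod(n_l!) equivalent clones.
from math import factorial

def equivalent_settings(hidden_widths):
    """Lower bound on the number of weight permutations computing the same function."""
    count = 1
    for n in hidden_widths:
        count *= factorial(n)
    return count

print(equivalent_settings([64, 64]))  # already an astronomically large number
```

That's only a lower bound from symmetry, of course; it says nothing about how many genuinely different solutions exist.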

Here's what I mean by alignment:

  • Ability to specify a loss function that humanity wants
  • Some strict or statistical guarantees on the deviation from that loss function, as well as on potentially unaccounted-for side effects (a toy sketch of both points follows below)
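
As a toy illustration of how those two points could fit together (this is only a sketch under my own assumptions: a stand-in "reward model" for what humanity wants, plus a KL penalty that limits drift from a trusted reference model, loosely in the spirit of RLHF-style fine-tuning; none of the names below refer to a real system):

```python
# Toy "aligned" objective: maximise a stand-in human-preference reward while
# penalising divergence from a frozen reference model. The reward scores and
# reference model here are hypothetical placeholders, not real components.
import torch
import torch.nn.functional as F

def alignment_loss(policy_logits, reference_logits, token_rewards, beta=0.1):
    """Negative expected reward plus a KL penalty to a reference model.

    policy_logits:    (batch, vocab) logits of the model being trained
    reference_logits: (batch, vocab) logits of a frozen, trusted reference
    token_rewards:    (batch, vocab) scores from a stand-in preference model
    beta:             how strongly deviation from the reference is punished
    """
    p = F.softmax(policy_logits, dim=-1)
    log_p = F.log_softmax(policy_logits, dim=-1)
    log_q = F.log_softmax(reference_logits, dim=-1)

    expected_reward = (p * token_rewards).sum(dim=-1).mean()
    kl_to_reference = (p * (log_p - log_q)).sum(dim=-1).mean()

    # Lower loss = higher reward, but only while staying close to the reference.
    return -expected_reward + beta * kl_to_reference

# Minimal usage example with random tensors standing in for real models.
policy_logits = torch.randn(4, 10, requires_grad=True)
reference_logits = torch.randn(4, 10)
token_rewards = torch.randn(4, 10)
alignment_loss(policy_logits, reference_logits, token_rewards).backward()
```

The KL term is the "statistical guarantee" part in miniature: it bounds how far the tuned distribution can drift from the reference, but it says nothing about side effects that the reward model never measures, which is exactly the gap I'm worried about.
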
Brochetudo@feddit.de -1 points 1 year ago* (last edited 1 year ago)

Pal, I want some of whatever you smoked prior to writing this.

Now seriously, from the way you wrote the post, I believe that you might not have had hands-on experience with deep learning techniques and may very well have just watched a handful of videos on YouTube instead

preasket@lemy.lol 2 points 1 year ago

Well, this is showerthoughts! 🤣

Did I say something wrong?
