submitted 1 year ago* (last edited 1 year ago) by preasket@lemy.lol to c/showerthoughts@lemmy.world

I'm sure there are some AI peeps here. Neural networks scale with size because the number of combinations of parameter values that work for a given task grows exponentially (or even factorially, if that's a word?) with network size. How can such a network be properly aligned when even humans, the most advanced natural neural nets, are not aligned? What can we realistically hope for?
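One concrete source of that combinatorial growth is permutation symmetry: reordering the hidden units of a layer (together with the weights attached to them) leaves the network's function unchanged, so a single layer with n hidden units already has at least n! functionally equivalent parameter settings. Here's a minimal numpy sketch of that; the network shape and random weights are just illustrative:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)

# Toy 2-layer MLP: 3 inputs -> 4 hidden units -> 1 output.
W1 = rng.normal(size=(4, 3))
b1 = rng.normal(size=4)
W2 = rng.normal(size=(1, 4))

def mlp(x, W1, b1, W2):
    h = np.maximum(0.0, W1 @ x + b1)  # ReLU hidden layer
    return W2 @ h

x = rng.normal(size=3)

# Permute the hidden units: reorder the rows of W1/b1 and,
# consistently, the columns of W2. The function is unchanged.
perm = np.array([2, 0, 3, 1])
out_original = mlp(x, W1, b1, W2)
out_permuted = mlp(x, W1[perm], b1[perm], W2[:, perm])

assert np.allclose(out_original, out_permuted)
print(f"{factorial(4)} equivalent orderings for just 4 hidden units")
```

That's 24 equivalent settings from 4 hidden units alone, before counting any other redundancy in the weight space.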

Here's what I mean by alignment:

  • Ability to specify a loss function that humanity wants (see the sketch after this list)
  • Some strict or statistical guarantees on the deviation from that loss function, as well as on potentially unaccounted-for side effects
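To make the first bullet concrete: one toy shape for "a loss function humanity wants" is a task loss plus an explicit penalty on side effects, in the spirit of impact-regularization ideas from the AI safety literature. Everything below (the state summaries, the `lam` weight) is a hypothetical illustration, not a real alignment solution:

```python
import numpy as np

def aligned_loss(task_loss, state_before, state_after, lam=0.1):
    """Hypothetical impact-regularized objective: the task loss plus
    a penalty on how much the agent changed the world beyond the task.
    state_before/state_after stand in for some measurable summary of
    the environment; lam trades task performance against side effects.
    """
    side_effect_penalty = np.linalg.norm(state_after - state_before)
    return task_loss + lam * side_effect_penalty

# Toy usage: same task loss, but the second policy disturbed far more state.
print(aligned_loss(0.5, np.array([1.0, 2.0]), np.array([1.0, 2.1])))
print(aligned_loss(0.5, np.array([1.0, 2.0]), np.array([5.0, -3.0])))
```

The hard part the post is pointing at is, of course, that nobody knows how to write down `state_before`, `state_after`, or `lam` for the real world.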
[-] preasket@lemy.lol 4 points 1 year ago

The idea of backpropagation and neural nets is quite old, but there's significant new research being done now, primarily in node types and computational efficiency. LSTMs, transformers, ReLU: these are all comparatively recent.
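A quick illustration of why a node type like ReLU mattered: each sigmoid layer contributes a gradient factor of at most 0.25, so gradients vanish through deep stacks, while ReLU passes gradients through unchanged wherever it's active. The sketch below isolates just the activation derivatives, ignoring weights; the depth and random pre-activations are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid_grad(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)          # at most 0.25

def relu_grad(z):
    return 1.0 if z > 0 else 0.0  # exactly 0 or 1

def grad_through_stack(activation_grad, depth=20):
    """Multiply per-layer activation derivatives through a deep stack
    to see how the end-to-end gradient scales with depth."""
    g = 1.0
    for _ in range(depth):
        pre_activation = rng.normal()
        g *= activation_grad(pre_activation)
    return g

print("sigmoid stack grad:", grad_through_stack(sigmoid_grad))  # vanishes
print("relu stack grad:   ", grad_through_stack(relu_grad))     # 0 or 1
```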

[-] Zo0@feddit.de 2 points 1 year ago

Haha, reading your other replies, you're too humble for someone who knows what they're talking about.
