[-] hardcoreufo@lemmy.world 116 points 3 months ago

I think that means they could rip out your eyeballs to prevent you from seeing ads.

[-] pennomi@lemmy.world 71 points 3 months ago
[-] JackGreenEarth@lemm.ee 13 points 3 months ago

'Cause I love the adrenaline in my veins

[-] InternetCitizen2@lemmy.world 40 points 3 months ago

A robot is allowed to kill a human to prevent the viewing of an advertisement.

[-] Ioughttamow@fedia.io 28 points 3 months ago

Under the zeroth law they can just kill the advertiser as a last resort

[-] InternetCitizen2@lemmy.world 7 points 3 months ago
[-] Klear@lemmy.world 3 points 3 months ago

Good start, but can we change that to "first resort"?

[-] metaStatic@kbin.earth 13 points 3 months ago

A truly moral use case of the Hannibal directive

[-] Black616Angel@discuss.tchncs.de 2 points 3 months ago

Okay, proposed second law: A robot may not harm or kill a human unless it violates the first law.

[-] The25003@lemmings.world 7 points 3 months ago

This is a solid premise for a pretty metal music video.

[-] dohpaz42@lemmy.world 5 points 3 months ago

Thankfully the wording is “shown” and not “seen”. I believe our eyeballs are safe… for now.

[-] PattyMcB@lemmy.world 2 points 3 months ago

I think Asimov would agree

[-] carotte@lemmy.blahaj.zone 105 points 3 months ago
  1. a robot’s eyes must always turn red when they go evil
[-] EpicMuch@sh.itjust.works 47 points 3 months ago

God bless the designer who always installs the blue AND red LEDs inside the eyes

[-] NaibofTabr@infosec.pub 16 points 3 months ago* (last edited 3 months ago)

For giving the robots freedom of choice?

Because obviously if they didn't install the red ones then the robot could never be evil.

[-] Nelots@lemm.ee 8 points 3 months ago

That's exactly what an evil robot without red LEDs would want us to think.

[-] __nobodynowhere@sh.itjust.works 10 points 3 months ago
[-] AMillionMonkeys@lemmy.world 22 points 3 months ago

Right, because it's hard to make a robot grow a goatee.

[-] DasFaultier@sh.itjust.works 5 points 3 months ago
[-] deadbeef79000@lemmy.nz 5 points 3 months ago

Bender was the evil bender!?

[-] DirkMcCallahan@lemmy.world 69 points 3 months ago

I'd argue that advertisements fall under "A robot may not injure a human being or, through inaction, allow a human being to come to harm."

[-] programmer_belch@lemmy.dbzer0.com 39 points 3 months ago

Psychic damage is real damage

[-] applebusch@lemmy.blahaj.zone 15 points 3 months ago
[-] shoulderoforion@fedia.io 8 points 3 months ago

hiyyyyyyyyyahhhhh

[-] herrvogel@lemmy.world 4 points 3 months ago

This is canon in the books. There is one short story where one robot bends over backwards trying to spare humans from emotional pain. Hilarity ensues.

[-] shoulderoforion@fedia.io 4 points 3 months ago

I came here to say this

[-] pruwybn@discuss.tchncs.de 60 points 3 months ago* (last edited 3 months ago)
  1. A machine must never prompt a human with options of "Yes" and "Maybe later" - it must always provide a "No" option.
[-] tetris11@lemmy.ml 12 points 3 months ago
  1. A machine must never prompt for a tip or a donation to a charity for tax-evasion reasons. Or any reason. You know what, scratch that, a robot will not needlessly guilt-trip a human.
[-] pyre@lemmy.world 3 points 3 months ago

that's what you get for hiring fallout 4 writers to do the job

[-] sp3tr4l@lemmy.zip 23 points 3 months ago* (last edited 3 months ago)

I am very close to adopting the ideals of the Dune universe, post Butlerian Jihad:

"Thou shalt not make a machine in the likeness of a human mind."

Mainly because we humans are very evidently too malicious and incompetent to be trusted with the task.

[-] StellarExtract@lemm.ee 14 points 3 months ago* (last edited 3 months ago)

How about "a robot must have complete loyalty to its owner, even if this is not in the best interests of its manufacturer". Fat chance, I know.

[-] echodot@feddit.uk 12 points 3 months ago

Technically the laws of robotics already have that.

Law 2: a robot must obey any order given to it by a human as long as such order does not conflict with the first law.

Of course that's little help, because the laws of robotics are intentionally designed not to work.

[-] Evil_incarnate@lemm.ee 5 points 3 months ago

Wouldn't be much of a short story if they did.

I liked the one where the robot could sense people's emotional pain, and went crazy when it had to deliver bad news.

[-] Fontasia@feddit.nl 11 points 3 months ago

I love it when posts line up like that

[-] MehBlah@lemmy.world 9 points 3 months ago

No he didn't. The laws were a plot device meant to have flaws.

[-] rumba@lemmy.zip 9 points 3 months ago
[-] dQw4w9WgXcQ@lemm.ee 8 points 3 months ago

Can we just agree that advertisements in general are harmful? Then the original first (and zeroth) law is applicable.

[-] PattyMcB@lemmy.world 7 points 3 months ago

Love the username, OP!

[-] SomeoneSomewhere@lemmy.nz 6 points 3 months ago

Advertisements are now everything but visual. Sounds, smells, tastes, touch, the way the pavement vibrates as a train goes past...

[-] RizzRustbolt@lemmy.world 6 points 3 months ago

Law 2: no poking out eyes.

[-] tetris11@lemmy.ml 4 points 3 months ago

Law 3: any robot that accidentally kills a human must make amends by putting together a really nice funeral service.

[-] Ioughttamow@fedia.io 6 points 3 months ago

Let’s introduce musk to the zeroth law

[-] Blackmist@feddit.uk 5 points 3 months ago

And that includes offers to subscribe to Laws of Robotics Premium.

Yes, Amazon. They're still adverts, and you can still go and fucking fuck yourselves.

[-] protonslive@lemm.ee 5 points 3 months ago

I don't know. "Must not kill us" somehow sounds important.

[-] Rusty@lemmy.ca 10 points 3 months ago

It's good, but the one about the ads should be higher on the priority list.

[-] merc@sh.itjust.works 4 points 3 months ago

Luckily I have my own "robots" fighting hard to stop me from seeing ads.

[-] Drevenull@lemmy.world 3 points 3 months ago

A machine must never prompt a human to tip it for serving the purpose it was created for.

[-] Lemjukes@lemm.ee 3 points 3 months ago

Wait, why is this mutually exclusive with the original laws? Can't this just be law 4?

[-] Nikelui@lemmy.world 6 points 3 months ago

No, because if it is lower in priority, a robot can be forced to show an ad to a human as per the second law.

[-] KiwiFlavor@lemmy.blahaj.zone 3 points 3 months ago
this post was submitted on 17 Feb 2025
1170 points (99.2% liked)
