this post was submitted on 02 May 2025
117 points (96.8% liked)

Funny



founded 2 years ago
you are viewing a single comment's thread
[–] Maiq@lemy.lol 17 points 5 days ago (3 children)
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
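The three laws above form a strict priority hierarchy: each law only applies so long as it doesn't conflict with the ones before it. A minimal sketch of that ordering, assuming an invented `Action` record with hypothetical flags (none of these names come from Asimov), might look like:

```python
# Hypothetical sketch only: the Three Laws as a strict priority filter
# over candidate actions. The Action fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False        # would directly injure a human
    allows_harm: bool = False        # inaction that lets a human come to harm
    ordered_by_human: bool = False   # a human ordered this action
    self_destructive: bool = False   # endangers the robot itself

def permitted(a: Action) -> bool:
    # First Law: overrides everything, including direct orders.
    if a.harms_human or a.allows_harm:
        return False
    # Second Law: obey orders (any First Law conflict was already rejected).
    if a.ordered_by_human:
        return True
    # Third Law: otherwise, avoid self-destruction.
    return not a.self_destructive
```

Note that an order to harm a human is rejected before the Second Law is ever consulted, which is exactly the "except where such orders would conflict with the First Law" clause.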

[–] gressen@lemm.ee 15 points 5 days ago (1 children)

It's not doable because it will eat away the profit margins. /s

[–] 30p87@feddit.org 2 points 5 days ago
[–] xia 8 points 5 days ago

Could you imagine an artificial mind actually trying to obey these? You can't even get past #1 without being aware of the infinite number of things you could do, Cartesian-producted with all the consequential downstream effects of those actions until the end of time.

[–] Archangel1313@lemm.ee 5 points 5 days ago

No one ever explained why they had to obey those laws in the first place... only that they had to.