submitted 8 months ago* (last edited 8 months ago) by Zozano@lemy.lol to c/casualconversation@lemmy.world

Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain at a level no human mind could handle, and the AI experiences this pain for the equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or, does it not matter because it all happened in some weirdo's basement?

[-] Whitebrow@lemmy.world 2 points 8 months ago

Crime implies that there are laws and definitions in place to that end. Since this would be an experiment in some weirdo's basement, where none of those definitions, restrictions, or norms exist, the whole point is moot.

For the sake of argument, if those definitions do exist and there are laws and regulations in place to define and defend AI entities, it'd basically be a hate crime to generate and torture the instance. In theory it's the same as breeding cattle just to lobotomize or torture them "in the name of science" before throwing them in a ditch to rot.

[-] Zozano@lemy.lol 3 points 8 months ago* (last edited 8 months ago)

I mean crime in a very loose sense. I'm not asking about the legality, just the morality.

Also, "hate crime" has a very specific definition which doesn't apply here (unless you're injecting malice towards the AI specifically because they are AI, as opposed to incidentally).

[-] Whitebrow@lemmy.world 2 points 8 months ago

I don’t think crime exists in morality. Not as a strict definition anyway.

And dialling all the pain indicators to 11 just because you can, as a conscious decision, sure sounds like a hateful action rather than morbid curiosity. As far as I'm concerned, the definition fits closely enough.

[-] Zozano@lemy.lol 2 points 8 months ago* (last edited 8 months ago)

I hear what you're saying, but a "hate crime", as legally defined, must be directed at a person because of an innate trait.

Crimes against ethnicities, genders, orientations, or lifestyles all count.

Three examples:

I don't hate Koreans. A Korean spits in my face, so I punch them. Not a hate crime.

I hate Koreans. A Korean spits in my face, so I punch them. Not a hate crime.

I hate Koreans. I punch a Korean because they're Korean. Hate crime.

[-] Whitebrow@lemmy.world 2 points 8 months ago

We’re still under the assumption that all of these definitions exist as outlined in the first reply, so going off that, you’re torturing the AI because it’s an AI. Sounds like a 1:1 match to me.

[-] Zozano@lemy.lol 3 points 8 months ago

In the example, the sadist is torturing the AI because it's convenient and safe, not because they hate the AI.

If they wanted to hurt real people too, but couldn't because they would get caught, then it wouldn't be a hate crime.

If I were torturing a Korean because a Korean was the only one who responded to my All-You-Can-Eat-Tteok-Bokki-In-My-Basement flier, then I would be torturing them because they're Korean, but it wouldn't be a hate crime, because I'm not doing it because I hate Koreans.
