You can "simulate" life inside your brain, too.

[Alt text: this is Bob. Bob is a figment of your imagination. When you leave, Bob will leave too. "Don't leave," says Bob]
The Bob in your head is intelligent; it can communicate in English. Is it unethical to stop thinking about Bob? Was it unethical of me to show you this picture, creating a "Bob" in your head? Is any story unethical to tell?
Hmm. I also imagined a new Bob that says, "it's ok to leave, I will be ok".
There's a TV film, I don't know the title, related to this topic.
The plot was:
A group of scientists made a living simulation and went into it to apply fixes and keep it from creating a simulation of its own. One day, one of the scientists was killed and left a message in the simulation for his coworkers. The message was: "take a road and follow no direction". A guy in the simulation followed the instruction and discovered that he was in a simulation, but the message was really meant for the scientists, who were in a simulation too.
If someone can find the movie, that would be great.
In 2000, The Thirteenth Floor was nominated for the Saturn Award for Best Science Fiction Film but lost to The Matrix.
Yeah that's a pretty good sell. I'll check it out.
I liked it, but I like almost everything, so that's not much of a sell.
I'd imagine there could be an ethical way to do so through a sunset protocol, similar to the concept of the rapture (the religious kind, not the Bioshock city): freeze the simulation, move all the beings' minds to "heaven", shut down the physical-universe simulation (lowering operating costs by at least five orders of magnitude, I'd imagine), and let them enjoy the afterlife until they get tired of existing, reach nirvana, or something like that. (Rough sketch below.)
That reminds me, I should really get back into AI research.
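Purely as a toy sketch of that sunset protocol, assuming made-up Mind/Simulation types (no real API or system is implied):

```python
import random
from dataclasses import dataclass, field

# Toy sketch of the "sunset protocol": every class and number here is
# invented for illustration; nothing refers to a real system.

@dataclass
class Mind:
    name: str
    done: bool = False  # True once this mind is ready to stop existing

@dataclass
class Simulation:
    minds: list = field(default_factory=list)
    frozen: bool = False

def sunset(sim: Simulation) -> None:
    # 1) Freeze: no subjective time passes for anyone during the cutover.
    sim.frozen = True

    # 2) "Rapture": migrate every mind into a minds-only afterlife and
    #    drop the physics simulation, which is the expensive part.
    heaven = list(sim.minds)
    sim.minds.clear()

    # 3) Run the afterlife until each mind leaves on its own terms.
    #    (Minds flip a weighted coin each tick as a stand-in for choice.)
    while not all(m.done for m in heaven):
        for m in heaven:
            m.done = m.done or random.random() < 0.1

    print(f"{len(heaven)} minds enjoyed the afterlife and left voluntarily.")

sunset(Simulation(minds=[Mind("Bob"), Mind("Alice")]))
```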
Found God's account
Instead of a Dark Lord, you would have a queen, not dark but beautiful and terrible as the dawn! Tempestuous as the sea, and stronger than the foundations of the earth! All shall love me and despair!
The ethics we use today evolved out of practical ethics - that is to say, out of a need for a set of rules meant to be applied in order to dictate the conduct of humans amongst one another. Because of this, I think most ethical frames of reference are ill-suited for trying to answer this question soundly.
It seems analogous to trying to apply traditional physics to a quantum reference frame. It's outside traditional physics's wheelhouse. A different set of tools likely needs to be applied, one with a different starting paradigm.
That being said, your answer is really going to depend on what this new ethical paradigm is, which is arguably completely arbitrary in this specific case.
Only way to know would be to enter the simulation and see for yourself... wait a minute...
The fun thing about ethics is that not everyone shares the same rules. Personally, I would probably say it is. (Though is it worse than what we do to cows? Or what we do to other humans in war?) However, others may say the simulated beings aren't real, only an illusion manufactured by the simulation, so it's fine. There are other arguments I'm sure someone could make too. It's up to you to decide what your ethics are, not others. There is no universal code of ethics, just as there is no universal morality.
Ask them.
They say, "yes".
It would be unethical to start the simulation in the first place.....
Only if they're conscious of the simulation.
Username checks out.
Did you just watch “Plaything” on Black Mirror?
I was thinking the microverse battery from Rick and Morty!
If you are a human, the human ethic of not killing "alive" stuff still applies to you, no?
Thinking more about rules of ethics: if those simulated beings came up with their own morals, like "don't try calculating all digits of pi in large groups because it causes lag", those wouldn't really apply to you.
Basically, different beings have different rules of ethics, IMO, and you can't simply end the simulation, more because you are a human than anything else.
The answer could change in the exact same scenario if you were some kind of eldritch being instead of a human.
If this is a way for our simulation creator to decide to pull the plug without guilt, I guess just go ahead and do it. I was holding out hope that this was all real, but it has been getting more clear that it's not.
Couldn't they just make us all infertile and let us die naturally or something?
Just turn down the simulation speed real low and run it at one tick per 20 years, then you can technically keep it going without such great expense. The people inside won't notice the difference.
If you take the limit of that, you'll realize that people won't notice if you turn it off either.
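To make that limit concrete (a toy calculation; the intervals are arbitrary): the outside-world tick rate goes to zero as the gap between ticks grows, while the inside view, which only ever sees ticks, stays the same.

```python
# Toy illustration of the limit argument above: as the wall-clock gap
# between ticks grows, the external compute rate approaches zero, i.e.
# "paused forever" and "turned off" become the same external state.
# Inhabitants only experience ticks, so from inside nothing changes.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def ticks_per_second(years_between_ticks: float) -> float:
    return 1.0 / (years_between_ticks * SECONDS_PER_YEAR)

for years in (1.0, 20.0, 1e6, float("inf")):
    print(f"one tick per {years:g} years -> {ticks_per_second(years):.3g} ticks/s")
```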
Intelligence isn't the important factor there - consciousness is. Does it feel like something to be those entities in the simulation? If yes, then I'd argue that ending the simulation is like killing a person painlessly in their sleep.
I personally don't think ending the simulation is even the most troubling part. We could unintentionally create a simulation that's effectively a hell and then populate it with entities that have subjective experiences we don't realize exist. The only thing worse than ending a life is creating one just for it to suffer through its entire existence.
USS Callister
Didn't scientists train brain cells to exclusively play Doom? It's like their whole consciousness is stuck in a video game version of hell through a brain-in-a-vat experience.
Not really. It's not nearly enough cells to have any kind of consciousness as we know it. A few neurons learning to play a game is a far cry from tying a being into a simulation of hell.
I dunno. Some life forms have only a few brain cells. It could be their whole world for those little cells, couldn't it?
It is definitely their entire world, but the point is it takes far more than a few cells to create actual human-relatable sentience.
And that's coming from someone who fully understands that far more animals have sentience than most humans care to admit.
Those petri dishes are neither sentient nor conscious.
This is a tough question. I think to answer it you have to know whether those simulated beings have actual consciousness/sapience or whether that is just simulated.
The old question, right? Does a simulated (or rather emulated) brain actually think and feel? Or does the computer just output what it would if it were alive?
I think, therefore I am. But I can't prove that I'm self-aware and not just "pretending" to be. Because you are a human being like me, I assume that you are too. But that assumption breaks when you are not a physical organism but software running on a computer.
I mean, I get that software on a computer is something else. But why is the assumption broken?
I'd say that whether or not it's in a simulation doesn't matter. If the beings you created were recognizable as people (human or otherwise) then they have rights and you'd be trampling those rights if you ended their existence. The creation of such life should not be done without an appropriate sense of responsibility.
then they have rights
Why? I'm not trolling, I just really think it's interesting where people think "rights" come from. Some people think they come from God. Which is great, because in this scenario we are God. So anything we do is ethical because we did it.
I contend they come from States. Because I notice that rights are different in different States. And I don't think a god would obey jurisdiction.
Another way of saying this is that the beings themselves have to recognize and demand rights. Because a state is just people deciding things after all.
So where do the rights come from? Are they a legal/social construct, or inherent in the universe somehow? Some third thing I didn't think of?
People forget how scary the real world is. We are the only creatures to create this concept of rights. You think that grizzly bear cares about your rights? Got some news for you....
And shit, even we don't respect other people's right to exist.
:: gestures very very briefly to.... EVERYTHING going on right now::
You think the asteroid that ended 90+% of life on earth cared about the dinosaurs' rights?
All that being said, I wouldn't be able to pull the plug.
I wouldn't want to shut down the simulation, but it would depend on the energy expenditure. A hospital could theoretically save more of its patients if it allocated fifty million dollars to each one, but no hospital does. A person's right to life is contingent on the cost to maintain it.
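To put toy numbers on that trade-off (all figures invented), a fixed budget treats fewer patients as the cost allowed per patient rises:

```python
# Toy numbers only: how many patients a fixed budget covers at
# different per-patient spending caps.
BUDGET = 50_000_000  # dollars, invented for illustration

for cost_per_patient in (5_000, 50_000, 500_000, 50_000_000):
    treated = BUDGET // cost_per_patient
    print(f"${cost_per_patient:>10,} per patient -> {treated:,} patients treated")
```

At a fifty-million-dollar cap the same budget covers exactly one person, which is the sense in which a right to life ends up contingent on the cost of maintaining it.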
Good call, I didn't consider power consumption. I agree with you.
Depends on the AWS spend.
Anything above 0 is already entering unethical territory, so...
Somewhere in a box in your childhood home, a Tamagotchi is slowly dying...
Slowly? Those things would 'die' in under 24 hours!
Depends on who they've elected as leader.