this post was submitted on 27 Dec 2025
818 points (94.3% liked)
Comic Strips
you are viewing a single comment's thread
Damn, this made me laugh hard lol
When I hear about people becoming "emotionally addicted" to this stuff, which can't even pass a basic Turing test, it makes me weep a little for humanity. The standards for basic social interaction shouldn't be this low.
Humans get emotionally addicted to lots of objects that are not even animate or do not even exist outside their mind. Don't blame them.
For a while I was telling people "don't fall in love with anything that doesn't have a pulse." Which I still believe is good advice concerning AI companion apps.
But someone reminded me of that "humans will pack-bond with anything" meme, the one featuring a toaster or something like that, and I realized it was probably a futile effort and gave it up.
Yeah, telling people about what or who they can fall in love with is kind of outdated. Like racial segregation or arranged marriage.
I find affection with my bonsai plants and yeast colonies, those sure have no pulse.
I personally find AI tools tiring and disgusting, but after playing with them for some time (which wasn't a lot; I use a local deploy and the free tier of a big thing), I discovered particular conditions where appropriate application brings me genuine joy, akin to the joy of using a good saw or a chisel. I can easily imagine people might really enjoy this stuff.
The issue with LLMs is not fundamental or internal to the concept of AI itself; it lies in the economic system that created and placed them as they are now, while burning our planet and society.
You're right when it comes to finding affection. Which is precisely why my approach fell flat.
While the environmental problems and the market bubble eventually bursting are bigger issues that will harm everyone, I see the beginnings of what could be a problem of equal significance concerning the exploitation of lonely and vulnerable people for profit with AI romance/sexbot apps. I don't want to fully buy into the more sensationalist headlines surrounding AI safety without more information, but I strongly suspect that we'll see a rise in isolated persons with aggravated mental health issues due to this kind of LLM use. Not necessarily hundreds of people with full-blown psychosis, but an overall increase in self-isolation coupled with depression and other more common mental health issues.
The way social media has shaped our public discourse has shown that, like it or not, we're all vulnerable to being emotionally manipulated by electronic platforms. AI is absolutely being used in the same way, and while more tech-savvy people are likely to be less vulnerable, no one is going to be completely immune. When you consider AI-powered romance and sex apps, ask yourself: is there a better way to get under someone's skin than by simulating the most intimate relationships in the human experience?
So, old fashioned or not, I'm not going to be supportive of lonely people turning to LLMs as a substitute for romance in the near future. It's less about their individual freedoms, and more about not wanting to see them fed into the next Torment Nexus.
Edits: several words.
What are you, necrophobic?
Well, that's certainly not the direction I expected this conversation to go.
I apologize to the necro community for the hurtful and ignorant comments I've made in the past. They aren't reflective of who I am as a person and I'll strive to improve myself in the future.
Reminds me of this old ad, for lamps, I think, where someone threw out an old lamp (just a plain old lamp, not anthropomorphised in any way) and it was all alone and cold in the rain and it was very sad and then the ad was like “it's just an inanimate object, you dumb fuck, it doesn't feel anything, just stop moping and buy a new one, at [whatever company paid for the ad]”.
I don't know if it was good at getting people to buy lamps (I somehow doubt it), but it definitely demonstrated that we humans will feel empathy for the stupidest inanimate shit.
And LLMs are specifically designed to be as addictive as possible (especially for CEOs, hence their being obligate yes-men), since we're certainly not going to get attached to them for their usefulness or accuracy.
https://www.youtube.com/watch?v=dBqhIVyfsRg
The lamp ad, fwiw
Also, since there is no relevant XKCD, there has to be a relevant Community (yes, it's a law):
https://www.youtube.com/watch?v=z906aLyP5fg&t=7s
That's the one, thanks!
Also, I must note that feeling attachment to whatever is fine; guiding your professional behavior, on which live humans rely, by emotional attachment is just unprofessional. The thing is, capitalism - at least since Marx's time, because he writes about it - relies heavily on actively reducing the professional skills of all its workers; CEOs are not an exception.
...