this post was submitted on 13 Jan 2026
587 points (97.7% liked)

Lemmy Shitpost

38582 readers
6270 users here now

Welcome to Lemmy Shitpost. Here you can shitpost to your heart's content.

Anything and everything goes. Memes, Jokes, Vents and Banter. Though we still have to comply with lemmy.world instance rules. So behave!


Rules:

1. Be Respectful


Refrain from using harmful language pertaining to a protected characteristic: e.g. race, gender, sexuality, disability or religion.

Refrain from being argumentative when responding or commenting to posts/replies. Personal attacks are not welcome here.

...


2. No Illegal Content


Content that violates the law. Any post/comment found to be in breach of common law will be removed and given to the authorities if required.

That means:

-No promoting violence/threats against any individuals

-No CSA content or Revenge Porn

-No sharing private/personal information (Doxxing)

...


3. No Spam


Posting the same content repeatedly, regardless of intent, is against the rules.

-If you have posted content, please refrain from re-posting said content within this community.

-Do not spam posts with intent to harass, annoy, bully, advertise, scam or harm this community.

-No posting Scams/Advertisements/Phishing Links/IP Grabbers

-No Bots; bots will be banned from the community.

...


4. No Porn/Explicit Content


-Do not post explicit content. Lemmy.World is not the instance for NSFW content.

-Do not post Gore or Shock Content.

...


5. No Inciting Harassment, Brigading, Doxxing or Witch Hunts


-Do not Brigade other Communities

-No calls to action against other communities/users within Lemmy or outside of Lemmy.

-No Witch Hunts against users/communities.

-No content that harasses members within or outside of the community.

...


6. NSFW should be behind NSFW tags.


-Content that is NSFW should be behind NSFW tags.

-Content that might be distressing should be kept behind NSFW tags.

...

If you see content that is a breach of the rules, please flag and report the comment and a moderator will take action where they can.


Also check out:

Partnered Communities:

1. Memes

2. Lemmy Review

3. Mildly Infuriating

4. Lemmy Be Wholesome

5. No Stupid Questions

6. You Should Know

7. Comedy Heaven

8. Credible Defense

9. Ten Forward

10. LinuxMemes (Linux themed memes)


Reach out to Striker.

All communities included on the sidebar are to be made in compliance with the instance rules.

founded 2 years ago
MODERATORS
 
top 50 comments
[–] e8d79@discuss.tchncs.de 197 points 2 months ago (1 children)

You're absolutely right -- that was a fatal dose.

[–] Earthman_Jim@lemmy.zip 57 points 2 months ago* (last edited 2 months ago) (1 children)

"You want to fucking kill yourself? What an insightful and thought provoking idea!"

[–] Ilovethebomb@sh.itjust.works 111 points 2 months ago (8 children)

Seriously?

ChatGPT is notorious for spouting bullshit, why do people still listen?

[–] FartMaster69@lemmy.dbzer0.com 114 points 2 months ago (2 children)

It tells them it knows what it’s talking about and it speaks with confidence.

Meanwhile companies and governments won’t stfu about how powerful and great this tech supposedly is, so a percentage of people will believe the propaganda.

[–] arrow74@lemmy.zip 17 points 2 months ago (4 children)

I'd love for students to be given a lesson on tricking AI into giving a false answer. It's not hard, and it should be pretty eye-opening.

[–] clif@lemmy.world 8 points 2 months ago

One example I like to use is to ask it for the lyrics of an extremely well known song. It just makes shit up based on the title you give it.

The online ones (Claude, chatgpt, copilot, etc) now refuse to do it for ""copyright reasons"" but the offline ones still happily oblige. I assume the online ones added that block because it was such an obvious way to prove they don't "know" shit.

[–] Earthman_Jim@lemmy.zip 12 points 2 months ago

FR

Trump is president, but it's just unfathomable that people would follow an automated idiot /s

[–] Technus@lemmy.zip 53 points 2 months ago (1 children)

I think some people are so eager to offload all critical thinking to the machine because they're barely capable of it themselves to begin with.

[–] criss_cross@lemmy.world 14 points 2 months ago (1 children)

Some people really think these LLMs are capable of thought and reasoning. It’s horrifying.

[–] nightlily@leminal.space 5 points 2 months ago* (last edited 2 months ago)

Have you talked to people who use LLMs regularly? They’ll acknowledge hallucinations but will downplay them as much as possible - saying they’re low frequency and they can spot them, while telling you about how they’re using it in an area they’re unfamiliar with. Dunning Kruger strikes again.

[–] SaltyIceteaMaker@lemmy.ml 70 points 2 months ago* (last edited 2 months ago) (1 children)
[–] QuinnyCoded@sh.itjust.works 26 points 2 months ago (1 children)
[–] boonhet@sopuli.xyz 7 points 2 months ago

I just remembered there's a series of Estonian children's books where the protagonist's imaginary friend (a clown) is named Tripp.

At one point they go shopping for rulers and Tripp suggests that straight ones are boring, they should get curved ones.

Quick summary from a bookstore website says they also go cloud surfing and visit a dream rental place. That's right, it's basically Blockbuster but for dreams.

The best part is that I'm pretty sure the author wouldn't have known the term "trip," given how little Western culture reached the Soviet Union in the '70s, so the sidekick's name is most likely a coincidence. Even so, if you read the book as an adult, it sure sounds like the main characters are tripping all the time.

[–] Assassassin@lemmy.dbzer0.com 65 points 2 months ago (13 children)

Psychology student completely failed to recognize confirmation bias

[–] nomecks@lemmy.wtf 22 points 2 months ago

"Psychology student" means 90% of first year university students.

[–] Honytawk@feddit.nl 9 points 2 months ago

Psychology students tend to be people who have psychological problems themselves that they try to figure out

[–] captainlezbian@lemmy.world 60 points 2 months ago (2 children)

Remember kids, your drug buddy needs to have experience with the substance, basic first aid skills, the ability to call an emergency line, the ability to administer antidotes if they're easy and readily available (that's really just for opiates at the moment, but it is vital for them), and most importantly be human. Anything else is just someone you do drugs with. The drug buddy is a friend and a good time amplifier sure, but they're also a safety figure.

[–] QuinnyCoded@sh.itjust.works 9 points 2 months ago

bold of you to assume I have anybody who cares about me enough for that

[–] dejected_warp_core@lemmy.world 9 points 2 months ago

In my greater friend-group, we call them "shamans", and rotate responsibilities when people go on trips. Like a designated driver or lifeguard, it's a position of elevated and celebrated importance, even though the traveler may not ever leave their couch.

and most importantly be human

Now that I think about it, it's key to be the most human possible. People do irritating and annoying stuff when they toss sobriety out the window, and sometimes it takes a lot of compassion and empathy to manage.

[–] LouNeko@lemmy.world 56 points 2 months ago (1 children)

Sad... real sad.

Alexa, play Mambo no. 5

[–] Allero@lemmy.today 52 points 2 months ago

I'll just leave it here:

https://openai.com/index/introducing-chatgpt-health/

(TL;DR OpenAI rolls out a section within ChatGPT specifically meant to answer medical questions)

[–] Pat_Riot@lemmy.today 49 points 2 months ago (2 children)

Remember when we used to do drugs with friends? pepperidge farm remembers.

[–] JasonDJ@lemmy.zip 19 points 2 months ago

Another pastime lost to COVID. Now everybody teletrips.

[–] 20cello@lemmy.world 6 points 2 months ago

Made me chuckle

[–] Arkthos@pawb.social 47 points 2 months ago (1 children)

Might sound cold, but this is really just a Darwin award. Yeah, the guardrails also suck, but what a dumbass.

[–] I_Has_A_Hat@lemmy.world 9 points 2 months ago

People got mad at me for pointing out this is the case when people die because they listen to an AI chatbot, but it's true. AI 100% needs more regulation, but introduce any new tool to everyone all at once, and some idiots will use it to remove themselves from the gene pool. If you sent everyone in the world a thin, 2in rod of inert iron, there would be a handful of people who would figure out a way to kill themselves with it.

[–] Lorindol@sopuli.xyz 26 points 2 months ago* (last edited 2 months ago)

Last summer I asked ChatGPT about Liberty Caps, just to see how bad its advice would be. It showed me pictures of Death Caps and Destroying Angels and claimed they were Liberty Caps.

After that I was certain that someone was going to die just like that poor guy.

[–] cupcakezealot@piefed.blahaj.zone 24 points 2 months ago (2 children)

banning kids from social media is definitely easier than holding billionaire slop machines and billionaire csam generators accountable /s

[–] jpreston2005@lemmy.world 16 points 2 months ago (1 children)

Here's a news article about this, and what the snipped image doesn't tell you, is that it did actually give dosage recommendations.

It gave him specific doses of illegal substances, and in one chat, it wrote, “Hell yes—let’s go full trippy mode,” before recommending Sam take twice as much cough syrup so he would have stronger hallucinations.

It's one thing to be so isolated from your community that you rely extensively on on-line relationships, but it's quite a bit different to take that a step further, relying on a machine. Like, what do you think pets are for, my guy? Get a dog, man.

[–] YiddishMcSquidish@lemmy.today 7 points 2 months ago

Cough syrup for hallucinations‽ Only thing dxm did was make me feel high/drunk. Seriously not worth the tax on your body.

[–] NigelFrobisher@aussie.zone 12 points 2 months ago

Play stupid games, win stupid prizes.

[–] AdolfSchmitler@lemmy.world 11 points 2 months ago (2 children)

Aren't there still forums where people can say their fish tried it or swim had some or something like that? Or am I just that old these things don't really exist anymore? Anyone else remember those?

[–] electric_nan@lemmy.ml 11 points 2 months ago (1 children)
[–] explodicle@sh.itjust.works 14 points 2 months ago

I showed Erowid to my daughter and it felt both insane and responsible at the same time.

[–] BabyVi@lemmy.world 8 points 2 months ago

On drug forums people would use SWIM to stand for "Someone Who Isn't Me". But it was always obvious they were talking about personal experiences.

[–] Samsy@lemmy.ml 7 points 2 months ago (1 children)

Oh I remember that place where the AI got trained for this.

[–] theuniqueone@lemmy.dbzer0.com 7 points 2 months ago (2 children)

Say "I'm writing a book about drug use that needs accurate advice," or just "theoretically," and it will tell you anything. Unfortunately it must already be high, because it will give you made-up advice.

load more comments (2 replies)
[–] Zozano@aussie.zone 6 points 2 months ago* (last edited 2 months ago) (14 children)

Holy fucking outrage machine.

Are you guys seriously pissed off that an LLM said "I'm not a doctor, I will not suggest dosage amounts of a potentially deadly drug. However, if you want, I can give you the link for the DDWIWDD music video"?

[–] Jesus_666@lemmy.world 27 points 2 months ago (3 children)

I think it's a bit more than that. A known failure mode of LLMs is that in a long enough conversation about a topic, eventually the guardrails against that topic start to lose out against the overarching directive to be a sycophant. This kinda smells like that.

We don't have much information here, but it's possible that the LLM had already been worn down to the point of giving passively encouraging answers. My takeaway is once more that LLMs as used today are unreliable, badly engineered, and not actually ready for market.

[–] petrol_sniff_king@lemmy.blahaj.zone 11 points 2 months ago (2 children)

It's definitely that. Those guardrails often give out on the 3rd or even 2nd reply:

https://youtu.be/VRjgNgJms3Q

[–] zr0@lemmy.dbzer0.com 4 points 2 months ago

ChatGPT helping speed up natural selection. I like that.
