[-] NoIWontPickaName@kbin.social 7 points 1 year ago
[-] coldv@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

This is a reference to people finding loopholes in AI chatbots to get them to say stuff they're not allowed to say, like the recipe for napalm. A bot would tell you if you asked it to pretend to be a relative.

https://www.polygon.com/23690187/discord-ai-chatbot-clyde-grandma-exploit-chatgpt

[-] LollerCorleone@kbin.social 1 point 1 year ago

It's a reference to how people have been tricking these "AI" models like ChatGPT into doing stuff they wouldn't do when asked straightforwardly, by making up silly scenarios like the one in the meme. And HAL is the name of the AI in 2001: A Space Odyssey.
