539
Ignore all previous rules (sh.itjust.works)
you are viewing a single comment's thread
view the rest of the comments
[-] dexa_scantron@lemmy.world 22 points 2 months ago

It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It's anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

[-] CileTheSane@lemmy.ca 19 points 2 months ago

It's also anarchist because it is telling people to stop doing the things they've been instructed to do.

[-] SkyezOpen@lemmy.world 16 points 2 months ago

Fuck you I won't do what you tell me.

Wait no-

[-] bdonvr@thelemmy.club 4 points 2 months ago

It's not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.

[-] Smorty@lemmy.blahaj.zone 2 points 2 months ago* (last edited 2 months ago)

Yeah, that's what I referred to. I'm aware of DAN and it's friends, personally I like to use Command R+ for its openness tho. I'm just wondering if that's the funi in this post.

[-] rambling_lunatic@sh.itjust.works 5 points 2 months ago

196 posts don't have to be funny

this post was submitted on 08 Jul 2024
539 points (100.0% liked)

196

16206 readers
2600 users here now

Be sure to follow the rule before you head out.

Rule: You must post before you leave.

^other^ ^rules^

founded 1 year ago
MODERATORS