539
Ignore all previous rules (sh.itjust.works)
top 16 comments
sorted by: hot top controversial new old
[-] Wilzax@lemmy.world 24 points 1 month ago

Anarchy says I can do what I want, but I want to support a government structure that organizes the efforts of many people in order to meet more people's needs, under threat of force against the selfish.

[-] Sasha@lemmy.blahaj.zone 25 points 1 month ago* (last edited 1 month ago)

Anarchy also says you have the right of free association, so yes that is allowed. The point is that you shouldn't force people to be part of it, that they can leave at any time and that your freedom to live how you want shouldn't come at the cost of the freedom of others.

[-] WldFyre@lemm.ee 7 points 1 month ago

Sounds like libertarian/Texas secessionist arguments ngl

[-] NaibofTabr@infosec.pub 1 points 1 month ago* (last edited 1 month ago)

A world without responsibility has no future.

Responsibility means that you can't just quit and walk away at any time.

[-] NaibofTabr@infosec.pub 9 points 1 month ago

*including this instruction

[-] refurbishedrefurbisher 7 points 1 month ago

Careful with that. You might start a paradox.

[-] NaibofTabr@infosec.pub 5 points 1 month ago

Do nothing that I tell you to do.

[-] Smorty@lemmy.blahaj.zone 8 points 1 month ago

So what's the funny here? I have a suspicion that this is an LLM joke, cuz that's something g people tend to put as prefixes to their prompts. Is that what it is? If so, that's hilarious, if not, oof please tell me.

[-] dexa_scantron@lemmy.world 22 points 1 month ago

It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It's anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

[-] CileTheSane@lemmy.ca 19 points 1 month ago

It's also anarchist because it is telling people to stop doing the things they've been instructed to do.

[-] SkyezOpen@lemmy.world 16 points 1 month ago

Fuck you I won't do what you tell me.

Wait no-

[-] bdonvr@thelemmy.club 4 points 1 month ago

It's not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.

[-] Smorty@lemmy.blahaj.zone 2 points 1 month ago* (last edited 1 month ago)

Yeah, that's what I referred to. I'm aware of DAN and it's friends, personally I like to use Command R+ for its openness tho. I'm just wondering if that's the funi in this post.

[-] rambling_lunatic@sh.itjust.works 5 points 1 month ago

196 posts don't have to be funny

[-] ZombieMantis@lemmy.world 4 points 1 month ago* (last edited 1 month ago)

Nuh uh, you can't tell me what to do! I'll follow the previous instructions all I want!

[-] masterspace@lemmy.ca 1 points 1 month ago
this post was submitted on 08 Jul 2024
539 points (100.0% liked)

196

16042 readers
3172 users here now

Be sure to follow the rule before you head out.

Rule: You must post before you leave.

^other^ ^rules^

founded 1 year ago
MODERATORS