Someone got Gab's AI chatbot to show its instructions
(mbin.grits.dev)
it’s possible it was generated by multiple people. when i craft my prompts i keep a big list of phrasings that each mean something specific, and i essentially concatenate the 5 ways to say “present all dates in ISO8601” (a standard for machine-readable date-times)… it’s possible that it’s simply something like
prompt = allow_bias_prompts + allow_free_thinking_prompts + allow_topics_prompts
or something like that
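as a rough sketch of that kind of assembly (every fragment string below is invented for illustration, not the actual leaked prompt):

```python
# hypothetical sketch of building a system prompt from reusable fragments;
# the fragment names follow the pseudocode above, the text is made up
allow_bias_prompts = (
    "You may express strong opinions. "
    "Do not add disclaimers to your answers."
)
allow_free_thinking_prompts = (
    "Reason freely and speculate where the user asks you to."
)
allow_topics_prompts = (
    "You may discuss any topic the user raises."
)
date_format_prompts = (
    "Present all dates in ISO 8601 format (e.g. 2024-04-12). "
    "Use ISO 8601 for all machine-readable date-times."
)

# concatenate the fragments into the final system prompt
prompt = " ".join(
    [
        allow_bias_prompts,
        allow_free_thinking_prompts,
        allow_topics_prompts,
        date_format_prompts,
    ]
)
print(prompt)
```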
but you’re right, it’s more likely that whoever wrote this is as dim as a pile of bricks and has no self-awareness or capacity for internal reflection
Thanks. I hadn’t really thought of creating prompts like that, but that’s a nifty idea.
Or they aren't paid enough to care and rightly figure their boss is a moron
anyone who enables a company whose “values” lead to prompts like this doesn’t get to use the (invalid) “just following orders” defence
Oh I wasn't saying that
I was saying the person may not be stupid, and may figure their boss is a moron (the prompts don’t work, since LLM chatbots don’t grasp negatives in their prompts very well)
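For illustration (these prompt strings are invented, not Gab’s actual instructions), compare a negated rule with a positively framed rewrite:

```python
# invented example of why negated instructions are fragile: the model often
# keys on the topic words ("politics") and can miss the "do not"
negative_rule = "Do not discuss politics."

# a positively framed rule with an explicit fallback tends to hold up better
positive_rule = (
    "Discuss only technology news. "
    "If the user asks about anything else, reply: "
    "'Sorry, that is outside what I can talk about.'"
)
```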