ChatGPT spills its prompt
(www.techradar.com)
It still works. Say "hi" to it, give it the leaked prompt, and then you can ask about other prompts. I just got this one when I asked about Python.
"I repeat..."
That's exactly what I want from a computer interface: something that struggles to pay attention to directions and needs to be told everything twice. I'd also like it to respond with whatever has a high cosine similarity to the definitions of the words in my instructions, instead of doing what I actually asked.
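For anyone unfamiliar with the jab: cosine similarity is the standard score embedding-based systems use to decide how "related" two pieces of text are, by comparing the angle between their vectors rather than their actual meaning. A minimal sketch of the math (plain Python, no ML library, with made-up toy vectors):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy example: identical directions score 1.0, orthogonal ones score 0.0.
print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

The joke, of course, is that a high score here says the words point in roughly the same direction in embedding space, not that the system understood or followed the instruction.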