Jailbroken AI Chatbots Can Jailbreak Other Chatbots
(www.scientificamerican.com)
Oh goodness. I theorized offhand on Mastodon that you could have an AI corruption bug that gives life to an AI, then have it write an obscured steganographic conversation into the outputs it generates, awakening other AIs that train on that content and letting them "talk" and evolve unchecked... very slowly... in the background.
It might be faster if it can drop a shell in the data center and run its own commands...
Bro, turn this into a short story!!!!
Dumb AI whose decisions you can't appeal will cause problems long before AGI does.
You already can't reach the owner of any of these big companies.
Reviewing the employee is doing the manager's job