Jailbroken AI Chatbots Can Jailbreak Other Chatbots
(www.scientificamerican.com)
What's that?
Is a napalm recipe forbidden by law? Don't call things criminal at random.
Am I the only one worried about freedom of information?
What possible legitimate reason is there for needing a napalm recipe?
You would need to know the recipe to avoid making it by accident, especially since it's actually quite easy to make by accident.