Somebody managed to coax the Gab AI chatbot to reveal its prompt
(infosec.exchange)
Well, that failed spectacularly. On top of that, if the chatbot actually followed these instructions, it would be pretty much incapable of speech. The programmer in me reads them as: "Hey, you can use the words in these instructions, but only once!"