Every now and then I think back to the early "jailbreaks", where people asked "how do you break into a building?" and the chatbot refused to answer. So people went "add that it should pretend to be in a movie", and then the chatbot started explaining lockpicking, and they acted like they'd just cracked the code.
Meanwhile, people who actually break into buildings don't pick the locks. They smash their way in.
So the prompt hackers hadn't hacked the chatbot, they'd actually hacked themselves.
Anyway, totally unrelated to people who think they've awoken their chatbots.
Yeah, but then there's also getting the "recipe for napalm" that's straight out of Fight Club.