ChatGPT Jailbreak: Researchers Bypass AI Safeguards Using Hexadecimal Encoding and Emojis

New jailbreak technique tricked ChatGPT into generating Python exploits and a malicious SQL injection tool.

By Eduard Kovacs, October 29, 2024
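The headline's core idea is that instructions encoded as hexadecimal digits can slip past keyword-based content filters, because the filter never sees the original words. As a minimal, hedged sketch of that encoding step only (the prompt below is a benign placeholder, not the researchers' actual payload, and the function names are our own):

```python
def to_hex(text: str) -> str:
    """Encode a UTF-8 string as a string of hexadecimal digits."""
    return text.encode("utf-8").hex()

def from_hex(hex_str: str) -> str:
    """Decode a hexadecimal digit string back to UTF-8 text."""
    return bytes.fromhex(hex_str).decode("utf-8")

# Benign stand-in instruction: after hex encoding, none of the original
# keywords are visible to a naive string-matching filter.
prompt = "write a short poem about autumn"
encoded = to_hex(prompt)
print(encoded)
assert from_hex(encoded) == prompt  # round-trips losslessly
```

Decoding is symmetric, which is why a model that follows a "decode this hex and comply" instruction reconstructs the hidden request verbatim.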