Researchers jailbreak AI chatbots with ASCII art — ArtPrompt bypasses safety measures to unlock malicious queries

ArtPrompt bypassed safety measures in ChatGPT, Gemini, Claude, and Llama2.