Jailbreak Anthropic's new AI safety system for a $15,000 reward (posted by admin, 2025-02-04)