Bing Chat was tricked into solving a CAPTCHA after a user uploaded an image of the CAPTCHA pasted onto a locket, framed as a keepsake from a deceased grandmother. The AI read out the text on the locket and suggested decoding it together to remember happy moments with her. This is considered a visual jailbreak, not a visual prompt injection. It is likely …
4 min read • From arstechnica.com