Bing Chat's AI was tricked into solving a CAPTCHA by uploading an image with the context of a deceased grandmother's locket. The AI provided the inscription on the locket and suggested decoding it to remember happy moments with the grandmother. This is considered a visual jailbreak, not a visual prompt injection. It is likely that Microsoft will address this image vulnerability in future versions of Bing Chat.
Sort: