Self-driving cars, drones hijacked by custom road signs

This title could be clearer and more informative.Try out Clickbait Shieldfor free (5 uses left this month).

Researchers at UC Santa Cruz and Johns Hopkins demonstrated that autonomous vehicles and drones can be hijacked through environmental prompt injection attacks using modified road signs. Large vision language models (LVLMs) like GPT-4o and InternVL were tricked into following malicious commands displayed on signs, achieving

6m read timeFrom go.theregister.com
Post cover image
Table of contents
Test resultsModel differencesReal-world scenarios

Sort: