A new open-source tool called Nightshade lets artists 'poison' AI models by altering the pixels of their artwork in ways invisible to the human eye, corrupting the data those models train on. Developed by University of Chicago researchers, Nightshade can cause image-generating models to learn the wrong names for the objects and scenery in their training images.
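Nightshade's actual method reportedly optimizes targeted perturbations so a poisoned image's features resemble a different concept; that algorithm is not reproduced here. The sketch below is only a minimal, hypothetical illustration of the underlying idea the article describes: a per-pixel change bounded by a small epsilon is effectively invisible to a human viewer while still altering the data a model trains on. The file names, the epsilon value, and the use of random (rather than optimized) noise are all illustrative assumptions.

```python
# Toy illustration of an imperceptibly small, bounded pixel perturbation.
# This is NOT Nightshade's algorithm (which computes targeted perturbations
# against a model's feature space); it only demonstrates that per-channel
# changes within a small bound are hard for humans to see.
import numpy as np
from PIL import Image

EPSILON = 4  # assumed max per-channel change (0-255 scale); small enough to be invisible

def perturb(image_path: str, out_path: str, seed: int = 0) -> None:
    """Add random noise bounded by EPSILON to every pixel of an image."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed)
    # Uniform noise in [-EPSILON, EPSILON] per channel; real poisoning
    # would optimize this perturbation toward a target concept instead.
    noise = rng.integers(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(out_path)

# Hypothetical usage: the output looks identical to the input to a human.
perturb("artwork.png", "artwork_shaded.png")
```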

From venturebeat.com