A new open-source tool called Nightshade lets artists 'poison' AI models by altering the pixels of their images in ways invisible to the human eye, corrupting the data those models train on. Developed by researchers at the University of Chicago, Nightshade can cause image-generating models to learn incorrect names for the objects and scenery in the images they ingest.
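To give a rough sense of the idea, the sketch below pairs a tiny, bounded pixel perturbation with an intentionally wrong caption. This is only an illustration of the general concept: Nightshade's actual method optimizes perturbations against a model's feature extractor, whereas here the noise is random, and the epsilon budget, labels, and `poison_sample` helper are all hypothetical.

```python
import numpy as np

def poison_sample(image: np.ndarray, target_label: str,
                  eps: float = 4.0, seed: int = 0):
    """Illustrative only (not Nightshade's algorithm): add a small bounded
    perturbation (at most eps per channel on a 0-255 scale) so the image
    looks unchanged to a human, and return it with a mismatched caption."""
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-eps, eps, size=image.shape)
    poisoned = np.clip(image.astype(np.float64) + noise, 0, 255).astype(np.uint8)
    return poisoned, target_label  # caption no longer matches the content

# Stand-in for a photo of a dog, captioned as a cat in the poisoned dataset.
img = np.full((32, 32, 3), 128, dtype=np.uint8)
poisoned_img, label = poison_sample(img, target_label="cat")

# The per-pixel change stays within the invisibility budget.
max_diff = int(np.abs(poisoned_img.astype(int) - img.astype(int)).max())
print(label, max_diff)
```

A model trained on many such pairs would begin associating dog imagery with the word "cat", which is the kind of mislearning the article describes.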