A new image data tool dubbed Nightshade has been created by computer science researchers to “poison” data meant for text-to-image models. Nightshade adds imperceptible changes to images that creators upload online. If a data scraper subsequently adds one of these altered images to its data set, it will “introduce unexpected behaviors” into the model, poisoning it.
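The core idea of such perturbations can be sketched in a few lines. This is a toy illustration only, not Nightshade's actual method: the real attack optimizes perturbations against a surrogate model, whereas this sketch just adds small bounded noise that is hard to see. The function name and bound are assumptions.

```python
import numpy as np

def add_perturbation(image: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Add a small bounded noise pattern to an 8-bit RGB image array.

    Toy stand-in for a poisoning perturbation: each channel value is
    shifted by at most +/- epsilon, so the change is imperceptible.
    """
    rng = np.random.default_rng(seed)
    # Random perturbation bounded per pixel and channel.
    delta = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Apply in float space, then clamp back to the valid 0..255 range.
    poisoned = np.clip(image.astype(np.float64) + delta, 0, 255)
    return poisoned.astype(np.uint8)

# Example: perturb a uniform gray 64x64 RGB image.
img = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned = add_perturbation(img)
# The altered image stays within epsilon of the original everywhere.
max_diff = int(np.max(np.abs(poisoned.astype(int) - img.astype(int))))
```

A real poisoning tool would instead compute `delta` by gradient-based optimization so that the image's features match a different target concept while remaining visually unchanged.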
A preview of a paper titled “Prompt-Specific Poisoning Attacks on Text-to-Image Generative Models” was recently published…