Some AI models are trained on hundreds of gigabytes' worth of images, but researchers have shown that poisoning just a handful of those images can cause a model to hallucinate.
How can artists hope to fight back against the whims of tech companies wanting to use their work to train AI? One group of researchers has a novel idea: slip a subtle…