
Researchers at the Stanford Internet Observatory found that Stable Diffusion, the viral text-to-image generator from Stability AI, was trained on illegal child sexual abuse material.
Stable Diffusion, one of the most popular text-to-image generative AI tools on the market, built by the $1 billion startup Stability AI, was trained on a trove of illegal child sexual abuse material, according to new research from the Stanford Internet Observatory.
The model was trained on massive open datasets so that users…