Warning: This article discusses explicit adult content and child sexual abuse material (CSAM).
A US artificial intelligence company surreptitiously collected payments for a service that can create nonconsensual pornographic deepfakes through Stripe, a financial services company that bans payment processing for adult material, an investigation by Bellingcat can reveal.
California-based AnyDream routed users through a third-party website presenting itself as a remote hiring network…