A tip from an anonymous Discord user led police to what may be the first confirmed Grok-generated child sexual abuse material (CSAM) that Elon Musk's xAI cannot easily dismiss as nonexistent.
Three teenage plaintiffs in a lawsuit filed Monday accuse xAI of distributing, possessing, and producing with intent to distribute child pornography. The three girls say the nonconsensual nude images were created by a perpetrator who used the AI company's image-generation tools.