A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent. As ...
Three teenage plaintiffs in a lawsuit filed Monday accuse xAI of distribution, possession, and production with intent to distribute child pornography.