- cross-posted to:
- [email protected]
Ministers warn platform could be blocked after Grok AI used to create sexual images without consent
There’s no such thing as child porn, only child sexual abuse material.
But yes, it's fucked up that nothing is done when chatbots are the cause of deaths. That should absolutely also be a catalyst for change.
No chatbot has ever caused a death.
There definitely is, and it falls under CSAM. Pornographic material isn't by definition limited to adults.