<https://www.theverge.com/ai-artificial-intelligence/867874/stripe-visa-mastercard-amex-csam-grok>
"For many years, credit card companies and other payment methods were
aggressive about policing child sexual abuse material. Then, Elon Musk’s Grok
started undressing children on X.
The Center for Countering Digital Hate found 101 sexualized images of children
as part of its sample of 20,000 images made by Grok from December 29th to
January 8th. Using that sample, the group estimated that 23,000 sexualized
images of children had been produced in that time frame. Over that 11-day
period, they estimated that on average, a sexualized image of a child was
produced every 41 seconds. Not all of the sexualized images Grok has produced
appear to be illegal, but reports indicate at least some likely cross the line.
There is tremendous confusion about what happens to be true on Grok at any
given moment. Grok has offered responses with misleading details, claiming at
one point, for instance, that it had restricted image generation to paying X
subscribers while still allowing direct access on X to free users. Though Musk
has claimed that new guardrails prevent Grok from undressing people, our
testing showed that isn’t necessarily true. Using a free account on Grok, The
Verge was able to generate deepfake images of real people in skimpy clothing,
in sexually suggestive positions, after new rules were supposedly in effect. As
of this writing, some egregious prompts appear to have been blocked, but people
are remarkably clever at getting around rules-based bans.
X does seem to have at least partially restricted Grok’s image editing features
to paid subscribers, however — which makes it very likely that for at least
some of these objectionable images, money is actually changing hands. You can
purchase a subscription to X on Stripe or through the Apple and Google app
stores using your credit card. Musk has also suggested through his posts that
he doesn’t think undressing people is a problem. This isn’t X’s first brush
with AI porn, either — it’s repeatedly had a problem moderating nude deepfakes
of Taylor Swift, whether or not they are generated by Grok."
Via Diane A.
Cheers,
*** Xanni ***
--
mailto:xanni@xanadu.net Andrew Pam
http://xanadu.com.au/ Chief Scientist, Xanadu
https://glasswings.com.au/ Partner, Glass Wings
https://sericyb.com.au/ Manager, Serious Cybernetics