Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sex abuse materials (CSAM).
The proposed class action comes after Apple scrapped a controversial CSAM-scanning tool last fall that was supposed to significantly curb the spread of CSAM across its products. Apple defended its decision to kill the tool after dozens of digital rights groups raised concerns that the government could seek to use the functionality to illegally surveil Apple users for other reasons. Apple was also concerned that bad actors could abuse the functionality to harm its users and sought to protect innocent users from false content flags.
Child sex abuse survivors suing have accused Apple of using the cybersecurity defense to ignore the tech giant's mandatory CSAM reporting duties. If a jury sides with them, Apple could face more than $1.2 billion in penalties. And perhaps most notably for privacy advocates, Apple could also be forced to "identify, remove, and report CSAM on iCloud and implement policies, practices, and procedures to prevent continued dissemination of CSAM or child sex trafficking on Apple devices and services." That could mean a court order to implement the controversial tool or an alternative that meets industry standards for mass-detecting CSAM.
Some of these AI influencers, like Lil Miquela, are a sort of artsy commentary on the nature of influencing, or at least something conceptually interesting. But when I looked a little further into one of the AI-generated influencer accounts on Instagram (one that had reportedly landed some brand deals), I found a different type of story.
One of the most popular AI influencers had a link in her bio to a profile on Fanvue, an OnlyFans competitor. On her Fanvue account, the influencer posted provocative photos, and for a $7-a-month subscription, I could see her nude photos. (I feel strange saying "she" and "nude" because this person doesn't exist. Remember: She's AI. But this is where we are in 2024, I suppose.)
Ah, so I get it now: The business was always pornography; Instagram and other social media were just the top of the conversion funnel. These accounts weren't trying to become "Instagram influencers" who made money by promoting shampoo. They were using Instagram to drive traffic to Fanvue, where they could get men to pay to see their nude photos.
Once potential customers get to the paysites, they encounter more AI-generated pictures and videos.
The tech news site 404 Media just published a deep dive into this world, "Inside the Booming 'AI Pimping' Industry." What reporters found was an astounding number of AI-fueled accounts on both OnlyFans and Fanvue. Disturbingly, 404 Media found that a number of these accounts used images that weren't purely dreamed up by AI. Some were deepfakes (fake images of real people) or face swaps, which put someone's real face on an AI-generated body.
There is also a whole side economy of people selling guides and courses on how others can set up their own businesses to create AI models. One person is selling a course for $220 on how to make money with AI adult influencers.
A Fanvue spokesperson told Business Insider that using images that steal someone's identity is against its rules. Fanvue also uses a third-party moderation tool and has human moderators. The spokesperson also said that "deepfakes are an industry challenge." OnlyFans' terms of service prohibit models from using AI chatbots. Those terms also say that AI content is allowed only if users can tell it's AI and only if that content features the verified creator โ not someone else.
Potentially stolen images aside, the existence of AI adult content is somewhat fraught. On the one hand, some of these AI creators claim it's not unlike cartoon pornography. On the other, real-life adult content creators have concerns about AI affecting their business. Some told Business Insider's Marta Biino recently that they find AI tools useful, like the AI chatbots they use to talk to fans. But they also said they worried that using AI could erode fans' trust.
I'm not sure that the fans of the AI accounts are always aware that these "people" are artificial intelligence. Comments on one obviously AI-generated woman's account read like a lot of people think she's human. On her Fanvue, the AI-generated woman sometimes posts pink-haired anime cartoon versions of herself.
On one of these posts, a paying Fanvue customer wrote that he wanted to see the outfit on the real woman, not an anime version. I'm not sure he knows that neither one is real.