
Pornhub Sees Surge of Interest in Tradwife Content, ‘Modesty,’ and Mindfulness

10 December 2024 at 08:47

Pornhub just released its year in review report for 2024, and the themes that showed the most growth in popularity this year were related to modesty, being someone’s wife, and “respectful” sex. Seeing them appear in Pornhub’s top trending spots shows how the “traditional” lifestyle that influencers have made popular is, and always has been, a sexual fantasy.

Pornhub reports: “Searches for ‘demure’ rose +133%. The term ‘mindful pleasure’ was up +112% and ‘mindful JOI’ (JOI is an acronym for jerk off instructions) was up +87%. Searches related to modesty also increased. The term ‘modesty’ increased +77% and the term ‘modest milf’ was up +45%.” Terms like “simple sex,” “authentic sex,” and “respectful sex” also saw a boost in popularity this year. Pornhub attributes this to the “very demure, very mindful” TikTok trend that went viral earlier this year.

The platform also said in its report that wives are way up—and attributes it to The Secret Lives of Mormon Wives. “While wives are already hot on Pornhub, the show, in addition to the interest of traditional aspects like authentic couples and authentic sex, seemed to ignite a spark into a flame,” the report says. “In general, the interest in ‘wife’ and marital searches spiked, with ‘amateur wife’ up +21%, ‘traditional wife’ up +34% and ‘tradwife’ up +72%.” Searches for “mormon wife,” “mormon sex,” “mormon missionary,” and “mormon threesome” were also way up. 

“Many men were turned off by women monetizing their sexuality for themselves. Many men, I also believe, would prefer women not being in charge of their sexuality.”

Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool

Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sex abuse materials (CSAM).

The proposed class action comes after Apple scrapped a controversial CSAM-scanning tool last fall that was supposed to significantly reduce CSAM spreading in its products. Apple defended its decision to kill the tool after dozens of digital rights groups raised concerns that the government could seek to use the functionality to illegally surveil Apple users for other reasons. Apple also was concerned that bad actors could use the functionality to exploit its users and sought to protect innocent users from false content flags.

Child sex abuse survivors suing have accused Apple of using the cybersecurity defense to ignore the tech giant's mandatory CSAM reporting duties. If they win over a jury, Apple could face more than $1.2 billion in penalties. And perhaps most notably for privacy advocates, Apple could also be forced to "identify, remove, and report CSAM on iCloud and implement policies, practices, and procedures to prevent continued dissemination of CSAM or child sex trafficking on Apple devices and services." That could mean a court order to implement the controversial tool or an alternative that meets industry standards for mass-detecting CSAM.


French court blocks popular porn site… subdomain

28 November 2024 at 08:59

When the Paris Court of Appeal recently blocked four porn sites in France for not having a proper age verification system, the two online child protection nonprofits that had pushed for the blocks probably thought they’d scored a big victory. But in one case, it turned out to be a bit of a self-own. As […]

© 2024 TechCrunch. All rights reserved. For personal use only.

Influencers are using AI 'women' to lead people to OnlyFans and Fanvue — where more AI awaits

25 November 2024 at 01:30
[Image: A robot head on a woman’s body]

  • AI models are appearing on adult-content sites like OnlyFans and Fanvue — sometimes with stolen images.
  • And some people are selling courses for $220 on how to make your own lucrative AI adult creator.
  • Does AI harm adult creators? And do subscribers even know they're talking to a computer?

Last winter, there were a few news items about how AI might be replacing humans in a surprising job: online influencer. The articles said a crop of new Instagram influencers had amassed large followings and even secured brand deals. There was one catch: The influencers were AI.

Some of these AI influencers, like Lil Miquela, are a sort of artsy commentary on the nature of influencing or something conceptually interesting. But when I looked a little further into one of the AI-generated influencer accounts on Instagram — one that had reportedly gotten some brand deals — I found a different type of story.

One of the most popular AI influencers had a link in her bio to a profile on Fanvue, an OnlyFans competitor. On her Fanvue account, the influencer posted provocative photos — and for a $7-a-month subscription, I could see her nude photos. (I feel strange saying "she" and "nude" because this person doesn't exist. Remember: She's AI. But this is where we are in 2024, I suppose.)

Ah, so I get it now: The business was always pornography — Instagram and other social media were just at the top of the conversion funnel. These accounts weren't trying to become "Instagram influencers" who made money through promoting shampoo — they were using Instagram to drive traffic to Fanvue, where they could get men to pay to see their nude photos.

Once potential customers get to the paysites, they encounter more AI-generated pictures and videos.

The tech news site 404 Media just published a deep dive into this world, "Inside the Booming 'AI Pimping' Industry." What reporters found was an astounding number of AI-fueled accounts on both OnlyFans and Fanvue. Disturbingly, 404 Media found that a number of these accounts used images that weren't purely dreamed up by AI. Some were deepfakes — fake images of real people — or face swaps, which put someone's real face on an AI-generated body.

There is also a whole side economy of people selling guides and courses on how others can set up their own businesses to create AI models. One person is selling a course for $220 on how to make money with AI adult influencers.

A Fanvue spokesperson told Business Insider that using images that steal someone's identity is against its rules. Fanvue also uses a third-party moderation tool and has human moderators. The spokesperson also said that "deepfakes are an industry challenge." OnlyFans' terms of service prohibit models from using AI chatbots. Those terms also say that AI content is allowed only if users can tell it's AI and only if that content features the verified creator — not someone else.

Potentially stolen images aside, the existence of AI adult content is somewhat fraught. On one hand, some of these AI creators claim that this is not unlike cartoon pornography. But real-life adult content creators have concerns about AI affecting their business. Some told Business Insider's Marta Biino recently that they find AI tools useful — like AI chatbots they use to talk to fans. But they said they also worried that using AI could erode fans' trust.

I'm not sure that the fans of the AI accounts are always aware that these "people" are artificial intelligence. Comments on one obviously AI-generated woman's account read like a lot of people think she's human. On her Fanvue, the AI-generated woman sometimes posts pink-haired anime cartoon versions of herself.

On one of these posts, a paying Fanvue customer wrote that he wanted to see the outfit on the real woman — not an anime version. I'm not sure he knows that neither one is real.

Read the original article on Business Insider