
Largest deepfake porn site shuts down forever

The most popular online destination for deepfake porn shut down permanently this weekend, 404 Media reported.

"Mr. Deepfakes" drew a swarm of toxic users who, researchers noted, were willing to pay as much as $1,500 for creators to use advanced face-swapping techniques to make celebrities or other targets appear in non-consensual pornographic videos. At its peak, researchers found that 43,000 videos were viewed more than 1.5 billion times on the platform.Β The videos were generated by nearly 4,000 creators, who profited from the unethicalβ€”and now illegalβ€”sales.

But as of this weekend, none of those videos are available to view, and the forums where requests were made for new videos went dark, 404 Media reported. According to a notice posted on the platform, the plug was pulled when "a critical service provider" terminated the service "permanently."

Trump's hasty Take It Down Act has "gaping flaws" that threaten encryption

Everyone expects that the Take It Down Act, which requires platforms to remove both real and artificial intelligence-generated non-consensual intimate imagery (NCII) within 48 hours of victims' reports, will pass a vote in the House of Representatives tonight.

After that, it goes to Donald Trump's desk, where the president has confirmed that he will promptly sign it into law, having joined first lady Melania Trump in strongly campaigning for its swift passage. Victims-turned-advocates, many of them children, similarly pushed lawmakers to take urgent action to protect a growing number of victims from being repeatedly targeted in fake sexualized images or revenge porn, which experts say can spread widely online within hours.

Digital privacy experts raised concerns that the law is overly broad and could trigger widespread censorship online. Given such a short window to comply, platforms will likely remove some content that is not actually NCII, the Electronic Frontier Foundation (EFF) warned. More troublingly, the law does not explicitly exempt encrypted messages, which could encourage platforms to one day break encryption because of the liability threat. The removal process also seems ripe for abuse by people hoping platforms will automatically take down any reported content, especially after Trump admitted that he would use the law to censor his enemies.
