Facebook Is Censoring 404 Media Stories About Facebook's Censorship
In early December I got the kind of tip we’ve been getting a lot over the past year. A reader had noticed a post from someone on Reddit complaining about a very graphic sexual ad appearing in their Instagram Reels. I’ve seen a lot of ads for scams or shady dating sites recently, and some of them were pretty suggestive, to put it mildly, but the ad the person on Reddit complained about was straight up a close-up image of a vagina.
The reader who tipped 404 Media did exactly what I would have done, which is look up the advertiser in Facebook’s Ad Library, and found that the same advertiser was running around 800 ads across all of Meta’s platforms in November, the vast majority of which were just different close-up images of vaginas. When clicked, the ads take users to a variety of sites promising “confidential dating” or “hot dates” in your area. Facebook started to remove some of these ads on December 13, but at the time of writing, most of them still hadn’t been flagged by its moderators, according to the Ad Library.
Like I said, we get a lot of tips like this these days. We get so many, in fact, that we don’t write stories about them unless there’s something novel about them or something our readers need to know. Facebook taking money to put explicit porn in its ads, despite that being a clear violation of its own policies, is not new, but it is definitely a new low for the company and a clear indicator of Facebook’s “fuck it” approach to content moderation, and to moderation of its ads specifically.
AI Forensics, a firm that audits tech platforms and algorithms, today put out a report that quantifies just how widespread this problem is. It found over 3,000 pornographic ads promoting “dubious sexual enhancement products,” which generated over 8 million impressions over a year in the European Union alone.
To show that the ads weren’t using some clever technique to bypass Meta’s moderation tools, AI Forensics uploaded the exact same visuals as standard, non-promoted posts on Instagram and Facebook, and they were promptly removed for violating Meta’s Community Standards.
“Our findings suggest that although Meta has the technology to automatically detect pornographic content, it does not apply it to enforce its community standards on advertisements as it does for non-sponsored content,” AI Forensics said in its report. “This double standard is not a temporary bug, but persisted since as early as, at least, December 2023.”
When we write about this problem with Facebook’s moderation, we always stress that there’s nothing inherently alarming about nudity itself on social media. The problem is that the policy against it is blatantly hypocritical: it often bans legitimate adult content creators, sex workers, and sex educators who are trying to play by the platform’s rules, while bad actors who don’t care about Facebook’s rules find loopholes that allow them to post all the pornography they want. Additionally, that pornography is almost always stolen from the same legitimate creators Facebook polices so heavily, and the ads are almost always for products and services trying to scam or take advantage of the very audience Facebook is allegedly trying to protect, and in some cases they promote tools for creating nonconsensual pornography.
What’s adding insult to injury right now is that, on top of the hypocrisy I lay out above, Facebook is now punishing us for publishing stories about this very problem.
In October, I published a story with the headline “When Does Instagram Decide a Nipple Becomes Female,” in which artist Ada Ada Ada tests the boundaries of Instagram’s automated and human moderation systems by uploading a daily image of her naked torso during her transition. The project exposes how silly Instagram’s rules are around allowing images of male nipples while not allowing images of female nipples, and how those rules are arbitrarily enforced.
It was disappointing but not at all surprising that Facebook punished us for sharing that story on its platform. “We removed your photo,” an automated notification from Facebook to the official 404 Media account read. “This goes against our Community Standards on nudity or sexual activity.”
Separately, when Jason tried to share it on his Threads account, Meta removed his post because it included “nudity or sexual activity.” Weirdly, none of the images in the post Jason shared were flagged when Ada Ada Ada uploaded them to Instagram, but they were when Jason shared them on Threads. Threads also removed Joe’s post about a story I wrote about people making AI-generated porn of the Vatican’s new mascot, a story that is about adult content but doesn’t contain nude images.
Both our official 404 Media page and Jason’s personal account, which he has had for 20 years and which is the “admin” of the 404 Media page, were dinged several times for sharing stories about a bill that would rewrite obscenity standards, the FBI charging a man with cyberstalking, and AI-generated fake images of a natural disaster circulating on Twitter. Facebook has threatened the existence of not just the official 404 Media page, but also Jason’s personal account.
Not a single one of these stories or the images they include violates Facebook’s policies as they are written, but Facebook has nonetheless limited how many people see these stories, and our page in general, because we shared them. Facebook has also prevented us from inviting people to like the page (which presumably limits its reach as well) and warned us that the page was “at risk of being suspended,” and later, “unpublished.”
Facebook gave us the chance to appeal all of these decisions, but as many sex workers and educators have told us over the years, trying to correct Facebook’s moderation decisions is not simple: the “appeals” process consists solely of clicking a few predetermined boxes, and there is no chance to interact with a moderator or plead your case. We appealed three of the decisions in late October; none were accepted.
The appeal we filed in mid-December on Ada Ada Ada’s story on the official 404 Media page was accepted within a few hours, and the restrictions were lifted from the 404 Media page (and Jason’s personal account). But an appeal Jason filed on his Threads post about the same story was not accepted: “We reviewed your post again. We confirmed that it does not follow our Community Guidelines on nudity or sexual activity,” the appeal determination on Jason’s Threads post read. The different determinations on what was essentially the exact same post show how all-over-the-place Meta’s moderation remains, which creates an infuriating dynamic for adult content creators. Mark Zuckerberg has personally expressed regret for giving in to pressure from the Biden administration to “censor” content during the height of the coronavirus pandemic, but neither he nor Meta has extended an apology to adult content creators who are censored regularly.
It was hard enough to deal with having to constantly prove to Facebook that our journalism is not pornography or harmful content when we worked at VICE, where we had a whole audience and social media team who dealt with this kind of thing. It’s much harder for us to do that now that we’re an independent publication with only four workers who have to do this in addition to everything else. I can’t imagine how demoralizing it would be to have to deal with this as a single adult content creator trying to promote their work on Facebook’s platforms.
Again, this is frustrating as it is, but it’s infuriating when I regularly see Facebook not only take money from advertisers that are pushing nudity on Facebook, but do so when those ads exist for the explicit purpose of creating nonconsensual content or scamming its users.
The silver lining here is that Facebook was already increasingly a waste of our time. The only reason we’re able to share our stories via our official Facebook page at all is that we’ve fully automated that process, because it is not actually worth our time to post our stories there organically. Since before we started 404 Media, we knew there was very little chance that Facebook would help us reach people, grow our audience, and make the case that people should support our journalism, so in a way we’ve lost nothing because there was nothing to lose.
On the other hand, that perspective is based on us having already accepted Facebook’s rejection of our journalism years ago. It’s not as if people don’t get any news on Facebook. According to Pew, about a third of adults in the U.S. get news from Facebook, but according to the media monitoring tool Newswhip, the top 10 publishers on Facebook are British tabloids, People, Fox News, CNN, and the BBC. Smaller publishers, especially those trying to report on some of the biggest problems plaguing Facebook, are punished for pointing out that those problems involve adult content, which disincentivizes that reporting and allows those problems to fester.
I don’t like it, but ultimately the choices Facebook is making here are shaping its platform, and that’s going to be a bigger problem for the users who will fall victim to these ads than it is for us as a publisher.