‘Minion Gore’ Videos Use AI to Post Murder to Instagram, TikTok, and YouTube
People are using the popular AI video generator Runway to make real videos of murder look like they came from one of the animated Minions movies, then upload them to social media platforms, where they gain thousands of views before the platforms can detect and remove them. This AI editing method appears to make it harder for major platforms to moderate infamously graphic videos that previously could only be found in the darkest corners of the internet.
The practice, which people have come to call “Minion Gore” or “Minion AI videos,” started gaining popularity in mid-December, and while 404 Media has seen social media platforms remove many of these videos, at the time of writing we’ve seen examples of extremely violent Minion Gore videos hosted on YouTube, TikTok, Instagram, and X, which went undetected until we contacted these platforms for comment.
Specifically, by comparing the Minion Gore edits to the original videos, I was able to verify that TikTok was hosting a Minionified video of Ronnie McNutt, who shot himself in the head in a livestream on Facebook in 2020. Instagram is still hosting a Minionified clip from the 2019 Christchurch mosque shooting in New Zealand, in which a man livestreamed himself killing 51 people. I’ve also seen other Minion Gore videos whose source material I couldn’t locate, but which appear to include other public execution videos, war footage from the frontlines in Ukraine, and workplace accidents on construction sites.
The vast majority of these videos, including the Minion Gore videos of the Christchurch shooting and McNutt’s suicide, include a Runway watermark in the bottom right corner, indicating they were created on its platform. The videos appear to use the company’s Gen-3 “video-to-video” tool, which allows users to upload a video they can then modify with generative AI. I tested the free version of Runway’s video-to-video tool and was able to Minionify a video I uploaded to the platform by writing a text prompt asking Runway to “make the clip look like one of the Minions animated movies.”
Runway did not respond to a request for comment.
I saw several examples of TikTok removing Minion Gore videos before I reached out to the company for comment. For example, all of the violent TikTok videos included in the Know Your Meme article about Minion Gore have already been removed. As the same Know Your Meme article notes, however, an early instance of the Minion Gore video of McNutt’s suicide gained over 250,000 views in just 10 days. I also found another version of the same video, reuploaded to TikTok in mid-December, which wasn’t removed until I reached out to TikTok for comment on Tuesday.
TikTok told me it removes any content that violates its Community Guidelines, regardless of whether it was altered with AI. This, TikTok said, includes its policies prohibiting “hateful content as well as gory, gruesome, disturbing, or extremely violent content.” TikTok also said that it has been proactively removing harmful AI-generated content that violates its policies, that it continuously updates its detection rules for AI-generated content as the technology evolves, and that when it is made aware of a synthetic video clip that is spreading online and violates its policies, it creates detection rules to automatically catch and act on similar versions of that content.
Major internet platforms create “hashes,” unique strings of letters and numbers that act as fingerprints for videos based on what they look like, for known videos that violate their policies. This allows platforms to automatically detect and remove those videos, or to prevent them from being uploaded in the first place. TikTok did not answer specific questions about whether Minion Gore edits of known violating videos would bypass this kind of automated moderation. In 2020, Sam and I showed that this type of automated moderation can be bypassed with even simple edits of hashed, violating videos.
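To illustrate why that is, here is a minimal sketch in Python of one simple perceptual fingerprinting scheme, a “difference hash.” This is not TikTok’s, YouTube’s, or PhotoDNA’s actual algorithm, and the frame file names in the usage comment are hypothetical; it is only meant to show why a fingerprint derived from what frames look like stops matching once a generative model has fully restyled those frames.

```python
# A minimal sketch of perceptual "difference hashing" (dHash), one simple
# member of the family of fingerprinting techniques platforms use to match
# known videos. Illustrative only; real systems like PhotoDNA are far more
# robust and operate on video frames at scale.
from PIL import Image

def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Fingerprint an image by comparing the brightness of adjacent pixels."""
    # Shrink and desaturate so the hash reflects coarse structure, not detail.
    small = image.convert("L").resize((hash_size + 1, hash_size))
    pixels = list(small.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance means 'probably the same image.'"""
    return bin(a ^ b).count("1")

# Hypothetical usage: compare a frame from a known violating video against a
# frame from an AI-restyled upload (file names are made up for illustration).
# original = dhash(Image.open("known_frame.png"))
# restyled = dhash(Image.open("minionified_frame.png"))
# print(hamming_distance(original, restyled))  # likely far above any match threshold
```

Under a scheme like this, a re-encode or minor crop of a known video typically lands within a few bits of the original hash, while a Minionified frame shares little of the original’s coarse brightness structure, so its hash falls far outside any practical match threshold.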
“In most cases, current hashing/fingerprinting are unable to reliably detect these variants,” Hany Farid, a professor at UC Berkeley who is one of the world’s leading experts on digitally manipulated images and a developer of PhotoDNA, one of the most commonly used image identification and content filtering technologies, told me in an email. “Starting with the original violative content, it would be possible for the platforms to create these minion variations, hash/fingerprint them and add those signatures to the database. The efficacy of this approach would depend on the robustness of the hash algorithm and the ability to closely mimic the content being produced by others. And, of course, this would be a bit of a whack-a-mole problem as creators will replace minions with other cartoon characters.”
This, in fact, is already happening. I’ve seen a video of ISIS executions and the McNutt suicide posted to X, also modified with Runway, but with the people in the video turned into Santa Claus instead of Minions. There are also several different Minion Gore videos of the same violent content, so in theory a hash of one version will not result in the automatic removal of another. Because Runway seemingly does not prevent people from using its tools to edit infamously violent videos, people can easily create endless, slightly different versions of those videos and upload them across the internet.
YouTube acknowledged our request for comment but did not provide one in time for publication. Instagram and X did not respond to a request for comment.