Elizabeth Hernandez found out about the decade-old murder from a flurry of tips sent to her newsroom in August last year.
The tips were all reacting to a YouTube video with a shocking title: “Husband’s Secret Gay Love Affair with Step Son Ends in Grisly Murder.” It described a gruesome crime that apparently took place in Littleton, Colorado. Almost two million people had watched it.
“Some people in fact were saying, ‘Why didn’t The Denver Post cover this?’” Hernandez, a reporter at the paper, told me. “Because in the video, it makes it sound like it was a big news event and yet, when you Google it, there is no coverage.”
The reason for the lack of coverage was pretty clear to her. In the 26-minute-long video, a stilted voice narrated over hazy still images of a neighborhood that really didn’t look like Littleton.
Hernandez called several law enforcement officials and quickly confirmed her suspicions. The murder was fake, and the video was made using generative AI.
The video was uploaded to a YouTube channel called True Crime Case Files. Before the channel was terminated in January, while I was working on this story, it had posted more than 150 similar videos over the past year. This one was the most popular.
The plots were disturbing, often hypersexual. They described parents selling teenagers into sex slavery with a sheriff, and transgender teachers committing murders to hide affairs with students. The video thumbnails were perverse, with clickbaity phrasing in big blocky text.
Other titles included:
- “Sheriff Murdered After Affair With His Secretary Got Exposed” with 30,000 views.
- “Wife Secret Affair with Neighbor’s Teenage Daughter Ends in Grisly Murder” with 34,000 views.
- “Coach Gives Cheerleader HIV after Secret Affair, Leading to Pregnancy” with 10,000 views.
Each one was made with AI, and none of the crimes described ever happened. There was no language on the channel’s homepage or in video descriptions to tell a viewer otherwise.
According to the man who ran the page, that was by design.
“It needs to be called ‘true crime,’ because true crime is a genre,” the channel’s owner told me over the phone in December. “I wanted [the audience] to think about why […] they care so much that it was true, why it matters so much to them that real people are being murdered.” I was able to verify and contact the man who ran the channel; 404 Media is using a pseudonym for him, Paul, because he has received threats and his channel is no longer active.
I was curious about how his whole operation worked. Paul is not the first person to lie on the internet, but it felt like he was lying in a brand-new way. Paul had found his own niche within the AI-generated slop ecosystem that 404 Media has reported on for the last few months. He believed people wouldn’t want to watch his videos if they knew they were fake, and that he wasn’t any worse than the competition.
“True crime, it’s entertainment masquerading as news […] that’s all there is to it,” he said.
Paul told me he tried to get people to question the reality of his videos by giving characters strange names or inserting bizarre details into his scripts. But the comments on his videos were full of people who couldn’t tell they weren’t real. Whatever moral lesson Paul said he wanted to impart clearly didn’t land for most commenters.
“I’m 100% confident that sexual relationship between the stepfather and the stepson started way before he was 19,” read one of the top comments on the video Hernandez found. There were countless others, all hypothesizing about the fake police investigation and fake criminals.
“I’m trying to overdose the viewer on luridness, to try to confront them with the fact that they seem to be so invested in the luridness of it all. People’s secret lives, their secret affairs that are really taboo,” Paul told me. Of course, Paul was also making money from the videos.
About half of each video was made using ChatGPT or an AI image service, Paul said. The other half—the bones of the story, the small details and the edits—was all his. He typically made one or two videos a week, each taking about two and a half hours, and billed himself to me as a filmmaker—one seizing on a new era of content without the need for expensive crews and camera equipment.
While he would not say how much his videos made him in ad revenue, he said he devoted himself to it full-time.
I made a thing
Paul told me he graduated college just before the COVID-19 pandemic forced Americans inside for a year. During that time, he moved back home with his parents. While some families baked sourdough bread, Paul’s family did something different. Together, they started binge-watching Dateline.
Paul mapped out the formula for the genre as he watched: a scandalous affair, some brutal crime, interrogating the suspects and a stunning trial of a perpetrator to bring things home.
But once he figured out that formula, the show became less appealing.
“Once you see how the sausage is made, you don’t really want to eat it too much,” he said.
As he tired of Dateline, Paul started experimenting with ChatGPT. His first experiments with the product also relied on another generic television formula he was familiar with: Hallmark Christmas movies.
Paul started by typing prompts into ChatGPT to generate still images of characters and backgrounds, then assembled them into short Hallmark romcom parodies on YouTube. They had titles like “Princess meets Fisherman” or “Romance and Reindeer.” All included a disclaimer that they were generated with AI.
The videos bombed. Not one has more than 100 views. He attributes that to the limitations of the generative AI he used at the time, and to disclosing how he made each video.
“I labeled it [as] AI parody, and it didn’t do well […] I think part of it is people are just hostile towards AI. So when they see the word AI, they’re just freaked out by it,” Paul said.
I think the videos also sucked. Paul disagrees with me on that.
His next idea was to ditch any disclaimer about how the video was made. He noticed just how high the demand for true crime was and how low the production value could be for a fake documentary. With that, True Crime Case Files was born.
“It was almost sort of like a gold rush. I really felt like I needed to stake my claim before anybody else thought of it,” he said.
Even though Paul’s videos are themselves variations on a popular genre, his channel’s success had, at least in this singular respect, proved him right. Several other copycat channels either rip his videos entirely or mimic their style. A few even copied the title of Paul’s most popular video and posted their own AI-generated versions. None are quite as popular, though.
‘It’s an absurdist art form’
To debate with Paul about the ethics of his videos means constantly retreading the same ground. It can feel futile, but here’s what he says:
“True crime […] at the end of the day, it’s a form of entertainment. Viewers are watching this not to be informed about things that will affect them personally. They’re really just there to be entertained and to have a thrilling mystery with some lurid elements,” he said.
Okay, sure. I buy that.
“It’s almost become this national pastime, like bullfighting,” he said. “People just passively observe it, and they don’t even question, ‘Why are we enjoying this violence so much?’”
True crime is, of course, astronomically popular. More than half of all Americans say they consume some form of the genre, and true crime made up three of Apple’s 10 most popular podcasts of 2024.
It’s also not a new argument that the true crime genre might have some problems. According to some experts, it might revictimize people who have already suffered. It acts similarly to local TV news that leads with the bloodiest crimes of the night, which research shows makes viewers more afraid. Other experts also say it influences our ideas of common crimes, victims and investigators; the way we see “true” evil committed on screen shapes how we expect it to happen in real life.
Paul says what he’s doing is no worse than the actual sins of the true crime genre. In fact, he says his version is better because he isn’t exploiting any real victims. Viewers get their pint of blood, he makes his videos—and money—and no one is harmed in the process.
“There’s really no difference between us except that [I am] not using real people and their suffering as my vehicle,” he told me in an email.
That doesn’t sit right with Annie Nichol, a victims’ advocate in Washington. Her sister, Polly Klaas, was murdered in 1993 and became the subject of exhaustive true crime documentaries, podcasts and television adaptations. Nichol has plenty of her own problems with true crime media, but she says what Paul is doing isn’t any better.
“Victims are used in this way by the media and by true crime content creators,” she told me in an interview. “Where our trauma is frequently co-opted and exploited for profit. So someone generating AI true crime for profit is certainly not helping victims in any way.”
Nichol says that even if Paul isn’t using real stories from survivors of violent crimes, the bigger societal impact is the same. The trappings of reality let the audience walk away with the same impressions as they would with an actual crime.
Any criticism, though, falls on deaf ears with Paul.
“It’s an absurdist art form,” he said. “If people don’t understand it, that says a lot about human nature and their own natures and the nature of crime, and perhaps they’re not willing to question themselves, but I don’t have any misgivings about what I’m doing.”
Content not available
When I asked YouTube for comment on this story, I had a list of questions about how Paul’s channel was monetized and whether he had run afoul of the platform’s guidelines. After I asked for comment, YouTube nuked the channel and four others associated with Paul, including the Hallmark parody channel.
“We terminated the channel in question for multiple violations of our Community Guidelines, including our policies covering child safety that prohibit the sexualization of minors,” Jack Malon, a YouTube spokesperson, wrote in a statement.
The channel’s views had dropped dramatically in the past few months, and at least one video was hit with a community note calling it false, making it seem likely his videos had already been reviewed in some capacity. Paul told me he’s trying to appeal the ban. (Audio-only versions of all his videos are still accessible on Spotify and other podcast players. Spotify did not respond to a request for comment for this story.)
But Pandora’s AI-generated box is open. Paul and creators like him have shown people there’s a new way to make money on the internet without much work, child safety policies be damned. The fact that YouTube deleted this channel doesn’t mean that it’s taking a broader stand against AI-generated content or AI-generated “true crime.”
YouTube channels with names like “Hidden Family Crime Stories,” “True Crime Cases,” “True Crime Home” and “Crime Tapes” are pumping out ever greater numbers of AI-generated murder stories just like the ones Paul made.
He got his gold rush after all.