This week, Jesselyn Cook and Sebastian Murdock published a fascinating investigation into how child predators are using YouTube to watch videos of children in compromising situations — and how the platform's algorithms and poor oversight have created an environment where kids are at risk. Jesselyn spoke with Must Reads about their reporting:

Let's just clarify something at the top: there is no actual child pornography on YouTube, right?
Right. The videos we're talking about here are most often individually innocuous and compliant with YouTube's content policies — home videos of little girls in bathing suits, for example. There's nothing wrong with parents uploading that kind of footage to YouTube. The problem, in part, is how YouTube's automated recommendation engine serves those videos en masse to viewers with already-predatory viewing habits.
Tell us briefly what YouTube's recommendation algorithm does, and how it may drive people to extreme content.
The algorithm automatically generates video suggestions and brings users from one clip to the next. It's extremely powerful — driving more than 70% of traffic on a website with 2 billion users. Its goal is to keep you watching for as long as possible in order to maximize advertising revenue, according to one of the engineers who helped create it. To do this, the algorithm will use your viewing history and other factors to determine where your interests lie, then serve you a stream of related videos. For pedophiles, this can mean lots of clips of kids in their underwear.
Playlists and comments present another issue. Explain how they're helping child predators find victims on the platform.
YouTube has a tool that lets you compile videos from other people's channels into your very own public playlist. It's a useful feature — why scour the site for videos of the best spicy food recipes when you can find a playlist from someone who's already done the work for you? Pedophiles are freely hunting for clips of little kids and pulling them into fetish-themed playlists with titles such as “wet clothed girls” and “Girls passed out.” YouTube removed those two playlists, along with several others, after HuffPost brought them to its attention.
Pedophiles are also exploiting children through videos' comments sections. More than a year ago, Disney and other major brands pulled their advertising dollars from YouTube over concerns about predatory comments on videos of children. YouTube suddenly jumped into action and announced that it would ban comments on videos featuring minors. But once again, this policy has been poorly enforced: YouTube recommended to HuffPost an abundance of videos starring kids that still had comments enabled. Our article tells the story of an 11-year-old girl who has been manipulated and groomed via comments on her own videos.
What could YouTube do to stop this problem?
Child safety activists and at least one senator are urging YouTube to simply stop algorithmically recommending videos that prominently show minors altogether. They argue this would drastically reduce kids' exposure to pedophiles, who would no longer be served a seemingly endless lineup of videos featuring children. YouTube has refused to do this, and legally, it doesn't have to: Under Section 230 of the Communications Decency Act, a law passed in 1996 — nearly two decades before the creation of YouTube and similar platforms — it is shielded from liability for the content it hosts and promotes. In response, members of Congress have threatened YouTube and other tech giants with government regulation. Many experts believe regulation to be the only viable solution, as YouTube seems either unable or unwilling to adequately police its own platform.