Forget Imaginary Nazis, YouTube Has a Real Pedo Problem

Like the rest of the Silicon Valley cabal of “progressive” tech behemoths, YouTube is obsessed with hunting “Nazis”. “Nazis”, of course, being left-speak for anyone who dares hold such heretical beliefs as that gender is a binary, biological fact, or that the failure of socialism has been demonstrated time and time again. Yet while YouTube has been finding Nazis under every stone, it turns out that some very nasty spiders indeed were lurking under the rocks.

YouTube’s algorithm is reportedly facilitating pedophiles by recommending videos of young children to them. YouTube announced Monday that it’s changing its livestreaming policy amid news of the reports.

A Brazilian woman was informed that an innocent home video of her 10-year-old daughter and a friend playing in a backyard pool suddenly garnered more than 400,000 views after being recommended to those who had watched “videos of prepubescent, partially clothed children,” according to one of the researchers’ examples.

“It’s YouTube’s algorithm that connects these channels,” research co-author Jonas Kaiser wrote. “That’s the scary thing.”

To its credit, YouTube is taking action.

After being alerted to the issue, the company banned minors aged 13 and younger from livestreaming without an adult present.

YouTube also removed several videos and appeared to alter its algorithm, according to the New York Times.

Unfortunately, it’s not just the algorithm that’s the problem. Creepy users are also using the commenting system to spread their slimy network.

In March, it disabled comments on videos of children after it was learned that pedophiles were sharing time stamps to sections of video that showed nudity.

“The vast majority of videos featuring minors on YouTube, including those referenced in recent news reports, do not violate our policies and are innocently posted—a family creator providing educational tips, or a parent sharing a proud moment,” the company wrote.

The discovery of a “softcore pedophile ring” on the platform by a YouTuber in February led Disney to stop purchasing ads on the site.

– Daily Dot


It should be pointed out that it’s not YouTube’s fault that paedophiles turn otherwise innocent content into their stalking grounds. The iconic “Coppertone girl” ad, seen by nearly everybody as playful innocence, apparently became a particular favourite of creepy rockspiders.

Indeed, YouTube have found an unlikely defender. Felix “Pewdiepie” Kjellberg is a giant presence on YouTube; he has also been targeted repeatedly by “woke” activists for his sometimes less-than-PC content (you guessed it, he’s a “Nazi”). But, unlike many other creators, Kjellberg is backing YouTube this time.

“PewdiePie” – Felix Kjellberg – has backed MattsWhatItIs and YouTube against a backlash from some creators.

YouTube’s king of content, however, said he supports the action. “This is obviously a swift and short-term solution,” Kjellberg said. He noted the channel Girls Couture Club, for example, which features several young women in activewear and bikinis. Many of the videos feature the girls modeling or doing yoga.

It is channels like this that led to YouTube’s decision to crack down. Though the videos themselves are not inappropriate, it only takes a few commenters to turn things rotten. By sharing timestamps of moments in the videos in which, for instance, the girls are in compromising positions, commenters made straightforward content into something concerning.

“Where do you draw the line on that?” Kjellberg said. “Is family friendly no longer family friendly?”

He was clear that he considers this action by YouTube to be a necessary one.

“It’s clearly something they’ve done just to act swiftly at a time where it’s necessary. And to be honest, I appreciate that YouTube does that.”

– Daily Dot


If only YouTube spent more time ferreting out actual paedophiles than hunting imaginary “Nazis” – let alone enabling creepy, child-grooming content like convicted murderers/drug dealers fawning over ten-year-old drag queens in front of a backdrop juxtaposing a little girl and the word “rohypnol”.
