Companies Halt YouTube Ads Following Reports of Commenter Pedophile Network

NC Susan

Deceased
https://gizmodo.com/companies-halt-youtube-ads-following-reports-of-comment-1832771383

Companies Halt YouTube Ads Following Reports of Commenter Pedophile Network

Catie Keck
Wednesday 8:30pm

Photo: Danny Moloshok (AP)
Companies are pulling their advertising campaigns from YouTube amid reports that a network of pedophiles is openly operating in the comments sections of videos of young children, Bloomberg reported Wednesday. Disney and Nestlé are among those who have reportedly yanked spending after a YouTube video surfaced the ongoing problem.

A video shared by YouTuber Matt Watson on Sunday outlined what he described as a “soft-core pedophile ring” enabled by commenters on videos of children, particularly of young girls. These videos, which are monetized by the company, are flooded with comments by apparent pedophiles who trade contact information and links to child pornography. They also timestamp what Watson said are “points in the video where little girls are in compromising positions, sexually implicit positions.”

Watson called YouTube’s algorithm for surfacing these videos a “wormhole” of exploitative content. Once a YouTube user clicks through several of these videos, their suggested content column becomes flooded primarily with videos of children.

Wired was able to replicate Watson’s claims and said the videos it encountered often included little girls playing, swimming, or eating popsicles, and in some cases more graphic content. Once some of these videos are viewed, Wired said YouTube’s algorithm surfaces videos that appear to be popular with other pedophiles. In many cases, the site reported, videos of young children to which pre-roll ads are attached have racked up hundreds of thousands and even millions of views.

Companies are now opting to distance themselves from the controversy by either contacting YouTube about the problem or pulling the plug on ad campaigns entirely.

“I can confirm that all Nestlé companies in the US have paused advertising on YouTube,” a Nestlé spokesperson told Gizmodo in a statement by email. Bloomberg cited sources who claimed Disney has followed suit, though the company did not immediately return a request for comment.

A spokesperson for Epic Games, the developer behind Fortnite, told Wired that by way of its ad agency, the company had “reached out to YouTube to determine actions they’ll take to eliminate this type of content from their service.” Grammarly told Wired it also contacted YouTube about the issue.

Disturbing and predatory comments on YouTube videos of children resulted in a similar response from advertisers in 2017. The company said at the time that it was working to fix the issue, but it appears to remain a pervasive problem on the site.

A YouTube spokesperson said that the company is working to tackle the issue and has disabled comments on millions of videos of children. The company has also removed more than 400 accounts belonging to commenters on these videos, as well as some videos it believed may put young subjects at risk. The spokesperson added that YouTube is reporting any illegal comments to the National Center for Missing and Exploited Children.

“Any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” a YouTube spokesperson said in a statement by email. “We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
 

NC Susan

Deceased
www.bloomberg.com/news/articles/201...be-ads-amid-concerns-over-child-video-voyeurs

Nestle, Disney Pull YouTube Ads, Joining Furor Over Child Videos

Mark Bergen
Nestle, Epic Games also among companies stopping ad purchases

Walt Disney Co. is said to have pulled its advertising spending from YouTube, joining other companies including Nestle SA, after a blogger detailed how comments on Google’s video site were being used to facilitate a “soft-core pedophilia ring.” Some of the videos involved ran next to ads placed by Disney and Nestle.

All Nestle companies in the U.S. have paused advertising on YouTube, a spokeswoman for the company said Wednesday in an email. Video game maker Epic Games Inc. and German packaged food giant Dr. August Oetker KG also said they had postponed YouTube spending after their ads were shown to play before the videos. Disney has also withheld its spending, according to people with knowledge of the matter, who asked not to be identified because the decision hasn’t been made public.

On Sunday, Matt Watson, a video blogger, posted a 20-minute clip detailing how comments on YouTube were used to identify certain videos in which young girls were engaged in activities that could be construed as sexually suggestive, such as posing in front of a mirror and doing gymnastics. Watson’s video demonstrated how, if users clicked on one of the videos, YouTube’s algorithms recommended similar ones. By Wednesday, Watson’s video had been viewed more than 1.7 million times.

“Any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments,” a spokeswoman for YouTube said in an email.

Total ad spending on the videos mentioned was less than $8,000 within the last 60 days, and YouTube plans refunds, the spokeswoman said.

Two years ago, several major advertisers pulled spending from YouTube, the video site owned by Alphabet Inc.’s Google, after ads surfaced next to extremist and violent content. YouTube has also faced criticism for hosting inappropriate videos meant for kids. Google took several steps over the past two years to reassure advertisers about the problem. Many of the brands that boycotted YouTube, including Procter & Gamble Co. and AT&T Inc., have since returned to buying ads on the site.

YouTube on Tuesday released an updated policy about how it will handle content that “crosses the line” of appropriateness.
 

NC Susan

Deceased
www.youtube.com/watch?v=O13G5A5w5P0

20 min


Youtube is Facilitating the Sexual Exploitation of Children, and it's Being Monetized (2019)
2,666,105 views


MattsWhatItIs

Premiered Feb 17, 2019


Over the past 48 hours I have discovered a wormhole into a soft-core pedophilia ring on Youtube. Youtube’s recommendation algorithm is facilitating pedophiles’ ability to connect with each other, trade contact info, and link to actual CP in the comments. I can consistently get access to it from vanilla, never-before-used Youtube accounts via innocuous videos in less than ten minutes, sometimes in fewer than five clicks. Additionally, I have video evidence that these videos are being monetized.

This loophole is wrong; something needs to be done. It’s being monetized. CP is being traded, as well as social media and WhatsApp addresses. Youtube is facilitating this problem. It doesn’t matter that they flag videos and turn off the comments; these videos are still being monetized, and more importantly they are still available for users to watch.
 

NC Susan

Deceased
https://nyp.st/2E7I9RK

Trolls allegedly uploading hidden suicide messages in YouTube Kids cartoons

By Alex Matthews, The Sun
February 22, 2019

Parents have been issued a chilling warning after trolls uploaded cartoons to YouTube Kids spliced with a hidden message encouraging children to kill themselves.

“This video was intentionally planted on YouTube Kids to harm our children. He waited until parents’ guards were down. How can anyone do this?” fumed one horrified mom, whose son watched the vile video.

It is not known who uploaded the sick clips — with YouTube sometimes leaving them up for days until they were taken down.

The now-deleted video, cut with an episode of popular kids cartoon “Splatoon,” features a clip from YouTube prankster Filthy Frank.

He appears on screen smirking and wearing sunglasses before describing how kids can harm themselves before signing off with “End it.”

Filthy Frank, who has more than 6 million subscribers, racks up millions of views with “anti-PC, anti-social and anti-couth” videos.

Some feature him bathing in a tub of noodles and eating raw squid, but one showed him pretending to be a One Direction fan committing suicide.

He was unavailable for comment when approached by The Sun — but there is no suggestion he played any role in trolls editing the cartoon.

Free N. Hees, a pediatrician, reported the video to YouTube and got the platform to take it down.

She believes these videos could lead to a rise in child suicide.

“Exposure to videos, photos and other self-harm and suicidal promoting content is a huge problem that our children are facing today,” said Hees, who also blogs on child cyber safety under the pseudonym PediMom.

“We need to fight to have the developers of social media platforms held responsible when they do not assure that age restrictions are followed.”

The National Society for the Prevention of Cruelty to Children (NSPCC) in Britain was appalled when The Sun brought the videos to its attention and slammed YouTube and Google for failing children.

“The very fact that these videos are still there, and can be watched, shows that Google and YouTube are not doing enough to safeguard kids,” a spokesperson for the charity said.

“They’re taking them down when they’re eventually alerted — but it shouldn’t have to be up to organizations like The Sun Online to tell them to take these videos down. We don’t know yet what impact these videos will have on kids — but what we do know is that they will be highly distressing for any young child that sees them. It’s massively concerning; these things should not be available on sites like YouTube.”

The video in question has now been removed from YouTube Kids — a channel the platform markets as a safe place for youngsters online.

“We created YouTube Kids to make it safer and simpler for children to explore the world through online video — from their favorite shows and music to learning how to build a model volcano (or make slime), and everything in between,” says a blurb on the website.

“There’s also a whole suite of parental controls, so you can tailor the experience to your family’s needs,” the site continues.

This is not the first time sick content has been spliced with children’s cartoons on YouTube.

In 2017, Peppa Pig cartoons were edited to show distressing scenes — with Peppa getting her teeth pulled out, characters having sex and others being violently attacked.

The recent suicide videos were revealed after an investigation by The Sun showed kids as young as 8 are being targeted by predators and bombarded with sexually explicit messages on a new social media app.

TikTok, which lets users create and share short videos with music and camera effects, has been branded a “magnet for pedophiles” by concerned campaigners and parents.

Meanwhile, another mom has spoken out after her 7-year-old son told other kids they would be “killed in their beds” while playing the deadly “Momo” challenge.

The sick suicide game has swept the web and is already believed to have caused the tragic deaths of two teenagers in Colombia.

The NSPCC has ordered YouTube to improve its technology before easily influenced youngsters start harming themselves.

“We’re really concerned because these videos are being specifically targeted towards children with very dark content,” said Tony Stower, head of child safety online for the NSPCC.

“What worries us is that the YouTube algorithm isn’t clever enough for the site to notice that these videos are not acceptable for kids. As a result they are not putting them behind an appropriate age gate.

“We are pushing the government to introduce a duty of care on sites like YouTube and Facebook. They need to prioritize child protection online. But with products like YouTube Kids, all of that safeguarding is an afterthought. Before they start marketing, they need to make sure their sites are safe for children,” he added.

A YouTube spokesperson claimed many of its offensive videos are taken down before they are viewed.

“We work hard to ensure YouTube is not used to encourage dangerous behavior and we have strict policies that prohibit videos which promote self-harm.

“Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views.”
 