- YouTube announced on Wednesday plans to crack down on extremist videos on its site that advocate neo-Nazi, white supremacist, and other bigoted ideologies.
- The new policy, laid out in a company blog post, will ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion.”
- The new policy will also prohibit videos that deny the existence of “well-documented violent events,” including the Holocaust and the shooting at Sandy Hook Elementary.
- A YouTube spokesperson on Wednesday told Business Insider that “thousands” of channels will be removed as a result of the new policy.
- Visit Business Insider’s homepage for more stories.
YouTube unveiled a sweeping new plan Wednesday to crack down on extremist videos that advocate neo-Nazi and bigoted ideologies, as well as videos promoting obvious conspiracy theories, eliciting cautious accolades from critics who say the site has been too lax in removing dangerous content.
The new policy, laid out in a company blog post, will ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.”
The new policy will also prohibit videos that deny the existence of “well-documented violent events,” including the Holocaust and the shooting at Sandy Hook Elementary.
A YouTube spokesperson on Wednesday told Business Insider that “thousands” of channels will be removed as a result of the new policy. The company did not name any existing channels that will be removed.
YouTube’s policy update on Wednesday consists of three main parts:
- Broadening the definition of “hateful” content that’s not allowed on YouTube
- Reducing recommendations of ‘borderline’ content like flat earth claims and miracle cures
- Strengthening efforts to remove ads from videos that ‘brush up’ against hate speech rules
The move comes as the Google-owned video site faces a growing outcry over its role in enabling the spread of misinformation and hate speech. YouTube’s algorithms have repeatedly been found to recommend crackpot videos peddling bizarre conspiracy theories on mass shootings and phony cures for illnesses.
And it comes just one day after YouTube publicly refused to take action against YouTube star Steven Crowder for making repeated homophobic and racist comments about Vox journalist Carlos Maza.
Crowder – who has 3.8 million subscribers – frequently refers to Maza’s sexuality and ethnicity on his show, using phrases like “lispy queer” and “gay Latino” to describe the journalist.
Maza tweeted about the harassment last week and YouTube responded on Tuesday, saying that although Crowder’s language was “clearly hurtful,” it did not constitute a violation of its policies.
On Wednesday, after the new policy update was released, YouTube announced that it had “suspended” monetization for Crowder’s channel, chalking up the decision to a “pattern of egregious actions” that “has harmed the broader community.”
Update on our continued review–we have suspended this channel’s monetization. We came to this decision because a pattern of egregious actions has harmed the broader community and is against our YouTube Partner Program policies. More here: https://t.co/VmOce5nbGy
— TeamYouTube (@TeamYouTube) June 5, 2019
Critics say big questions remain
Whether the new policies prove effective remains to be seen.
Many observers applauded YouTube’s efforts to crack down on toxic content, but voiced wariness about the video site’s track record.
“For too long, hate speech and harassment have flourished on YouTube despite clear policies against them,” said Madihha Ahussain, Muslim Advocates’ special counsel for anti-Muslim bigotry, in an emailed statement.
“A critical question remains: how will the company enforce this new policy, especially against popular and profitable YouTubers who espouse anti-Muslim bigotry?” Ahussain said.
In its blog post on Wednesday, YouTube said it had already taken a tougher stance on hateful and supremacist content by limiting recommendations and not allowing certain features like comments or the ability to share for those videos. The company said those efforts, which began in 2017, reduced the views on these videos by 80% on average.
“It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence,” YouTube said in its blog post Wednesday. “We are committed to taking the steps needed to live up to this responsibility today, tomorrow and in the years to come.”
More “authoritative voices”
Its policy update on Wednesday now prohibits supremacist videos, including those that are “inherently discriminatory,” such as videos that “glorify Nazi ideology,” the company said.
It said the move to purge “borderline” videos builds on January tests that limited recommendations of such videos in the US. The number of views these borderline videos received from recommendations dropped by 50% on average, YouTube said. By the end of 2019, the company said, it will implement these changes to borderline videos across more countries.
Alongside recommending less misinformation and potentially harmful content, YouTube also vowed to continue its efforts to promote more videos from authoritative sources – like top news sites.
Finally, the company said that it will be “strengthening its enforcement” of the rules determining which channels are allowed to run ads (and thus make money) through its YouTube Partner Program. The company said that any channels that “repeatedly brush up against our hate speech policies” will no longer be able to monetize.
Do you work at Google? Got a tip? Contact this reporter via Signal or WhatsApp at +1 (209) 730-3387 using a non-work phone, email at firstname.lastname@example.org, Telegram at nickbastone, or Twitter DM at @nickbastone.