YouTube is stepping up its fight against online terror and abuse, laying out four new steps it’s taking amid recent terrorism-related incidents around the world.
The moves also come a couple of months after advertisers began pulling their ads from YouTube and other Google properties upon learning they were appearing beside extremist content.
Kent Walker, general counsel for Google, wrote in a blog post and an op-ed that appeared in the Financial Times over the weekend that thousands of the company’s employees “review and counter abuse of our platforms,” but that the company is trying to do even more.
That includes making videos that don’t violate company policies but that “contain inflammatory religious or supremacist content” harder to find. Creators of such videos won’t be able to monetize them, and the videos won’t have comments or endorsements enabled.
“We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints,” Walker wrote.
In addition, Google says it’s devoting more engineering resources and advanced machine-learning research to identify and remove extremist and terrorism-related content more quickly.
Also, the company says it will rely on even more human reviewers.
“Because technology alone is not a silver bullet, we will greatly increase the number of independent experts in YouTube’s Trusted Flagger program,” Walker said in the blog post. “Human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech.” The boost will include adding 50 new NGOs to the program, which already had 63 organizations involved.
The fourth step involves an expanded role in anti-radicalization efforts that include advertising aimed at “potential ISIS recruits” that redirects them to anti-terrorist videos. Google also said it’s working with Facebook, Microsoft and Twitter to tackle terrorism-related concerns.
The big internet companies are facing increased pressure from governments, politicians and others who lay some of the blame on online platforms for spreading extremist messages. YouTube has long dealt with such concerns — its platform has hosted user-generated video since before Facebook and Twitter began to focus on video. But this year, the company faced a massive backlash in the form of lost advertising.
Although at least one report pegged the loss of YouTube and other Google advertising over this issue at hundreds of millions of dollars, Google would not confirm that. Rather, the company told SiliconBeat in March that it had begun an “extensive review” and was looking to make changes after brands in the U.K., then elsewhere, pulled their advertising from Google platforms.
Photo: A YouTube logo on Dec. 4, 2012, during LeWeb Paris 2012 in Saint-Denis near Paris. (Eric Piermont/AFP/Getty Images)
Tags: Advertising, extremism, Google, terrorism, video, youtube