Google debuts new measures to curb extremist videos on YouTube

DUBAI: Google has announced the introduction of four new measures to curb the spread of terrorist content on YouTube, saying more immediate action is needed to meet the challenge posed by such material.
The internet giant, which owns video-sharing site YouTube, said it would take a tougher position on videos containing supremacist or inflammatory religious content by placing them behind a warning and making them ineligible for monetization, recommendation, comments or user endorsements, even if they do not clearly violate its policies.
In a blogpost, Kent Walker, senior vice-president and general counsel at Google, said: “The threat posed by violence and hate is a critical challenge for us all. Google and YouTube are committed to being part of the solution.
“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now,” he added.
Walker went on to outline the four measures Google will take to counter the spread of inflammatory content online.
Firstly, Google will increase its use of technology to help identify extremist and terrorism-related videos.
“This can be challenging: a video of a terrorist attack may be informative news reporting if broadcast by the BBC, or glorification of violence if uploaded in a different context by a different user… We will now devote more engineering resources to apply our most advanced machine learning research to train new ‘content classifiers’ to help us more quickly identify and remove extremist and terrorism-related content,” Walker wrote.
Secondly, “we will greatly increase the number of independent experts in YouTube’s Trusted Flagger program,” Walker noted.
“Machines can help identify problematic videos, but human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech.”
Thirdly, Walker said a tougher stance would be taken on videos that do not clearly violate Google’s and YouTube’s policies — for example, videos that contain inflammatory religious content.
In the future, “these will appear behind an interstitial warning and they will not be monetized, recommended or eligible for comments or user endorsements. That means these videos will have less engagement and be harder to find.”
Lastly, Google will expand its collaboration with counter-extremist groups to identify content that may be used to radicalize and recruit extremists, it said.
The company will also reach potential Daesh recruits through targeted online advertising and redirect them towards anti-terrorist videos in a bid to change their minds about joining.
— With Reuters