Google moves on extremist YouTube content

Now Google is taking a new approach: bury them.

Criticism of Google's stance on brand safety has been swirling for some time. News Corp's Nicole Sheffield recently told Australia's Google MD Jason Pellegrino at the AdNews Media + Marketing Summit in Sydney: "the reality is you put one image out to advertisers and media clients about what you are and this is only one part of the overall audience you are attracting".

Alphabet Inc's Google says it is ready to make its platforms safer with tools that identify and remove terrorist or violent extremist content. The process for handling videos that do not clearly violate specific rules of conduct is more complicated. Some of the most high-profile brands had already quit the video streaming platform because their ads were being placed on or next to extremist content.

"Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all", Google's general counsel Kent Walker explained. The company said that it thinks "this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints".

Google will increase its use of technology to identify extremist and terrorism-related videos across its sites, which include YouTube, and will boost the number of people who screen for terrorism-related content, Walker said.

Google said it would rely on the specialized knowledge of groups with experts on issues like hate speech, self-harm and terrorism.

The company will also train new content classifiers to identify and remove such content more quickly.

The final steps in Google's plan are notable. YouTube has become a magnet for extremist groups, which can reach a wide audience with racist or intolerant views, and this is the company's latest move against extremist and terror-related video content on the platform.

During the final six months of 2016, Twitter suspended nearly 377,000 accounts for promoting terrorism.

One of the London Bridge attackers reportedly became a follower of the extremist preacher Ahmad Musa Jibril through social networks such as YouTube, the BBC reported.

And, Walker said, YouTube would expand its role in counter-radicalisation efforts.

Google did not immediately respond to a request for comment. "We have used video analysis models to find and assess more than 50% of the terrorism-related content we have removed over the past six months", Walker said. Put simply, people searching for terrorist recruitment material will be shown targeted adverts that redirect them to content debunking terrorist recruiting messages.

He acknowledged that more must be done across the industry, and quickly.

"Still today there is illegal content easily accessible on YouTube - including terrorist propaganda".

The post comes after a raft of brands began pulling advertising from YouTube amid brand safety concerns.