LONDON: Matt Brittin, Google’s head of Europe, the Middle East and Africa, is used to being in the spotlight. Extremist content, brand safety, corporate tax avoidance — he has publicly faced questioning about it all.
Now, in light of Russian interference in the 2016 US election and the recent Senate committee hearings in Washington, it’s all about politics.
The US-based tech company has admitted that the Kremlin-linked Internet Research Agency spent $4,700 on advertising as part of a misinformation campaign during the election. It also revealed that 1,108 Russian-linked videos uploaded to YouTube generated 309,000 views in the US.
Although Google has since launched initiatives to increase transparency and strengthen security in response to the revelations, Brittin told reporters at Google’s European headquarters in Dublin that more needs to be done.
“Any misrepresentation is bad and we have continued to look at how we can improve our policies and transparency,” he said. “Any time there’s [an] electoral process we really want to make sure that our role in that is clean and legitimate and supportive in all of the things that you would expect. And we work hard to do that.”
According to Brittin, who is president of EMEA business and operations at Google, “bad actors” were attempting to use Google’s systems and platforms for “bad purposes”, and had been trying to do so for some time.
“We’ve constantly tried to put in place policies and controls and verifications to try to stop that from happening,” he said. “We’ve made some good progress and we obviously need to do more.”
Following Russia’s suspected interference in the 2016 presidential election, Google has conducted a deep review of how its ad systems are used. Some changes have already been made to the company’s policies, and a transparency report for election ads, due for release in early 2018, should shed more light on the issue.
The furor over political ads, however, is far from Google’s only problem. Concerns over privacy, tax avoidance, ad fraud and brand safety have shadowed the company over the past few years. In March, for example, Brittin had to issue an apology to the advertising industry after brands found their ads appearing next to controversial content on YouTube.
All of which goes hand-in-hand with a discernible backlash against the tech industry. While Facebook has taken the most flak, Google stands accused of being too big and too powerful. It is an accusation that Brittin acknowledges.
“Because of the pace of change in how everyday people are using technology, communicating, accessing information, creating and sharing their own content, that change throws up a whole bunch of new questions for all of us,” said Brittin. “And what I want to make sure that Google does is [be] in the room when there’s a conversation about those things going on and we can explain what we do today. Because quite often that’s misunderstood or not researched that thoroughly.”
Brittin uses fake news as an example.
“Fake news has become a topical term — an umbrella term spanning everything from what people don’t like that’s written about them to genuinely misrepresentative stuff,” said Brittin. “So in a world where 140 websites about US politics come from a Macedonian village, that’s clearly misrepresentative and fake and we need to work hard to tackle that. Bad actors and anyone with a smartphone being able to create content is a challenge.
“We’ve tried to do two things in this category. We try to help quality content thrive, and we have tried to identify and weed out the bad actors. The amount of work we do on weeding out the bad actors is phenomenal and not that widely known.”
Google said it took down 1.7 billion ads for violating its advertising policies in 2016, double the number taken down in 2015. It also removed more than 100,000 publishers from its AdSense network and expanded its inappropriate content policy to cover dangerous and derogatory material. The company is also using artificial intelligence and machine-learning tools to better detect suspicious content.
Meanwhile, projects such as the Digital News Initiative, a partnership between Google and publishers in Europe, are supporting high-quality journalism through technology and innovation.
“I think about three groups really: users, creators (in the broader sense, whether it’s entrepreneurs or journalists or content creators of videos or app developers), and advertisers,” said Brittin.
“And if we want the next five billion people to come online to have the benefits of the services and the content that we enjoy today, we need to make sure that that ecosystem continues to work well.
“The online world is just like the world. There are complexities and challenges and there are bad actors there too, and what we need to do as an industry is come together to make it as safe as we can do. We can’t always guarantee 100 percent safety, but what we can do is put in place rules and principles and practices and so on that help people to use this and navigate the highway safely.”