YouTube shooting shows how fake news spirals on social media

YouTube employees are seen walking away from YouTube headquarters following an active shooter situation in San Bruno, California, U.S., on Tuesday, April 3, 2018. (REUTERS)
Updated 04 April 2018

PARIS: Within minutes of the shooting at YouTube offices in California, social media was awash with conspiracy theories and images of the supposed “shooter” wearing a Muslim headscarf.
Some Facebook videos were quick to claim that it was a “false flag” attack, carried out to discredit the powerful US gun lobby in the wake of the Parkland high school massacre in Florida.
With wildly exaggerated accounts of the death toll circulating, several pictures of the purported attacker and some of the “victims” posted to Twitter Tuesday turned out to be of well-known YouTubers.
Other widely shared posts speculated that the attacker had been provoked by YouTube censoring political content, and one Twitter user posted an image depicting the suspect as Hillary Clinton in a headscarf.
The account was later suspended.
Hoaxers also took advantage of the situation, posting several pictures of the US comic Sam Hyde, who is known for Internet pranks.
None of this came as any surprise to researchers at the Massachusetts Institute of Technology, whose report last month found that false news spreads substantially farther and faster on Twitter than real news.
“We found that falsehood diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information,” said Sinan Aral, a professor at the MIT Sloan School of Management.
They found that false political news reached more people faster and went deeper into their social networks than any other category of false information.

While Russian troll factories have taken much of the blame for attempting to poison political discourse in election campaigns across the US and Europe, the team from the MIT Media Lab found that fake news spreads not because of bots but because people retweet inaccurate reports.
The researchers found that "false news stories are 70 percent more likely to be retweeted than true ones. It also takes true stories about six times as long to reach 1,500 people as it does for false stories."
While real news stories are rarely retweeted by more than a thousand people, the most popular fake news items are routinely shared by as many as 100,000.
Emma Gonzalez, one of the Parkland students leading the #NeverAgain movement for tougher gun control, has become a particular target of misinformation attacks in recent weeks.
A doctored picture of her ripping up the US Constitution trended last week, exposing her to a wave of online vitriol. In the original image, from a photo shoot for Teen Vogue magazine, she was ripping up a gun target.

Another fake meme went viral purporting to show Gonzalez attacking a gun supporter's truck; it was in fact an image of the then shaven-headed pop star Britney Spears during an infamous 2007 meltdown.
Rudy Reichstadt, of the Conspiracy Watch website, said disinformation feeds on the “shock and stupor” that traumatic events create.
“We now have conspiracy theory entrepreneurs who react instantly to these events and rewrite unfolding narratives to fit their conspiratorial alternative storytelling.”
He said US shock jock and Infowars founder Alex Jones, a prominent pro-gun activist, had set the template for generating fake news to fit a particular agenda.
He plays up “conspiracy theories every time there is a new shooting,” Reichstadt told AFP. “He is a prisoner of his own theories and is constantly trying to move the story on (with new elements) to keep the conspiracy alive.”
The France-based researcher said there was now a whole ecosystem of fake news manufacturers, from those who “use clickbait sensationalism to increase their advertising revenue to disinformation professionals and weekend conspiracy theorists who sound off on YouTube.”
The MIT study, which was inspired by the online rumors that circulated after the Boston Marathon attack in 2013, focused on what it called "rumor cascades": unbroken chains of retweets that follow a false claim made by a Twitter user.
Aral said the team concluded that novelty drives the sharing: "False news is more novel, and people are more likely to share novel information. Those who do are seen as being in the know."