How Russian Propaganda Spreads On Social Media
October 29, 2017, 7:05 AM ET
LAURA SYDELL
Earlier this year, a Facebook group page called Blacktivist caught the eye of M'tep Blount.
As a supporter of Black Lives Matter, Blount figured Blacktivist would be a similar group. The Facebook page came with a message: "You have a friend who is a part of this group," and it had a huge following — over 400,000 as of late August.
Blount found that Blacktivist's page shared information about police brutality. Videos often showed police beating African-Americans in small towns. "It was like, 'Wow! This is happening in this community too. I really hope they do something about it but they probably aren't going to,' " she says.
As it turns out, the Blacktivist page was not like Black Lives Matter, at all. It appears to have been linked to Russia, and Facebook has since taken it down. The group was carefully crafted to attract people like Blount whose behavior on Facebook showed they mistrusted police and were concerned about civil rights.
It was just one of the many calculated ways in which social media platforms have been used lately to covertly sow divisions within society. Later this week, Facebook, Google and Twitter will face members of Congress to answer questions in three public hearings about their role in enabling Russian interference in the 2016 U.S. presidential election. The hearings are also expected to shed light on how Russian propaganda has spread in the U.S. through these major social media platforms.
Jeff Hancock, a psychologist who heads Stanford University's Social Media Lab, says that propaganda via a page like Blacktivist was not aimed at changing Blount's mind. It was actually meant to trigger strong feelings.
"Propaganda can actually have a real effect," he says. "Even though we might already believe what we're hearing, this can heighten our arousal, our emotions."
Hancock has studied the ways people are affected by seeing information that confirms some of their beliefs. In his study, he asked people how they felt about an issue before showing them stories. For example, those who thought Hillary Clinton was corrupt were shown stories confirming it. If people were worried about police brutality, he showed them posts of police brutalizing civilians.
"When we have more confirmation that a possible risk is there, whether it's real or not, we perceive it as more risky," Hancock says. So, in Blount's case, if she was already worried about police brutality, then the more times she is exposed to those images the stronger she will feel about it, he says.
This kind of propaganda, he says, is designed to enhance divisions among people and increase "the anger within each other. It's really truly just a simple divide-and-conquer approach."
It's an approach that Russia has frequently used around the world, says Michael McFaul, a former U.S. ambassador to Russia. "They think that that leads to polarization, (which) leads to arguments among ourselves and it takes us off the world stage," he says.
Another potent example is the Twitter account @TEN_GOP, which had more than 100,000 followers. It called itself the unofficial account of the Tennessee Republican Party.
But it was purportedly set up by Russians, and the account has since been shut down. For months, though, it sent out a stream of fake news, such as a tweet falsely stating that there was voter fraud in Florida. That sort of news got plenty of amplification. Though there is no evidence that President Trump or any of his supporters knew of the Russia link, the account was often retweeted by his aide Kellyanne Conway and the president's son Donald Trump Jr. Donald Trump himself thanked the account for its support.
Clint Watts, a fellow at the Foreign Policy Research Institute who has been investigating Russian use of social media, said it showed the power of just one Twitter account and its ability to "actually influence the discussion and be cited in the debate."
Watts says this kind of media propaganda is simply how it works in the digital age, whether it's the Russians, the North Koreans or a fake news site.
Facebook has already handed over to Congress details of 3,000 Russian-bought ads worth $100,000. The company has promised more transparency about who is behind advertising campaigns. Twitter says it will no longer take ad money from two Russian media outlets, RT and Sputnik. Despite efforts by Facebook, Twitter and Google to take action on their own, Democratic lawmakers are pushing legislation that would require Internet platforms to disclose more information about political ads.
McFaul, the former ambassador, believes the companies can do more. "They're not obligated to post a story that they know to be false," he says. "They already regulate free speech and advertisement. You can't advertise guns, for instance, on Facebook."
And there is still a lot that isn't known about the use of digital platforms to spread fake news and propaganda. But Americans may have a chance to learn more when Twitter, Facebook and Google sit down to answer questions in front of Congress this week.
https://www.npr.org/sections/alltechconsidered/2017/10/29/560461835/how-russian-propaganda-spreads-on-social-media
I'm curious to know whether NewsTalkers has a way of protecting its members from bad-faith posters and impostors.
Is there a way for the site to monitor and identify those whose objective is to post for the purpose of seeding division in America, as the Russians did during this past campaign and election?
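For what it's worth, here is one hedged illustration of what that kind of monitoring could look like. The sketch below assumes a site can export each member's posts with timestamps and text; the Post structure, the flag_suspicious_accounts helper, and the hashtag-burst thresholds are all hypothetical, not anything NewsTalkers actually runs. It simply flags accounts that push the same hashtag at an unusually high rate inside a short window, the kind of behavior the #releasethememo comment further down describes.

```python
# Hypothetical sketch: flag accounts whose posting pattern looks coordinated
# (the same hashtag pushed many times in a short window). The names and
# thresholds are illustrative only, not an actual NewsTalkers feature.

from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime, timedelta
import re

@dataclass
class Post:
    author: str          # site username
    timestamp: datetime  # when the post was made
    text: str            # raw post text

HASHTAG_RE = re.compile(r"#\w+")

def flag_suspicious_accounts(posts, window=timedelta(hours=1), max_repeats=20):
    """Return (author, hashtag) pairs where the author used that hashtag more
    than `max_repeats` times inside any `window`-long span. A heuristic only:
    a hit suggests bot-like or coordinated behavior, it does not prove it."""
    # Group timestamps by (author, hashtag)
    by_author_tag = defaultdict(list)
    for post in posts:
        for tag in HASHTAG_RE.findall(post.text.lower()):
            by_author_tag[(post.author, tag)].append(post.timestamp)

    flagged = set()
    for (author, tag), times in by_author_tag.items():
        times.sort()
        start = 0
        # Sliding window: count how many uses of this hashtag fall inside `window`
        for end in range(len(times)):
            while times[end] - times[start] > window:
                start += 1
            if end - start + 1 > max_repeats:
                flagged.add((author, tag))
                break
    return flagged

if __name__ == "__main__":
    now = datetime(2018, 1, 20, 12, 0)
    # Toy example: one account posts #releasethememo 30 times in 30 minutes.
    posts = [Post("spammy_member", now + timedelta(minutes=i), "MUST READ #releasethememo")
             for i in range(30)]
    posts.append(Post("regular_member", now, "Interesting article #politics"))
    print(flag_suspicious_accounts(posts))
    # -> {('spammy_member', '#releasethememo')}
```

A frequency heuristic like this catches only the crudest spamming; coordinated troll accounts that post at human-like rates would need other signals (shared links, identical wording across accounts, account age), so treat it as a starting point, not a solution.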
It's actually what the rest of the world calls "news". Only you and your AI friends think otherwise.
Well, I personally know of at least one person here who is run by puppetmaster Vladimir Putin.
But I'm going to keep their identity secret for now.
I will finally reveal their identity soon (when I am subpoenaed to testify by Robert Mueller's committee sometime in the near future-- stay tuned!)
Excellent observation! BTW how did you know she is Korean? (Most people wouldn't be curious enough to try to find out who she is...)
It sure seems like there was at least one member here who was spamming #releasethememo with an unusual aggressiveness ... just like a Russian bot.
In response to an article decrying propaganda, it's amusing to see you perpetuate fake news made up by Adam Schiff that's been debunked for weeks. Schiff played you like a drum, and rather than being embarrassed, you continue advertising your gullibility.
A relevant example of using Russian propaganda was when a member seeded articles from Putin's own propaganda source, Sputnik News, to attack the President.
Okay fellas - This article wasn't posted to point fingers. It's a legitimate question about how sites protect themselves.
I don't want this turning into a "who's worse" be-otch session.
"Russian Propaganda" is a drop in the ocean compared to the amount of good ole' American propaganda generated at home by politicians and pundits.
NT is much more likely to have a member that is paid by a political campaign to post favorable information than a Russian bot.
I don't know Sean. I think the group is small enough that at least the long standing members know each other well enough to know whether something is off.
Truth be told, I don't think we're that sophisticated.
I'd like to hear how the Russians were helping Trump by provoking black activists. Unless it was an extremely calculated effort to turn would-be Trump voters off from Hillary by having more Fergusons, there is no connection. But listening to the Trump haters, you'd think half of Russia spent 100% of their time helping Trump.
Show some patience my good man! Robert Mueller works in mysterious ways-- all will be revealed soon.
As the cat said when they cut its tail off:
It won't be long now!
The story was a conduit to the conversation of security and how to better identify those who aren't here to move the discussions forward.
I can certainly find another article that doesn't highlight how the black community was hijacked to promote a fictional story line if you think it would help.
Your article specifically mentions Russian trolls who disguised themselves as black activists. Do you not read the articles you post or is it some lack of comprehension on your part?
Yes, I read what I post. Did you read my intro comment?
I'm interested in hearing what you think on the issue of cyber security and how to combat bad actors.
I'm not interested in silliness. It's your choice.
Nope, just a couple of dozen hackers on three shifts at IRT or whatever it was in Russia.
I mean who has the time to waste all day posting on social media............
Wait? What?
Robert Mueller works in mysterious ways-- all will be revealed soon.
In the meantime, I will be conducting my own "witch-hunt"-- so let's start with this:
The accounts allegedly shared content on issues such as police violence to appeal to young African-Americans.
Buzzfeed says it has tied them to the Internet Research Agency, a group known for its pro-Kremlin social media posts.
A researcher at the University of Oxford said there were "parallels" with Russian activity on Facebook.
Jonathan Albright of the Tow Centre for Digital Journalism, who worked with Buzzfeed on the investigation, said the accounts seemed to be part of an "ongoing campaign" that began in early 2015.
Popular posts criticised Democratic presidential candidate Hillary Clinton, supported independent Senator Bernie Sanders and highlighted police violence against black communities, according to Buzzfeed. (link)