
'Dark pools' of hate flourish online. Here are four controversial ways to fight them | Science | AAAS

  

Category:  News & Politics

Via:  jbb  •  last year  •  6 comments

By:   newsfromscience

A new study maps the "ecology" of online hate groups across platforms

SEEDED CONTENT



For moderators of social platforms such as Facebook, trying to quell the sickening churn of online hate culture is like playing a twisted game of whack-a-mole. As soon as one online hate group is squashed, another pops up in its place—often on another platform. And leaving the radical viewpoints of such groups unchecked, experts say, can have deadly consequences.

With this problem in mind, a new study treats online hate as a living, evolving organism and tracks its spread and interactions over time. The research team, led by physicist and complexity researcher Neil Johnson of George Washington University in Washington, D.C., created mathematical models to analyze data from social platforms including Facebook and the Russian social network Vkontakte (VK), where users can form groups with others of similar views.

"This is an important and very timely study," says Ana-Maria Bliuc, a social and political psychologist at the University of Dundee in Australia, who was not involved in the work. "[It] goes beyond what we know … by providing evidence on how online platforms help 'haters' unify across platforms and create 'hate bridges' across nations and cultures."

Hate has many definitions. The researchers defined "hate groups" as those whose users expressed animosity or advocated violence against a particular race or social group. Using human-guided algorithms and their knowledge of some already banned groups, researchers identified more than 1000 hate groups on multiple platforms, including those that referred to themselves as Neo-Nazi, anti-Semitic, or supporters of the Islamic State group.

As they began the project, the researchers expected the online hate ecosystem to be similar to a well-stocked "supermarket," with white supremacists in one aisle, anti-Semites in another, and misogynists in another still. "But that's not what we found at all," Johnson says. Instead, it was more like a continuous spectrum, where many kinds of hostility bleed into one another—a "superconnected flytrap" that pulls people further into a broad, ever-changing online hate community.

The researchers mapped interactions between related groups, primarily by tracking posts that linked to other groups. Their breakthrough came when they realized they had to track these interactions across social platforms. Instead of hate groups gathering in a single place, they often meet on many different networks—and tighter restrictions on one platform simply cause hate communities on other platforms to strengthen, Johnson and his co-authors report this week in Nature.
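In graph terms, that bookkeeping amounts to a directed, weighted network whose nodes are (platform, group) pairs. The Python sketch below is a minimal illustration with invented group names and link data, not the study's actual pipeline:

    # A minimal sketch of the cross-platform bookkeeping: nodes are
    # (platform, group) pairs; a directed edge records posts in one
    # group that link out to another. All data here is invented.
    import networkx as nx

    G = nx.DiGraph()

    # Hypothetical harvested links:
    # (source platform, source group, target platform, target group)
    observed_links = [
        ("facebook", "group_a", "vk", "group_x"),
        ("facebook", "group_a", "gab", "group_y"),
        ("vk", "group_x", "vk", "group_z"),
    ]

    for sp, sg, tp, tg in observed_links:
        u, v = (sp, sg), (tp, tg)
        if G.has_edge(u, v):
            G[u][v]["weight"] += 1  # weight = number of linking posts
        else:
            G.add_edge(u, v, weight=1)

    # Edges that cross platforms are the "hate highways" described below.
    highways = [(u, v) for u, v in G.edges if u[0] != v[0]]
    print(highways)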

The cross-platform linkages, which the researchers refer to as "hate highways," form especially quickly when a group feels threatened or watched, they say. In the aftermath of the 2018 shooting at Marjory Stoneman Douglas High School in Parkland, Florida, for example, many media outlets discussed the shooter's interest in the Ku Klux Klan (KKK). In turn, online KKK groups likely felt increased scrutiny, Johnson says. He and his colleagues found a spike of posts in KKK Facebook groups linking to hate groups on different platforms, such as Gab or VK, strengthening the "decentralized KKK ideological organism."
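One simple way such a spike could be surfaced, purely as an illustration and not the study's method, is to tally a group's outbound cross-platform links per day and flag days far above its baseline:

    # Illustrative only: count one group's outbound cross-platform
    # links per day and flag days well above its typical rate.
    from collections import Counter
    from statistics import mean

    # Hypothetical records: (date, cross-platform links in that post)
    posts = [
        ("2018-02-10", 0), ("2018-02-11", 1), ("2018-02-12", 0),
        ("2018-02-14", 1), ("2018-02-15", 5), ("2018-02-16", 7),
    ]

    daily = Counter()
    for date, out_links in posts:
        daily[date] += out_links

    baseline = mean(daily.values())
    spikes = [day for day, n in daily.items() if n > 2 * baseline]
    print(spikes)  # ['2018-02-15', '2018-02-16']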

Hate highways can be powerful in uniting people across geographic boundaries, Johnson says. "With one link between 10,000 Neo-Nazis in the U.K. and 10,000 Neo-Nazis in New Zealand, and then another with 10,000 in the U.S., suddenly, within two hops, you've connected 30,000 people with the same brand of hate," he says.
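The arithmetic is easy to verify on a toy network; the cluster names and sizes below simply mirror Johnson's example:

    # Toy check of the two-hop arithmetic: three clusters of 10,000
    # members each, joined by only two links.
    import networkx as nx

    sizes = {"uk": 10_000, "nz": 10_000, "us": 10_000}

    G = nx.Graph()
    G.add_edge("uk", "nz")  # one link
    G.add_edge("nz", "us")  # another link

    # Clusters reachable from the UK cluster within two hops
    # (the source itself counts, at distance zero).
    within_two = nx.single_source_shortest_path_length(G, "uk", cutoff=2)
    print(sum(sizes[c] for c in within_two))  # 30000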

What's more, some of the more permissive platforms can serve as incubators, especially when a group's presence on one platform is banned. When Facebook cracked down on KKK groups in 2016 and 2017, for example, many of their former members fled to VK, a social media platform popular in Russia and Eastern Europe. There, they found a virtual "welcoming committee," Johnson says, with entry pages directing them to communities of "people that hate like you."

The researchers discovered that many of these VK groups had few links to outside social media platforms, but lots of links to other hate groups in the VK ecosystem: "dark pools" that grew in isolation from other social media networks.
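One plausible way to operationalize that pattern, again a sketch rather than the paper's definition, is to flag groups whose links stay almost entirely on their own platform:

    # Sketch: call a group a "dark pool" if at least 90% of its links
    # (in or out) stay on its own platform. The threshold is an
    # assumption made for illustration.
    import networkx as nx

    def is_dark_pool(group, graph, min_internal_ratio=0.9):
        """graph: nx.DiGraph whose nodes are (platform, group) pairs."""
        platform = group[0]
        nbrs = list(graph.successors(group)) + list(graph.predecessors(group))
        if not nbrs:
            return False
        internal = sum(1 for n in nbrs if n[0] == platform)
        return internal / len(nbrs) >= min_internal_ratio

    # Invented example: two VK groups that link only to each other.
    G = nx.DiGraph()
    G.add_edge(("vk", "pool_1"), ("vk", "pool_2"))
    G.add_edge(("vk", "pool_2"), ("vk", "pool_1"))
    G.add_edge(("facebook", "group_a"), ("gab", "group_y"))

    print([g for g in G.nodes if is_dark_pool(g, G)])
    # [('vk', 'pool_1'), ('vk', 'pool_2')]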

Later, the members of several KKK groups returned to Facebook after finding ways to sneak around existing content restrictions. One such group consisted of posters writing primarily in Ukrainian. When the Ukrainian government banned VK in May 2017, those communities reincarnated on Facebook as КуКлухКлан, the Cyrillic form of "Ku Klux Klan," to avoid detection by English search algorithms.

That suggests the typical method of combating such groups—banning particularly active and hateful ones—is ineffective, the researchers say. Instead, they recommend four policies derived from their models to destabilize online hate communities.

The first advocates quietly removing small groups from the platform, while leaving larger ones in place. Because small groups eventually grow into larger ones, this policy nips them in the bud. The second policy is to randomly ban a small subset of users. Such a ban, Johnson says, would be less likely to enrage a large group, and he proposes it would also decrease the likelihood of multiple lawsuits.
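As a toy illustration of how those two policies act on a population of clusters (the sizes, threshold, and ban rate here are invented, not drawn from the paper's models):

    # Toy illustration of policies one and two; all numbers are
    # assumptions made for the example.
    import random

    random.seed(0)
    clusters = [random.randint(5, 5_000) for _ in range(200)]  # members

    # Policy 1: quietly remove small clusters before they can grow.
    SMALL = 100
    survivors = [c for c in clusters if c >= SMALL]

    # Policy 2: ban a small random share of users platform-wide, so
    # the loss is spread thinly across every cluster.
    BAN_RATE = 0.05
    thinned = [sum(1 for _ in range(c) if random.random() > BAN_RATE)
               for c in clusters]

    print(f"{len(clusters)} clusters -> {len(survivors)} after policy 1")
    print(f"{sum(clusters)} members -> {sum(thinned)} after policy 2")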

Two other options are more controversial. One advocates creating antihate groups that would attempt to engage with the hate communities, in theory keeping them too preoccupied to actively recruit. The last would introduce fake users and groups to sow dissent among the hate groups. "If you add noise, the narratives in the groups can begin to drift off track," Johnson says.

"This [is] worrisome on many fronts," says Sarah Roberts, an internet scholar at the University of California, Los Angeles, who studies content moderation. For one thing, she says, users who engage in either of these two policies could experience negative mental health effects. "What support systems would they have to deal with these engagements? What conflict resolution training? And what would happen if these interactions spread from online engagements into real-world interpersonal violence?"

Bliuc agrees. She says the engagement strategy is "an interesting idea," but that more work is needed to understand how—and whether—such online interaction can change the opinions of ideologically opposed groups, especially in light of research that shows it can result in further polarization.

Johnson admits the models in his study are "highly idealized." Platforms wouldn't have to use every policy, he says—each site could decide which strategies best fit its operation. "I think different platforms would choose different policies," Johnson says. "But if they each do this, our calculations suggest that these four policies will dampen the hate down."

Beyond the question of how well the policies could work, however, is the question of how they fit in with the guiding practices of most social media platforms. Roberts doubts they would be willing to take on the massive ethical and legal risks—and investments—necessary for a unified stance. "The fundamental hypothesis that seems to underpin the study … is that the platforms themselves have definitively and unequivocally taken a stand against 'hate' on their sites," she says. "I don't know that this is an assumption that can be made."


JBB
Professor Principal
1  seeder  JBB    last year

How dark pools of hate evolve from seemingly innocuous sites...

Right Down the Center
Senior Guide
1.1  Right Down the Center  replied to  JBB @1    last year

Dark pools of hate sounds a little like deep state.  Some shadowy entity that no one can quite prove.

Snuffy
Professor Participates
1.1.1  Snuffy  replied to  Right Down the Center @1.1    last year

That can't be right.  Wasn't the seeder among the loudest proclaiming there was no deep state?  

Right Down the Center
Senior Guide
1.1.2  Right Down the Center  replied to  Snuffy @1.1.1    last year

Some folks seem to pick and choose what conspiracy theory to subscribe to.

Hallux
PhD Principal
1.1.3  Hallux  replied to  Right Down the Center @1.1.2    last year

Some folks just don't choose and simply subscribe to whatever it is that's up Tucker's butt; they want one too.

Right Down the Center
Senior Guide
1.1.4  Right Down the Center  replied to  Hallux @1.1.3    last year

I don't give any thought to what Tucker has up his butt.  
