Michigan terror plot: why rightwing extremists are thriving on Facebook
The platform provides tools for radicalization and coordinated violence, and critics say it’s been slow to ban dangerous groups
Lois Beckett @loisbeckett
Sat 10 Oct 2020 06.00 BST. Last modified on Sat 10 Oct 2020 06.01 BST
In a year of escalating political violence in the United States, Facebook has served as a key organizing tool for violent extremists.
An alleged plot to kidnap the Michigan governor, Gretchen Whitmer, was planned in part on Facebook, with one leader of the scheme broadcasting a video of his frustrations with Whitmer to a private Facebook group, and participants later sharing footage of their paramilitary exercises and bomb-making training, according to an FBI affidavit.
A related Michigan militia group facing terrorism charges also used Facebook to recruit new members, according to the Michigan state police.
Before Michigan, there was the militia group in Kenosha, Wisconsin, that used a Facebook event to encourage armed citizens to take to the streets, and the anti-government “boogaloo” cop-killer in California this May allegedly met his accomplice on Facebook. The deadly neo-Nazi rally in Charlottesville, Virginia, in 2017, was originally organized as a Facebook event.
Facebook has defended itself as working hard to keep users safe and to adapt to emerging threats on its platform, as well as coordinating closely with law enforcement. But evidence has mounted for years that Mark Zuckerberg’s goal of using Facebook to “bring the world closer together” and to “give people the power to build community” has also built powerful tools for radicalization and coordinated violence.
Facebook suggested this week that its conduct in the Michigan case had been “proactive” and exemplary, and that it had played an important role in flagging extremist content to law enforcement authorities. A spokesperson said the company had “proactively reached out and cooperated with the FBI” to provide information for the Michigan investigation more than six months ago.
“We remove content, disable accounts and immediately report to law enforcement when there is a credible threat of imminent harm to people or public safety,” a Facebook spokesperson said.
The FBI said in an affidavit that its Michigan investigation began when it “became aware through social media that a group of individuals were discussing the violent overthrow of certain government and law-enforcement components” in early 2020.
Matt Perault, a former Facebook public policy director who leads a center for technology policy at Duke University, argued that the Michigan case should be seen as an example of Facebook being part of the solution to dealing with extremist radicalization in the US.
“The data is pretty clear in this case that the ability of an informant to join a Facebook group to identify the conduct, and Facebook’s work with law enforcement, seems like it was pretty helpful,” he said, apparently helping “to head off a horrific event before it occurred”.
But federal authorities’ descriptions of a rightwing plot to kidnap a Democratic governor, put her on “trial” as a tyrant and instigate a civil war have also renewed questions about whether Facebook’s current strategy for policing extremism on its platform is adequate, or whether the company needs to fundamentally rethink its approach, acknowledging that what’s good for Facebook may be bad for democracy.
While dangerous groups can and do organize across many different platforms, Facebook is “uniquely dangerous” because it is designed for “algorithmic recruitment”, Evan Greer, the deputy director of digital human rights group Fight for the Future, wrote on Twitter on Thursday.
Facebook has often framed the discussions of how its platform has been used by violent extremists as a question of “free speech”, but many critics say that misses the real issue about the ways in which the company uses its algorithms to recommend extremist content and extremist groups to its users.
“It’s one thing to provide a forum where people can say what they want, even if it’s controversial,” Greer told the Guardian. “It’s a totally different thing to actively help violent bigots recruit other violent bigots into their group using data harvesting and algorithmic recommendations.”
When researchers at the Network Contagion Research Institute began mapping the spread of anti-government “boogaloo” rhetoric on the platform in early 2020, the co-founder Joel Finkelstein said, Facebook began offering up advertisements for purchases relevant to their interest in a coming civil war.
“It started sending us ads for the boogaloo. Buy a boogaloo bag. Get a boogaloo AK-47 inscription on your gun,” Finkelstein said. “That was shocking.”
“We realized the algorithms of Facebook have never met an apocalyptic, militant cult set on killing cops that they didn’t like, and couldn’t merchandise.”
‘The chosen platform of the militia movement’
The Michigan kidnapping plot, and related charges against members of an anti-government militia, are a new case study in the role Facebook has played in emerging extremist threats. Authorities said they had arrested 13 men in connection with violent plots against elected officials and law enforcement officers.
Early details suggest at least some of the alleged Michigan plotters identified with “boogaloo” ideology, a nascent rightwing movement obsessed with civil war and insurrection, which spread rapidly on Facebook in late 2019 and early 2020. Officials described seven men facing terrorism charges as being part of an anti-government militia group, the Wolverine Watchmen, and said the “commander” of the group was known online as “Boogaloo Bunyan.”
Facebook has updated its policies related to violent extremist groups multiple times this year, including taking down a network of boogaloo groups as a dangerous organization in June, and then restricting militia groups in late August, as part of a crackdown on groups that did not meet the company’s criteria for being dangerous enough to ban, but that “have demonstrated significant risks to public safety”.
As part of a “strategic network disruption” of boogaloo groups on 30 June, Facebook removed a group for the Wolverine Watchmen, the company said.
As companies like Facebook are pressured by activists to take down material from extremist groups, they may also be juggling requests from law enforcement “to leave up material that prosecutors could use to prosecute people”, Perault noted. Social media activity “makes information visible that might not otherwise be visible”, and can be crucial to building criminal cases.
“Tech companies are not going to be able to solve the issue of people doing terrible things,” Perault said. “People will do terrible things using any communication technology they have access to, including more traditional technologies like phones.”
But some analysts said Facebook’s action in addressing both the “boogaloo” groups and militia organizing this summer was starkly overdue.
Armed militia groups in the United States have an extensive, well-documented history of deadly violence going back to the 1990s.
“From 2008 to 2020, Facebook was sort of the chosen social media platform of the militia movement,” said Mark Pitcavage, a senior research fellow at the Anti-Defamation League’s Center on Extremism. “That’s a solid 12 years that the militia movement thrived on Facebook.”
Facebook’s belated action to restrict the militia groups in late August had an effect on the broader movement: they “nuked” it, with many groups and pages taken down, Pitcavage said. “It really made a big crater.”
“It would have been better if they had done it in 2008,” Pitcavage said, but he appreciated that it was better for the company to take action now than “in 2022 or 2024”.
At times, Facebook has chosen not to significantly restrict or ban extremist groups on its platform until after a member of the group has killed someone, even when experts have sounded warnings about the group for months or years before an attack.
This was true of boogaloo groups on Facebook. A February 2020 report by the Network Contagion Research Institute warned about the growth of boogaloo rhetoric on Facebook, specifically that it included violent rhetoric about killing law enforcement that might translate into action. After the report was made public, Facebook told NBC News it was monitoring the groups for threats of violence, but did not take any immediate action to ban boogaloo groups, even though violent insurrection and killing law enforcement were central themes of boogaloo discussions.
The company finally announced a ban on a network of boogaloo groups on 30 June, four months after a clear public warning that a cop-killer ideology was spreading on Facebook, and nearly a month after two officers in California had already been shot to death: the federal security officer David Patrick Underwood, on 29 May in Oakland, and the California sheriff’s deputy Damon Gutzwiller, in a subsequent ambush attack.
Early details from Michigan suggest that one of the groups linked to the plot may have been active on Facebook for eight months before the company finally designated it as part of a dangerous network.
Michigan state police described the Wolverine Watchmen in an affidavit as a militia group that “engaged in firearms training and tactical drills to prepare for the ‘boogaloo’, a term referencing a violent uprising against the government or impending politically-motivated civil war,” and said that they had “recruited members using a social media platform, Facebook, since November 2019”.
Facebook took down the Watchmen group on 30 June 2020. A spokesperson said that if the Wolverine Watchmen group had been identified as a credible threat earlier, it would have been removed sooner.
It was the “acts of real-world violence” by movement adherents in the spring of 2020 that led Facebook to designate a boogaloo network as a dangerous organization and ban it from the platform, a company spokesperson said in June, but it had been monitoring the movement closely since 2019. Facebook had identified elements of the boogaloo movement “as far back as 2012”, a spokesperson said, and had been monitoring debates inside the movement over “whether to instigate violent conflict or be prepared to react when it occurs” for months before it finally announced a ban.
Some activists are now pressing Facebook not only to move faster in banning dangerous groups from its platform, but to fundamentally rethink the way it shares and promotes content, and suggests connections between different users.
Finkelstein, the Network Contagion Research Institute co-founder, said that Facebook did not currently have enough incentive to regulate itself, since extremist content was very engaging. “It’s not in their financial interest” to change, he said. “They’re creating social hazards in ways that we can’t police.”
Kari Paul contributed reporting