'It let white supremacists organize': the toxic
legacy of Facebook's Groups
By the time Facebook banned QAnon content in 2020, a
Guardian report had exposed that groups dedicated to the dangerous conspiracy
theory were spreading on the platform at a rapid pace.
Facebook has said it will no longer algorithmically
recommend political groups to users, but experts warn that isn’t enough
Kari Paul
Thu 4 Feb 2021 11.00 GMT
https://www.theguardian.com/technology/2021/feb/04/facebook-groups-misinformation
Mark
Zuckerberg, the Facebook CEO, announced last week the platform will no longer
algorithmically recommend political groups to users in an attempt to “turn down
the temperature” on online divisiveness.
But experts
say such policies are difficult to enforce, much less quantify, and the toxic
legacy of the Groups feature and the algorithmic incentives promoting it will
be difficult to erase.
“This is
like putting a Band-Aid on a gaping wound,” said Jessica J González, the
co-founder of anti-hate speech group Change the Terms. “It doesn’t do enough to
combat the long history of abuse that’s been allowed to fester on Facebook.”
Groups – a
place to create ‘meaningful social infrastructure’
Facebook
launched Groups, a feature that allows people with shared interests to
communicate on closed forums, in 2010, but began to make a more concerted
effort to promote the feature around 2017 after the Cambridge Analytica scandal
cast a shadow on the platform’s Newsfeed.
In a long
blogpost in February 2017 called Building Global Community, Zuckerberg argued
there was “a real opportunity” through groups to create “meaningful social
infrastructure in our lives”.
He added:
“More than one billion people are active members of Facebook groups, but most
don’t seek out groups on their own – friends send invites or Facebook suggests
them. If we can improve our suggestions and help connect one billion people
with meaningful communities, that can strengthen our social fabric.”
After
growing its group suggestions and advertising the feature extensively –
including during a 60-second spot in the 2020 Super Bowl – Facebook did see a
rise in use. In February 2017 there were 100 million people on the platform who
were in groups they considered “meaningful”. Today, that number is up to more
than 600 million.
That fast
rise, however, came with little oversight and proved messy. In shifting its
focus to Groups, Facebook began to rely more heavily on unpaid moderators to police
hate speech on the platform. Groups proved a more private place to speak, a venue for conspiracy theories to proliferate, and a space for some users to organize real-life violence – all with little oversight from outside experts or moderators.
Facebook in
2020 introduced a number of new rules to “keep Facebook groups safe”, including
new consequences for individuals who violate rules and increased responsibility
given to admins of groups to keep users in line. The company says it has hired
35,000 people to address safety on Facebook, including engineers, moderators
and subject matter experts, and invested in AI technology to spot posts that
violate its guidelines.
“We apply
the same rules to Groups that we apply to every other form of content across
the platform,” a Facebook company spokesperson said. “When we find Groups
breaking our rules we take action – from reducing their reach to removing them
from recommendations, to taking them down entirely. Over the years we have
invested in new tools and AI to find and remove harmful content and developed
new policies to combat threats and abuse.”
Researchers
have long complained that little is shared publicly regarding how, exactly,
Facebook algorithms work, what is being shared privately on the platform, and
what information Facebook collects on users. The increased popularity of Groups
made it even more difficult to keep track of activity on the platform.
“It is a
black box,” said González regarding Facebook policy on Groups. “This is why
many of us have been calling for years for greater transparency about their
content moderation and enforcement standards.”
Meanwhile,
the platform’s algorithmic recommendations sucked users further down the rabbit
hole. Little is known about exactly how Facebook algorithms work, but it is
clear the platform recommends users join similar groups to ones they are
already in based on keywords and shared interests. According to an internal report from 2016, Facebook’s own researchers found that “64% of all extremist group joins are due to our recommendation tools”.
“Facebook
has let white supremacists organize and conspiracy theorists organize all over
its platform and has failed to contain that problem,” González said. “In fact
it has significantly contributed to the spread of that problem through its
recommendation system.”
‘We need to
do something to stop these conversations’
Facebook’s
own research showed that algorithmic recommendations of groups may have
contributed to the rise of violence and extremism.
On Sunday,
the Wall Street Journal reported that internal documents showed executives were
aware of risks posed by groups and were warned repeatedly by researchers to
address them. In one presentation in August 2020, researchers said roughly “70%
of the top 100 most active US Civic Groups are considered non-recommendable for
issues such as hate, misinfo, bullying and harassment”.
“We need to
do something to stop these conversations from happening and growing as quickly
as they do,” the researchers wrote, according to the Wall Street Journal, and
suggested taking measures to slow the growth of Groups until more could be done
to address the issues.
Several
months later, Facebook halted algorithmic recommendations for political groups
ahead of the US elections – a move that has been extended indefinitely with the
policy announced last week. The change seemed to be motivated by the 6 January
insurrection, which the FBI found had been tied to organizing on Facebook.
In response
to the story in the Wall Street Journal, Facebook’s vice-president of
integrity, who oversees content moderation policies on the platform, said the
problems were indicative of emerging threats rather than an inability to address
long-term problems. “If you’d have looked at Groups several years ago, you
might not have seen the same set of behaviors,” he said.
But
researchers say the use of Groups to organize and radicalize users is an old
problem. Facebook groups had been tied to a number of harmful incidents and
movements long before January’s violence.
“Political
groups on Facebook have always advantaged the fringe, and the outsiders,” said
Joan Donovan, a lead researcher at Data and Society who studies the rise of
hate speech on Facebook. “It’s really about reinforcement – the algorithm
learns what you’ve clicked on and what you like and it tries to reinforce those
behaviors. The groups become centers of coordination.”
Facebook
was criticized for its inability to police terror groups such as the Islamic
State and al-Qaida using it as early as 2016. It was used extensively in
organizing the Unite the Right rally in Charlottesville in 2017, where white
nationalists and neo-Nazis violently marched. Militarized groups including
Proud Boys, Boogaloo Bois and militia groups all organized, promoted and grew their
ranks on Facebook. In 2020 officials arrested men who had planned a violent
kidnapping of the Michigan governor, Gretchen Whitmer, on Facebook. A
17-year-old in Illinois shot three people, killing two, at a protest organized
on Facebook.
These same
algorithms have allowed the anti-vaccine movement to thrive on Facebook, with
hundreds of groups amassing hundreds of thousands of members over the years. A
Guardian report in 2019 found the majority of search results for the term
“vaccination” were anti-vaccine, led by two misinformation groups, “Stop
Mandatory Vaccination” and “Vaccination Re-education Discussion Forum” with
more than 140,000 members each. These groups were ultimately tied to harassment
campaigns against doctors who support vaccines.
In
September 2020, Facebook stopped health groups from being algorithmically
recommended in an effort to curb such misinformation. It has also added other rules to stop the spread of misinformation, including banning users from creating a new group if an existing group they administered has been banned.
The origin
of the QAnon movement has been traced to a post on a message board in 2017. By
the time Facebook banned content related to the movement in 2020, a Guardian
report had exposed that Facebook groups dedicated to the dangerous conspiracy
theory QAnon were spreading on the platform at a rapid pace, with thousands of
groups and millions of members.
‘The calm
before the storm’
Zuckerberg said in 2020 that the company had removed more than 1m groups in the past year, but experts say that action, coupled with the new policy on group recommendations, falls short.
The
platform promised to stop recommending political groups to users ahead of the
elections in November and then victoriously claimed to have halved political
group recommendations. But a report from the Markup, whose Citizen Browser project tracks links and group recommendations served to a nationwide panel of Facebook users, showed that 12 of the top 100 groups recommended to users were political in nature.
Indeed, the
Stop the Steal groups that emerged to cast doubt on the results of the election
and ultimately led to the 6 January violent insurrection amassed hundreds of
thousands of followers – all while Facebook’s algorithmic recommendations of
political groups were paused. Many researchers also worry that legitimate
organizing groups will be swept up in Facebook’s actions against partisan
political groups and extremism.
“I don’t
have a whole lot of confidence that they’re going to be able to actually sort
out what a political group is or isn’t,” said Heidi Beirich of the Southern Poverty Law Center, who sits on Facebook’s Real Oversight Board, a group of academics and watchdogs criticizing Facebook’s content moderation policies.
“They have
allowed QAnon, militias, and other groups to proliferate so long, remnants of
these movements remain all over the platform,” she added. “I don’t think this
is something they are going to be able to sort out overnight.”
“It doesn’t
actually take a mass movement, or a massive sea of bodies, to do the kind of
work on the internet that allows for small groups to have an outsized impact on
the public conversation,” added Donovan. “This is the calm before the
storm.”
