Zuckerberg blames contractors for failing to remove Kenosha militia's 'call to arms'
Facebook CEO points to ‘operational mistake’ by teams meant to bar organizations deemed dangerous
Julia Carrie Wong in San Francisco
@juliacarriew
Sat 29 Aug 2020 02.20 BST. First published on Fri 28 Aug 2020 23.52 BST
https://www.theguardian.com/technology/2020/aug/28/facebook-militia-posts-kenosha-protests
Mark Zuckerberg blamed an “operational mistake” by contractors for Facebook’s failure to remove the “call to arms” of a Kenosha, Wisconsin, militia prior to the shooting Tuesday night that left two people dead and another injured.
The Kenosha Guard militia had established a Facebook page in June 2020 and this week used a Facebook event page to invite “any patriots willing to take up arms and defend out [sic] City tonight from the evil thugs”, referencing those protesting the police shooting of Jacob Blake. Facebook has admitted that both the page and the event should have been banned under the company’s new policy addressing groups linked to violence, such as militias. The company nevertheless failed to remove the page or event, even though multiple users reported the content to Facebook, the Verge reported.
“It was largely an operational mistake,” Zuckerberg, Facebook’s CEO, said in remarks during a weekly meeting with staff. Facebook has a specially trained team dedicated to enforcing its ban on “dangerous organizations”, Zuckerberg said. “The contractors and the reviewers who the initial complaints were funneled to … basically didn’t pick this up.” Once reports were sent to the specialized team – after the fatal shooting – both the page and the event were removed.
Zuckerberg said Facebook had found no evidence to suggest the alleged shooter responded to the Facebook “call to arms”, which was also amplified by the conspiracy site InfoWars.
Facebook published a portion of Zuckerberg’s remarks following a report by BuzzFeed News on the anger and pushback among Facebook employees over the company’s handling of the violence in Kenosha.
“At what point do we take responsibility for enabling hate filled bile to spread across our services? [A]nti semitism, conspiracy, and white supremacy reeks across our services,” one employee wrote to Zuckerberg, according to the report.
“We need to get better at avoiding mistakes and being more proactive,” another wrote. “Feels like we’re caught in a cycle of responding to damage after it’s already been done rather than constructing mechanisms to nip these issues before they result in real harm.”
In the publicized remarks, Zuckerberg continued: “What we’re trying to do here now is we have our teams proactively out there looking for content and removing content that praises the shooting, or the shooter … We are going to continue to enforce our policies and continue evolving the policies, to be able to identify more potential dangerous organizations and improve our execution in order to keep on getting ahead of this.”
On Thursday, the Guardian reported on Facebook and Instagram’s failure to keep content praising the alleged shooter off their platforms. Two fundraisers for the alleged shooter were shared more than 19,000 times on Facebook, and Instagram memes praising the alleged shooter attracted tens of thousands of likes. Until Friday morning, Instagram was also auto-completing search results with hashtags that praised the alleged shooter or called him a hero.
The initial two fundraisers were removed by the online fundraising platforms that hosted them, but a third was launched Thursday on the Christian site GiveSendGo and has raised more than $125,000. The new fundraiser has been shared more than 3,500 times on Facebook since Thursday, and many of those posts remain live, despite Facebook’s ban.
Facebook’s rules ban “content that praises, supports, or represents events that Facebook designates as terrorist attacks, hate events, mass murders or attempted mass murders, serial murders, hate crimes and violating events”. Extremist groups often adulate mass killers, and several recent mass shooters have indicated that they took inspiration from figures such as the Norwegian far-right terrorist Anders Breivik.
A Facebook spokesperson said on Friday that the company had removed specific posts praising the shooter that the Guardian had shared with it. The company also banned the page of Joshua Feuerstein, an evangelical Christian social media influencer with 2.6m Facebook fans, for repeatedly violating its rules. Feuerstein had shared a meme in support of the alleged shooter with the caption, “SHARE THIS BEFORE THEY TAKE IT DOWN AGAIN!”, garnering more than 3,700 shares in about two hours on Thursday. Feuerstein also has an Instagram account, where his “story” currently includes a meme mocking the Kenosha victim who was shot in the arm.
The spokesperson said the company is blocking certain hashtags and searches and removing fundraisers in order to enforce its ban on content that praises the shooter or the shooting. The company is also “hashing” photos and videos that violate this ban, a process that derives a unique digital fingerprint from each image, allowing systems to automatically block users from re-uploading an image that has already been deemed to violate a rule.
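To make the mechanism concrete, here is a minimal Python sketch of hash-based re-upload blocking. It assumes simple exact-match SHA-256 fingerprints; Facebook's production systems use perceptual hashes, which also catch cropped or re-encoded copies, and all function names below are hypothetical.

```python
import hashlib

# Minimal sketch of hash-based re-upload blocking. Assumes exact-match
# SHA-256 fingerprints; real systems use perceptual hashes that also
# match slightly altered copies. All names here are hypothetical.

banned_hashes: set[str] = set()  # fingerprints of images ruled violating

def fingerprint(image_bytes: bytes) -> str:
    """Derive a unique digital fingerprint from an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_violation(image_bytes: bytes) -> None:
    """Record a violating image so future uploads can be blocked."""
    banned_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Return False if the upload matches a known violating image."""
    return fingerprint(image_bytes) not in banned_hashes
```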
“This is a highly adversarial space and we know that people will continue trying to skirt our detection – so our teams are working around the clock to stay ahead of them and help us keep content related to the attack off of the platform,” the spokesperson said.
A Kenosha Militia Facebook Event Asking Attendees To Bring Weapons Was Reported 455 Times. Moderators Said It Didn’t Violate Any Rules.
CEO Mark Zuckerberg said that the militia page and an associated event remained online after a shooting that killed two people because of “an operational mistake.”
Ryan Mac, BuzzFeed News Reporter
Posted on August 28, 2020, at 6:45 p.m. ET
https://www.buzzfeednews.com/article/ryanmac/kenosha-militia-facebook-reported-455-times-moderators
In a companywide meeting on Thursday, Facebook CEO Mark Zuckerberg said that a militia page advocating for followers to bring weapons to an upcoming protest in Kenosha, Wisconsin, remained on the platform because of “an operational mistake.” The page and an associated event inspired widespread criticism of the company after a 17-year-old suspect allegedly shot and killed two protesters Tuesday night.
The event associated with the Kenosha Guard page, however, was flagged to Facebook at least 455 times after its creation, according to an internal report viewed by BuzzFeed News, and had been cleared by four moderators, all of whom deemed it “non-violating.” The page and event were eventually removed from the platform on Wednesday — several hours after the shooting.
"To
put that number into perspective, it made up 66% of all event reports that
day."
“To put that number into perspective, it made up 66% of all event reports that day,” one Facebook worker wrote in the internal “Violence and Incitement Working Group” to illustrate the number of complaints the company had received about the event.
BuzzFeed News could not verify the content on the militia page or its associated event because they had been removed from the platform. A previous story from the Verge noted that the page had issued a “call to arms” and hosted a number of commenters advocating for violence in Kenosha following the police shooting of 29-year-old Black man Jacob Blake.
A Facebook spokesperson declined to comment.
The internal report seen by BuzzFeed News reveals the lengths to which concerned Facebook users went to warn the company of a group calling for public violence, and how the company failed to act. “The event is highly unusual in retrospect,” reads the report, which notes that the next-highest event for the day had been flagged 18 times by users, compared with the Kenosha Guard event’s 455.
After militia members gathered in Kenosha on Tuesday night, a 17-year-old with a rifle allegedly killed two protesters. Facebook has maintained that the suspect, whose Facebook and Instagram profiles were taken down after the incident, had no direct connection with the Kenosha Guard page or event.
During Facebook’s Thursday all-hands meeting, Zuckerberg said that the images from Wisconsin were “painful and really discouraging,” before acknowledging that the company had made a mistake in not taking the Kenosha Guard page and event down sooner. The page had violated Facebook’s new rules introduced last week that labeled militia and QAnon groups as “Dangerous Individuals and Organizations” for their celebrations of violence.
The company did not catch the page despite user reports, Zuckerberg said, because the complaints had been sent to content moderation contractors who were not versed in “how certain militias” operate. “On second review, doing it more sensitively, the team that was responsible for dangerous organizations recognized that this violated the policies and we took it down.”
During the talk, Facebook employees hammered Zuckerberg for continuing to allow the spread of hatred on the platform.
One Facebook user who tried to report the Kenosha Guard Facebook page received this response, which said the page did not “go against one of our specific Community Standards.” (Image provided to BuzzFeed News)
“At what point do we take responsibility for enabling hate filled bile to spread across our services?” wrote one employee. “[A]nti semitism, conspiracy, and white supremacy reeks across our services.”
The internal report seen by BuzzFeed News sheds more light on Facebook’s failure.
“Organizers… advocated for attendees to bring weapons to an event in the event description,” the internal report reads. “There are multiple news articles about our delay in taking down the event.”
One Facebook user who flagged the Kenosha Guard page “for a credible threat of violence” was told “it doesn’t go against one of our specific Community Standards,” according to a screenshot they sent to BuzzFeed News.
In addition to the four manual reviews that determined the Kenosha Guard page to be non-violating, the Facebook report also noted that a number of reviews “handled by automation” had reached the same conclusion. As part of a proposed change, the Facebook employee writing the report said that the company should monitor spikes in feedback reports for events and “trigger investigation immediately given this has proved to be a good signal for imminent harm.”
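As a rough illustration of that proposed change, the Python sketch below flags any event whose report volume spikes far above the day's next-highest event. The 10x threshold and all names are assumptions for illustration, not Facebook's actual tooling.

```python
# Hypothetical sketch of the report's proposed change: treat a sharp spike
# in user reports on a single event as a signal for immediate investigation.
# The 10x threshold and all names here are assumptions for illustration.

def should_investigate(report_counts: dict[str, int],
                       spike_ratio: float = 10.0) -> list[str]:
    """Return event IDs whose report count dwarfs every other event's."""
    flagged = []
    for event_id, count in report_counts.items():
        # Baseline: the highest report count among all *other* events.
        baseline = max((c for e, c in report_counts.items() if e != event_id),
                       default=0)
        if baseline and count >= spike_ratio * baseline:
            flagged.append(event_id)
    return flagged

# The Kenosha Guard event drew 455 reports; the day's next-highest event
# drew 18 -- roughly 25x, well past the threshold above.
print(should_investigate({"kenosha_guard_event": 455,
                          "next_highest_event": 18}))
# -> ['kenosha_guard_event']
```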
The report seems to acknowledge that Facebook was late to act.
“This post provides more details around what happened and changes we are making to detect and investigate similar events sooner,” the worker wrote. “This is a sobering reminder of the importance of the work we do, especially during this charged period.”