OPINION | GUEST ESSAY
I Worked at Facebook. It’s Not Ready for This Year’s Election Wave.
Jan. 29, 2022
By Katie Harbath
https://www.nytimes.com/2022/01/29/opinion/facebook-2022-election.html
Ms. Harbath is the chief executive of Anchor Change, a company focused on issues at the intersection of tech and democracy, and director of technology and democracy at the International Republican Institute, a nonpartisan organization committed to advancing democracy and freedom. She is a former public policy director at Facebook.
The world is not ready for the coming electoral tsunami. Neither is Facebook. With so many elections on the horizon — France, Kenya, Australia, Brazil, the Philippines and the United States will hold elections this year — the conversation now should focus on how Facebook is preparing.
I know what it’s like to prepare for an election at Facebook. I worked there for 10 years, and from 2014 through the end of 2019, I led the company’s work on elections globally. The company has poured more than $13 billion into building up its safety and security efforts in the United States since the 2016 elections, when the platform was too slow to recognize how its products could be weaponized to spread misinformation.
Responsible election plans cannot be spun up in days or weeks. It takes time not only to organize internally but also to make meaningful and necessary connections with the communities around the world working to secure elections. Facebook must begin serious, concerted, well-funded efforts today.
For some of the elections happening in the first half of this year, Facebook is cutting it close. But there’s still time for Facebook to commit to a publicly available road map that outlines how it plans to build up its resources to fight misinformation and hate speech around the world. Algorithms that find hate speech and election-related content; labels that give people more context, like those applied in the United States to content that questioned the election results; and efforts to get people accurate information about where, when and how to vote should all be part of the baseline protections Facebook deploys across the globe. On top of these technical protections, it needs people with country-specific language and culture expertise to make tough decisions about speech or behavior that might violate the platform’s rules.
I’m proud of the progress the company made in bringing more transparency to political and issue ads, developing civil society partnerships and taking down influence operations. None of that progress happened spontaneously. To combat the Internet Research Agency, a Russian troll farm that exposed 126 million Americans to its content before and after the 2016 elections, for example, Facebook needed new policies, new expertise and a revamped team dedicated to these issues. Because of those innovations, the company was able to take down 52 influence networks in 2021.
Facebook couldn’t do this work alone. Partnerships with organizations such as the Atlantic Council, the National Democratic Institute, the International Republican Institute and many others were crucial.
But even then, providing the technical infrastructure to combat misinformation is only half the battle. Facebook faced scrutiny again in 2020 and 2021 for how it handled everything from President Donald Trump’s Facebook account to false election fraud claims and Jan. 6. Many of the conversations I had at the time revolved around balancing the right to free speech with the harm that speech could cause someone.
This is one of the central dilemmas companies like Facebook grapple with. What is the right call for company administrators when a sitting president of the United States violates their platform’s community standards, even as they believe that people should be able to hear what he has to say? When are people exercising their right to organize and protest against their government, as opposed to preparing for a violent insurrection?
Similar issues come up in other countries. Last year the Russian government pressured Apple and Google to remove an app created by allies of Aleksei Navalny, an opponent of President Vladimir Putin’s. Refusing the government would have put their employees in Russia at risk. Complying would have gone against free-expression standards. The companies chose to protect their employees.
These are the kinds of difficult questions that crop up in every country, but Facebook also needs country-specific monitoring. Human expertise is the only way to truly understand how heated discussions are shifting in real time and to be sensitive to linguistic and cultural nuances. The Russian word for dill, “ukrop,” for example, has been used as a slur against Ukrainians. Some Ukrainians, however, reclaimed the word and even named a political party after it. A global framework that fails to account for these kinds of situations or that is overly reliant on technology to address them is not prepared to confront the reality of our complex world.
Facebook has invested billions in this kind of work. But a majority of its investment for classifying misinformation, for example, has focused on the United States, even though daily active users in other countries make up the vast majority of the user base. And it’s not clear which efforts Facebook will extend from U.S. elections to those in other countries. It’s unlikely that within the next two years, much less the next few months, Facebook can build up protections in every country. But it must start planning now for how it will exponentially scale up people, products and partnerships to handle so many elections at once in 2022 and 2024.
It should be transparent about how it will determine what to build in each country. In 2019, Facebook had more than 500 full-time employees and 30,000 people working on safety and security overall. Even with that amount of human talent, it could cover the national elections in only three major countries at once. At least that many people were needed for the United States in 2020. In two years, people in the United States, India, Indonesia, Ukraine, Taiwan, Mexico and Britain are to go to the polls in national elections. Facebook will need to consider hiring at least 1,000 more full-time employees to be ready for the next big election cycle. If the company is cutting it close for 2022, it has just enough time to be really ready for 2024.
These problems are not ones that Facebook can fix on its own. Its parent, Meta, is a private company but one with tremendous influence on society and democratic discourse. Facebook needs to continue to recognize the responsibility it has to protect elections around the world and invest accordingly. Governments, civil society and the public should hold it accountable for doing so.