Friday, 26 June 2020

Verizon pulls ads from Facebook over inaction on hate speech / Zuckerberg announces new civil-rights protections as Facebook stares down boycott / Down the rabbit hole: how QAnon conspiracies thrive on Facebook

Zuckerberg announces new civil-rights protections as Facebook stares down boycott

The Facebook CEO said Friday the changes have been in the works for months.

By NANCY SCOLA
06/26/2020 02:54 PM EDT
Updated: 06/26/2020 04:03 PM EDT

Facebook CEO Mark Zuckerberg announced Friday steps the company is taking to protect the vote in the run-up to election day, including a more aggressive effort to combat voter suppression based on race and ethnicity — moves that come as the company is facing an advertising boycott led by a coalition of civil rights groups and progressive activists.

The changes, which include banning posts claiming that immigration officials will be checking papers at polling places and ramping up the company's ability to enforce its rules against false claims about voting in the 72 hours before election day, reflect the recommendations of its pre-existing civil-rights audit committee, Zuckerberg said.

He said the changes come in the context of the 2020 elections in the U.S. that were "already shaping up to be heated — and that was before we all faced the additional complexities of voting during a pandemic and protests for racial justice across the country."


The significance of what Zuck said: The changes have been in the works for months, Zuckerberg said, and he previewed some of them in May. But there's no arguing that their announcement comes at a useful time for Facebook. Complaints about its civil-rights record have been building for years, and recently reached a fever pitch around Facebook's decision in late May to leave up a post by President Trump that used the racially charged phrase “when the looting starts, the shooting starts” to discuss Black Lives Matter protesters.

Recently, those complaints coalesced into an attempt by major civil rights organizations like the NAACP and progressive activists like the group Sleeping Giants to convince advertisers to boycott Facebook for the month of July. After a slow start, that effort seemed to gain traction this week. The multinational consumer product giant Unilever announced hours before Zuckerberg's remarks that it was stopping advertising on both Facebook and Twitter through the end of the year. Others who are temporarily pulling ads include wireless carrier Verizon, ice cream maker Ben & Jerry's and clothier Eddie Bauer.

The significance of where he said it: Zuckerberg opened up his regular employee town hall to the public to announce the changes — reflecting his new willingness to throw the company, and himself, into the mix on the country's most controversial debates.

What's next: Boycott organizers have set July 1, or five days from now, as the kick-off point for advertisers to pause their use of Facebook. The question becomes whether Facebook's moves today and in the next week do anything to slow the boycott's momentum.

The president of one of the groups behind the boycott, Color of Change's Rashad Robinson, said quickly after Zuckerberg wrapped that his announcement had changed little. "Zuckerberg's address was 11 minutes of wasted opportunity to commit to change," tweeted Robinson. "I hope companies advertising on Facebook were watching - if they want to put their money where their mouth is on racial justice, then it's time to #StopHateForProfit," he added, using the hashtag for the boycott campaign.

Robinson's tweet is particularly eye-catching because he has been in close contact with Facebook over the years, including a sit-down dinner at Zuckerberg's house in November of 2019. Its sharpness reflects that he and at least some other civil-rights leaders have shifted their strategy when it comes to dealing with Facebook. If they haven't fully given up on dialogue, they're at least pairing it now with treating the company as something akin to a political opponent.

That said, at the moment, Facebook's most immediate worry is advertisers. Facebook's revenue depends almost entirely on advertising, so the reaction to keep an eye on now is the outcome of the quiet conversations happening among CEOs, chief marketing officers and other corporate executives about where they stand on Facebook.



Verizon pulls ads from Facebook over inaction on hate speech

Company is the biggest yet to join growing movement, which includes Ben & Jerry’s and Patagonia, to boycott the social network


Kari Paul
Published on Fri 26 Jun 2020 00.05 BST

Verizon is pulling its advertising from Instagram and Facebook, making it the biggest name so far in a growing movement to boycott the social network for not doing enough to stop hate speech on its platforms.

The company said on Thursday it will join other companies including Ben & Jerry’s, Patagonia and REI in suspending advertising from Facebook-owned platforms until the company “can create an acceptable solution that makes us comfortable”.

“We have strict content policies in place and have zero tolerance when they are breached, we take action,” Verizon’s chief media officer, John Nitti, said in a statement. “We’re pausing our advertising until Facebook can create an acceptable solution that makes us comfortable and is consistent with what we’ve done with YouTube and other partners.”

Facebook acknowledged the growing pressure on a call with advertisers on Wednesday, where a Facebook executive admitted there is a “trust deficit” with its clients on the platform.

The “Stop Hate for Profit” campaign was launched Wednesday by advocacy groups including the Anti-Defamation League, the NAACP and Color Of Change. It asks advertisers to pressure the tech giant to adopt stricter policies against racist and hateful content on its platforms by pausing all spending on advertising with the company for the month of July.

As part of the campaign, the groups alerted Verizon that one of its advertisements on Facebook had appeared next to a video from the conspiracy group QAnon drawing on hateful and antisemitic rhetoric.

Facebook makes $70bn in annual advertising revenue while “amplifying the messages of white supremacists” and “permitting incitement to violence”, according to the campaign.

The advocacy groups argue Facebook has failed to address misinformation and hate speech by making Breitbart News a “trusted news source” despite its history of working with white nationalists and neo-Nazis, allegedly allowing housing discrimination against communities of color, and failing to remove Holocaust denial posts.

The pressure on Facebook to moderate hate speech has accelerated in recent weeks as the platform refused to flag false and incendiary statements from Donald Trump, despite moves from rival platform Twitter to do so.

A recent study by the Anti-Defamation League found that the vast majority (77%) of online harassment experienced by respondents took place on Facebook.

“There is more progress to be made but we continue to make significant investments in technology and processes to help us remove hate, harassment and bullying from Facebook,” a company spokesperson said in response to the study.

In response to the advertiser boycotts, Carolyn Everson, the vice-president of Facebook’s global business group, said in a statement: “We deeply respect any brand’s decision, and remain focused on the important work of removing hate speech and providing critical voting information.”




Down the rabbit hole: how QAnon conspiracies thrive on Facebook

Guardian investigation finds more than 3m aggregate followers and members support QAnon on Facebook, and their numbers are growing

Julia Carrie Wong
@juliacarriew
Thu 25 Jun 2020 11.00 BST. Last modified on Thu 25 Jun 2020 11.02 BST


In early May, QAnon braced for a purge. Facebook had removed a small subset – five pages, six groups and 20 profiles – of the community on the social network, and as word of the bans spread, followers of Q began preparing for a broader sweep.

Some groups changed their names, substituting “17” for “Q” (the 17th letter of the alphabet); others shared links to back-up accounts on alternative social media platforms with looser rules.

More than just another internet conspiracy theory, QAnon is a movement of people who interpret as a kind of gospel the online messages of an anonymous figure – “Q” – who claims knowledge of a secret cabal of powerful pedophiles and sex traffickers. Within the constructed reality of QAnon, Donald Trump is secretly waging a patriotic crusade against these “deep state” child abusers, and a “Great Awakening” that will reveal the truth is on the horizon.

QAnon evolved out of the baseless Pizzagate conspiracy theory, which posited that Hillary Clinton was running a child sex ring out of a Washington DC pizza restaurant, and has come to incorporate numerous strands of rightwing conspiracy mongering. Dedicated followers interpret Q’s cryptic messages in a kind of digital scavenger hunt. Despite the fact that Q’s prognostications have reliably failed to come true, followers rationalize the inaccuracies as part of a larger plan.

Q’s initial commentary on the Facebook bans was concise: “Information Warfare,” Q posted on the website 8kun. Two days later, in a post that included a collage of dozens of news headlines about the takedowns, Q went further, speculating that there had been a “coordinated media roll-out designed to instill ‘fear’” in believers and dissuade them from discussing QAnon on social media. “When do you expend ammunition?” Q wrote. “For what purpose?”

The anticipated purge never came. Instead, QAnon groups on Facebook have continued to grow at a considerable pace in the weeks following the takedown, with several adding more than 10,000 members over 30 days.

A Guardian investigation has documented:

- More than 100 Facebook pages, profiles, groups and Instagram accounts dedicated to QAnon, each with at least 1,000 followers or members.
- The largest of these have more than 150,000 followers or members.
- In total, the documented pages, groups and accounts count more than 3m aggregate followers and members, though there is likely significant overlap among these groups and accounts.

These groups and pages play a critical role in disseminating Q’s messages to a broader audience and in recruiting more believers to the cult-like belief system, researchers say.

“Facebook is a unique platform for recruitment and amplification,” said Brian Friedberg, a senior researcher at the Harvard Shorenstein Center’s Technology and Social Change Project who has been studying QAnon for years. “I really do not think that QAnon as we know it today would have been able to happen without the affordances of Facebook.”

Moreover, Facebook is not merely providing a platform to QAnon groups. Its powerful algorithms are actively recommending them to users who may not otherwise have been exposed to them.

The Guardian did not initially go looking for QAnon content on Facebook. Instead, Facebook’s algorithms recommended a QAnon group to a Guardian reporter’s account after it had joined pro-Trump, anti-vaccine and anti-lockdown Facebook groups. The list of more than 100 QAnon groups and accounts was then generated by following Facebook’s recommendation algorithms and using simple keyword searches. The Instagram accounts were discovered by searching for “QAnon” in the app’s discovery page and then following Instagram’s algorithmic recommendations.

Receiving QAnon recommendations from Facebook does not appear to be that uncommon. “Once I started liking those pages and joining those groups, Facebook just started recommending more and more and more and more, to the point where I was afraid to like them all in case Facebook would flag me as a bot,” said Friedberg. Erin Gallagher, a researcher who studies social media extremism, said she was also encouraged to join a QAnon group by Facebook, soon after joining an anti-lockdown group.

Facebook’s own internal research in 2016 found that “64% of all extremist group joins are due to our recommendation tools”, the Wall Street Journal reported, primarily through the same “Groups you should join” and “Discover” algorithms that promoted QAnon content to the Guardian. “Our recommendation systems grow the problem,” the internal research said.


Facebook did not directly respond to questions from the Guardian about its policy considerations around QAnon content. “Last month, we took down accounts, Groups, and Pages tied to this conspiracy theorist movement for violating our policies,” a company spokesperson said in a statement. “We also remove Groups and Pages that violate other policies from recommendations and demote in search results. We’re closely monitoring this activity and how our policies apply.”

The company also claimed that “all of the Pages” and “the vast majority of Groups” documented by the Guardian had been removed from recommendation algorithms prior to the Guardian’s query. The company did not provide evidence for this claim, which is contradicted by screenshots of pages and groups appearing in recommendations that were taken in May. The Guardian also continued to receive recommendations to join additional QAnon groups after its initial query to Facebook.

Asked about this discrepancy, Facebook said that the pages and groups in question had been marked as “non-recommendable” as of 8 April 2020 for violations of policies against clickbait, viral misinformation and hate speech, but that a page or group can be restored to eligibility for recommendations if its behavior improves for several months.

Over the course of reporting this article – about one month – the aggregate membership of the documented groups and pages grew from 2.75m to more than 3m, or approximately 8.5%. Groups and pages that the Guardian had documented to have been promoted through Facebook’s recommendation algorithms grew 19.9%. One page that appeared in recommendations – “We are ‘Q’” – saw its following grow nearly 60%, from about 24,000 to about 38,000 over the month – despite the page not having posted any new content since February.

To Friedberg, the window for Facebook to act on QAnon may have already passed. “I’m starting to wonder if we’re just waiting for the next shoe to drop – another act of violence,” he said. “That seems to be what the platforms wait for, and that in and of itself is terrifying.”

A ban that stuck
While QAnon thrives on Facebook, another social media site took timely and decisive action against it. Nearly two years ago, Reddit, the link-sharing network of interest-based message boards, carried out a site-wide purge of QAnon – and made it stick.

Reddit had been central to the development of the QAnon movement, which began in October 2017 with the emergence of “Q” on 4chan, the anarchic image board that has served as a launching pad for memes and internet culture but also racist extremism and harassment campaigns. Q, whose cryptic messages and predictions claimed to be based on a high-level government security clearance, quickly decamped from 4chan to the even more extreme 8chan, where believers could read Q’s latest “crumbs” directly from the source.

Q went briefly silent in 2019 when 8chan was forced offline in the wake of the El Paso massacre, but re-emerged on the new site founded by 8chan’s owners, 8kun.

Anonymous internet posters claiming to be high-level government officials are not entirely uncommon; in recent years, other so-called “anons” have emerged with claims that they were revealing secrets from inside the FBI or CIA. But Q is the first such figure to have achieved such a broad audience and real-world political influence. This is largely due to the activism of three dedicated conspiracy theorists who latched on to Q’s posts in the early days, according to an investigation by NBC News. These activists worked to develop a mythology and culture around QAnon and cultivated an audience for it on mainstream social media platforms.

Reddit was significantly easier to use for the kind of crowd-sourced research and interpretation that forms the core of participation in QAnon, and the site was host to a large pool of potential recruits, such as the 1.2m members of the subreddit r/conspiracy. It had also long enjoyed and at times even earned a reputation as one of the danker cesspools of the social web, for years tolerating communities known as “subreddits” dedicated to sharing non-consensual sexualized images of women or advocating rape.

But the violent anger of adherents to QAnon crossed the line for Reddit in less than a year. On 12 September 2018, citing its ban on content that “incites violence, disseminates personal information, or harasses”, the company banned 18 QAnon subreddits, the largest of which had more than 70,000 members.

Social media bans are often difficult to maintain, but Reddit’s move was uncommonly effective. Today, QAnon remains unwelcome on Reddit, with the few subreddits that address it dedicated to either debunking the theory or providing support to people who have lost friends and family members to QAnon.

‘Taking the red pill’
QAnon did not disappear after Reddit pulled the plug, however. Instead, its believers moved on to other platforms, including YouTube, Twitter, Discord and – crucially – Facebook. At the time of the Reddit ban, one of the largest closed Facebook groups dedicated to QAnon, “Qanon Follow the White Rabbit”, had 51,000 members, according to NBC News. Today that group has grown to more than 90,000 members.

And while YouTube and Twitter have played an important role in providing a broadcast platform for QAnon content, the specific structures provided by Facebook are uniquely suited to the participatory “work” of engaging with QAnon. Facebook also provides QAnon with an even larger pool of potential recruits than Reddit could, especially for the somewhat older, Evangelical crowd that has proven susceptible to QAnon’s messaging.

Will Partin, a research analyst with Data & Society, and Alice Marwick, a professor of communication at the University of North Carolina, describe QAnon as a “dark participatory culture”, which is to say that it is a community that takes advantage of the infrastructure of social networking sites to bring disparate people together and foster discussion, collaboration, research and community, but directs those energies toward anti-democratic, regressive and even violent ends.

“Everything about our research suggests that these people are not irrational; they’re hyper-literate, even if they’ve come to beliefs that are empirically inaccurate,” Partin said. “That’s partly because they have a fundamentally different epistemology to judge what is true and false.”

A man in the crowd holds a QAnon sign as crowds gather to attend Donald Trump’s campaign rally in Las Vegas, Nevada, 21 February 2020. Photograph: Patrick T Fallon/Reuters

The digital architecture of Facebook groups is also particularly well-suited to QAnon’s collaborative construction of an alternative body of knowledge, Friedberg said. The platform has created a ready-made digital pathway from public pages to public groups to private groups and finally secret groups that mirrors the process of “falling down the rabbit hole or taking the red pill”.

“You can mechanically take those steps,” he said. “Very few of the contemporary Q-following base actually need to engage with 8chan at all.”

To ban or not to ban
While Facebook has policies banning hate speech, incitement to violence and other types of content that it considers undesirable on a family- and advertiser-friendly platform, QAnon does not fit neatly into any single category.

Much of what is shared in QAnon groups on Facebook is a mix of pro-Trump political speech and pro-Trump political misinformation. Memes, videos and posts are often bigoted and disconnected from reality, but not all that different from the content that is shared in non-QAnon, pro-Trump Facebook groups.

The pages and groups that were removed in early May violated the company’s ban on “coordinated inauthentic behavior” – ie the kind of digital astroturf tactics that Russian operatives used to support Trump’s presidential campaign in 2016. Those rules are aimed at operations in which actors make false representations about their identities in order to mislead people – a description that could encompass Q – but Facebook only applies its policy to deceptive behavior that occurs on its platform, not on 8kun.


To enact a blanket ban akin to Reddit’s under its current rubric of rules, Facebook would likely have to designate QAnon as a “dangerous organization” – the category it uses to ban both terrorist and hate groups and any content published in support or praise of them. QAnon is hardly an organization, though as a movement it has certainly caused harm and could be considered dangerous.

There are innate societal and individual harms to convincing people of a version of reality that is simply false, as QAnon does, Partin said. “When a common sense of what is real and what is correct breaks apart, it becomes nearly impossible to reach a democratic consensus.”

And QAnon followers’ enthusiasm for misinformation is not confined to politics; as the coronavirus pandemic took hold, the groups became a hotbed for medical misinformation – something Facebook has claimed to be working hard to combat. Analyses by Gallagher, the social media researcher, and the New York Times demonstrated how QAnon groups fueled the viral spread of “Plandemic”, a 26-minute video chock full of dangerously false information about Covid-19 and vaccines.

Facebook’s algorithms appear to have detected this synergy between the QAnon and anti-vaccine communities. Several QAnon groups are flagged with an automated warning label from Facebook that reads, “This group discusses vaccines” and encourages users to go to the website of the Centers for Disease Control for reliable information on health.

It appears that anti-vaccine propagandists are also taking notice, and attempting to capitalize. Larry Cook, the administrator of Stop Mandatory Vaccination, one of the largest anti-vaxx Facebook groups, has begun incorporating QAnon rhetoric into the medical misinformation he peddles, as well as making explicit invitations to QAnon believers to join his group.

Cook has begun referencing the “deep state” and stoking fear of forced vaccination and “FEMA camps”.

“I have discussed the concept many, many, many times that vaccines destroy our connection to God and that we are in a spiritual war with Principalities of Darkness that have a death wish for our children, and humanity at large,” he wrote in one QAnon-inflected post. (Cook also uses the site to aggressively promote his various products and a subscription-only platform for “medical freedom patriots”.)

But the potential for damage from QAnon goes well beyond medical misinformation. For those individuals who truly believe in the QAnon narrative, the crimes of the “cabal” are so grievous as to make fighting them a moral imperative. “They’re talking about a group of people who are operating our government against our wishes and they’re molesting and torturing children and destroying our society,” said Joseph Uscinski, a professor of political science who studies conspiracy theories. “It’s an incitement to violence.”

Indeed, there have been numerous incidents of real-world violence linked to QAnon, and in May 2019, the FBI identified QAnon as a potential domestic terrorism threat in an intelligence bulletin. While anti-government conspiracy theories were not new, the bulletin stated, social media was allowing them to reach a larger audience, and the online narratives were determining the targets of harassment and violence for the small subset of individuals who crossed over into real-world action.

Despite this, Uscinski is skeptical of the idea that kicking QAnon off Facebook would help anyone. He regularly polls belief in conspiracy theories and consistently finds that QAnon is “one of the least believed things” out there, well below belief in theories about Jeffrey Epstein’s death, anti-vaccine hoaxes, and Holocaust denialism. Uscinski also cautions against overly exoticizing the QAnon narrative, noting that “most of the component parts of QAnon have been around forever”, with parallels in the Satanic Panic of the 1980s or the plot of Oliver Stone’s JFK. And he’s concerned about the free speech implications of censorship by tech platforms.

“It’s a potentially dangerous belief; it’s very disconnected from reality; I don’t really think we want more people getting into it,” he said of QAnon. “Do the internet companies bear some responsibility? Yes. Would it be better if they took it down? Probably. Does that take care of it? No.”

Partin said that he generally favored Facebook taking a “more aggressive approach to moderation”, including addressing the recommendation algorithms and trying to reduce the spread of misinformation out of dedicated conspiracy communities and into the mainstream.

“If Facebook flipped a switch and every Q post disappeared tomorrow, that probably would be harmful for QAnon,” he said. “But there is resiliency built in. Getting deplatformed is harmful, but the idea that it would somehow make this disappear is fanciful.”

Friedberg worried that it may already be too late. “Facebook should have taken action on this a long, long time ago, and the longer that they wait, the more deeply entrenched in mainstream politics this becomes,” he said. Facebook has been reluctant to appear in any way biased against Republicans, and if (or when) QAnon reaches Congress, it will be even more politically difficult for Facebook to take a stand.

In May, Republican voters in Oregon nominated a QAnon believer to run for the US Senate in November. Another QAnon supporter, Marjorie Taylor Greene, is likely to be elected to the House of Representatives after she came first in a Republican primary in a conservative Georgia district on 11 June.

“In some ways, the second that Trump officially acknowledges QAnon is the second it becomes a partisan political issue that Facebook may not be able to take action against,” said Friedberg. “We’re watching a normalization process of these conspiracies, and I think the beast that is Facebook was really the answer to this all along.”

Indeed, Trump himself has repeatedly retweeted QAnon accounts on Twitter, which believers take as confirmation of their alternate reality. And on 20 June, just before Trump’s campaign rally in Tulsa, Oklahoma, Trump’s adult son Eric posted a QAnon meme on his Instagram account. Eric Trump deleted the image relatively quickly, but not before screenshots spread across the Facebook Q-sphere.

“So Eric Trump posted a pic with a ‘Q’ in the imagery,” an administrator of one of the larger QAnon groups wrote. “The pic has been taken down but the message was received!”
