Amid the Capitol riot, Facebook faced its own insurrection

When supporters of Donald Trump stormed the US Capitol on January 6, battling police and forcing lawmakers into hiding, a rebellion of a different kind was brewing inside the world’s largest social media company.

Thousands of miles away, Facebook engineers in California were racing to deploy internal controls to slow the spread of misinformation and inflammatory content. Emergency actions – some of which were rolled back after the 2020 election – included banning Trump, freezing comments in groups with a record of hate speech, filtering out the “Stop the Steal” rallying cry and empowering content moderators to act more assertively by labeling the US a “temporary high-risk location” for political violence.

At the same time, frustration erupted inside Facebook over what some saw as the company’s halting and often reversed response to rising extremism in the US.

“Didn’t we have enough time to understand how to handle discourse without enabling violence?” an employee wrote on an internal message board at the height of the January 6 turmoil. “We’ve been fueling this fire for a long time and we shouldn’t be surprised that it’s now out of control.”

It’s a question that still hangs over the company today, as Congress and regulators investigate Facebook’s part in the January 6 riots.

New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen offer a rare glimpse into how the company stumbled into the January 6 riots. It quickly became apparent that even after years under the microscope for inadequately policing its platform, the social network had missed how riot participants spent weeks vowing – on Facebook itself – to stop Congress from certifying Joe Biden’s election victory.

The documents also bolster Haugen’s claim that Facebook put its growth and profits ahead of public safety, opening the clearest window yet into how Facebook’s conflicting impulses – to protect its business and to protect democracy – clashed in the days and weeks leading up to the attempted January 6 coup.

This story is based in part on disclosures made to the Securities and Exchange Commission and provided in redacted form to Congress by Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations including The Associated Press.

The emergency measures Facebook implemented on January 6 were called “Break the Glass” – essentially a toolkit of options designed to stem the spread of dangerous or violent content that the social network had first used in the run-up to the bitter 2020 election. According to an internal spreadsheet that analyzed the company’s response, 22 of those measures were rolled back at some point after the election.

“As soon as the election was over, they turned them back off or they changed the settings back to prioritize growth over safety,” Haugen said in an interview with “60 Minutes.”

An internal Facebook report after January 6, previously reported by BuzzFeed, faulted the company for taking a “piecemeal” approach to the rapid growth of “Stop the Steal” pages, related misinformation sources, and violent and provocative comments.

Facebook says the situation is more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content, as it did on January 6. The company said it is not responsible for the actions of the rioters and that having stricter controls in place before that day would not have helped.

Spokesperson Dani Lever said Facebook’s decisions to phase out certain safety measures took into account signals from the Facebook platform as well as information from law enforcement. “When those signals changed, so did the measures.”

Lever said some measures persisted well into February and others are still active today.

Some employees were unhappy with Facebook’s handling of problematic content even before the January 6 riots. One employee who left the company in 2020 left a lengthy note charging that promising new tools, backed by strong research, were being constrained by Facebook for “fear of public and policy stakeholder backlash” (translation: concerns about negative reactions from Trump allies and investors).

“Similarly (though even more concerning), I have seen already built and functional security measures withdrawn for the same reasons,” wrote the employee, whose name has been blacked out.

Research by Facebook long before the 2020 campaign left no doubt that its algorithms could pose a serious risk of spreading misinformation and potentially radicalizing users.

A 2019 study, titled “Carol’s Journey to QAnon – A Test User Study of Misinformation and Polarization Risks Encountered Through Recommendation Systems,” described the results of an experiment conducted with a test account set up to reflect the views of a prototypical “strong conservative” – but not extremist – 41-year-old North Carolina woman. This test account, using the fake name Carol Smith, indicated a preference for mainstream news sources such as Fox News, followed humorous groups that mocked liberals, professed Christianity and was a fan of Melania Trump.

Within a single day, the page recommendations Facebook generated for this account had evolved into “quite a disturbing, polarizing situation,” the study found. By Day 2, the algorithm was recommending more extremist content, including a…

Credit: www.independent.co.uk / Facebook
