Testimony from former Facebook employee and whistleblower Frances Haugen before a Senate Commerce Committee subcommittee on Tuesday painted a picture of a company driven by a single-minded desire for growth, and unwilling to implement solutions to known problems for fear of hurting its bottom line.
Under questioning from senators, Ms Haugen revealed that the world’s largest social network has deliberately made choices that prioritise viral content, even at the risk of real-world harm. For their part, senators are now vowing to take action against the company on a range of issues that have preoccupied Congress since Facebook exploded into the public consciousness nearly two decades ago.
Here are some of the key moments and findings from Tuesday’s hearing.
Facebook is frequently compared to ‘Big Tobacco’
Perhaps no industry in modern American history has a worse reputation for lying than the big four cigarette makers – Philip Morris (now known as Altria), RJ Reynolds, Brown & Williamson, and Lorillard – which in 1998 reached a $206 billion settlement with 46 of the 50 US states. The states had sued to recover Medicaid costs for the treatment of smoking-related illnesses and to stop deceptive and unethical marketing practices.
Two years before that settlement, a tobacco executive named Jeffrey Wigand appeared on CBS News’ 60 Minutes to reveal that his employer, Brown & Williamson, intentionally treated its tobacco with chemicals to make cigarettes more addictive.
As he opened the hearing featuring Ms Haugen – who also appeared on 60 Minutes to discuss what she told officials about Facebook’s conduct – Senator Richard Blumenthal made the comparison explicit. In his opening statement, the Connecticut Democrat recalled his role in the tobacco litigation as state attorney general and said Facebook is facing a “big tobacco, jaw-dropping moment of truth”.
For her part, Ms Haugen compared her former employer to a tobacco company, telling senators that Facebook hides data that would let the public understand the harm it does.
“Facebook hides behind walls that prevent researchers and regulators from understanding the true dynamics of their systems. Facebook will tell you that privacy means they can’t give you data. This is not true,” said Ms Haugen.
“When tobacco companies claimed that filtered cigarettes were safer for consumers, scientists could independently invalidate these marketing messages and confirm that in fact they posed a greater threat to human health. The public cannot do the same with Facebook; we are given no other option than to take their marketing messages on blind faith.”
Facebook executives discouraged building tools that could have detected more threats because there weren’t enough staff to deal with them
Ms Haugen told senators that Facebook’s staffing problems stemmed from a cycle of scandal that made talented people less willing to work there.
“Facebook is caught in a cycle where it struggles to hire, which causes it to understaff projects, which leads to scandals, which then makes it harder to hire,” she said.
Ms Haugen later said she saw a “pattern of behavior” in which projects were so understaffed that there was “a sort of implicit discouragement from having a better detection system” for problems.
“My last team on Facebook was on the counter-espionage team within the threat intelligence organization, and at any given time our team could only handle a third of the cases we knew about,” she explained. “And we knew that if we built even a basic detector, we would have many more cases.”
Facebook executives misled the public as to why it disabled features that could have prevented the January 6 attack on the Capitol
In her interview with 60 Minutes, Ms Haugen said Facebook’s explanation that it had switched off civic-integrity safeguards implemented during the 2020 election (only to switch them back on after the January 6 insurrection) out of free-speech concerns presented a “false choice”.
She said Facebook has framed the pre-election safeguards as implicating free speech, but the choices actually being made on the platform were about how responsive and twitchy it was. The settings were changed for the election because the company knew they were dangerous. “Because they wanted that growth back… after the election, they returned to their original defaults,” she said.
The fact that Facebook had to turn the safeguards back on after the Capitol was ransacked, Ms Haugen said, was “deeply problematic”.
Mark Zuckerberg dismissed changes that could have stopped violence
Under questioning from Commerce Committee Chair Maria Cantwell, Ms Haugen told senators that Facebook CEO Mark Zuckerberg was informed of changes that could be made to the platform to reduce the reach of viral content promoting violence in places like Myanmar, but the CEO dismissed them because they would have made the platform “less viral”.
Referring to a Facebook metric called “meaningful social interactions”, or MSI, she said, “Mark Zuckerberg was presented directly with a list of soft interventions … about making slightly different choices to make the platform less viral, less twitchy.”
“Mark was presented with these options, and in April of 2020 opted not to remove downstream MSI, even in at-risk countries – which are countries at risk of violence – if it had any effect on the overall MSI metric,” she explained.
Ms Haugen fears what could happen if Facebook’s power is not controlled by regulation
In her opening statement, Ms Haugen said Facebook officials want senators to “believe that you have to choose between a Facebook full of divisive and extreme content, or losing one of the most important values our nation was founded upon: free speech” …
Credit: www.independent.co.uk