Frances Haugen, a 37-year-old data scientist and former product manager on Facebook’s civic misinformation team, has come forward and revealed herself to be the whistleblower behind the leak of thousands of pages of potentially damaging internal documents to The Wall Street Journal.
Ms Haugen, who left the Big Tech giant in March this year, appeared on 60 Minutes on CBS on Sunday evening to discuss her efforts to expose wrongdoing at Facebook, accusing the company of prioritising inflammatory content over the public good and of choosing to “pay for its profits with our safety”.
Her previous allegations, published anonymously in the WSJ in recent weeks, have included claims that celebrities, politicians and other high-profile users were treated differently from ordinary people by the site and exempted from certain moderation protocols under a system known as “XCheck”; that the company’s responses to tip-offs about human smugglers and drug cartels using its pages were often subdued; and that Facebook actively engaged in targeted self-promotion under “Project Amplify”.
She also revealed that the company is embroiled in a lawsuit with a group of its own shareholders, who allege that its £3.65bn payment to the US Federal Trade Commission to settle the Cambridge Analytica data scandal was so high because it was designed to shield founder Mark Zuckerberg from personal liability.
Perhaps Ms Haugen’s most shocking allegation was that Facebook’s own research showed its lifestyle app Instagram was harmful to the mental health and self-esteem of teenage girls, but that the company failed to take steps to address the problem.
According to documents she obtained, an internal study found that 32 per cent of young women who said they felt bad about their bodies reported feeling worse after logging into Instagram.
Talking to 60 Minutes host Scott Pelley, the analyst, who has previously worked at other Silicon Valley big beasts like Google, Pinterest and Yelp, said that although she did not believe Mr Zuckerberg had set out to create a harmful platform: “The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”
“What I saw over and over again at Facebook was a conflict of interest between what was good for the public and what was good for Facebook,” she said. “And Facebook has repeatedly chosen to optimise for its own interests, such as making more money.”
Ms Haugen said she believed changes made in 2018 to the algorithm governing users’ news feeds had seen the company move towards prioritising content designed to provoke a strong emotional response.
“Its own research is showing that content that is hateful, that is divisive, that is polarizing, it is easier to drive people to anger than other emotions,” she told CBS.
She also alleged that Facebook turned on safety systems during the 2020 US presidential election to prevent the spread of misinformation, but that “as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritise growth over safety”, enabling some of the people who staged the 6 January Capitol riot in Washington, DC, to organise via Facebook, among other platforms.
Before the programme aired, Nick Clegg, the company’s vice president of global affairs (and former Liberal Democrat leader), released an internal memo in which he wrote: “The evidence that exists simply does not support the idea that Facebook, or social media in general, is the primary cause of polarisation.”
Mr Clegg later appeared on CNN on Sunday to reiterate his view that it was “ridiculous” to suggest that social media was to blame for the attack on the Capitol by supporters of Donald Trump, who were seeking to overturn the 45th president’s defeat at the ballot box.
CBS was also given a response by Facebook’s director of policy communications, Lena Pietsch, who said in a statement: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.
“If any research had identified precise solutions to these complex challenges, the tech industry, governments and societies would have solved them long ago.”
Granthshala reached out to Facebook for further comment and was directed to Ms Pietsch’s earlier statement.
Yael Eisenstat, another former employee who has become an outspoken critic of the social network, told Vocal that Ms Haugen’s revelations were “a big moment” for the company.
“For years, we have known about many of these issues – through journalists and researchers – but Facebook has been able to claim that they had an axe to grind and therefore their word should not be trusted. This time, the documents speak for themselves.”
Ms Haugen will next appear before a Senate subcommittee on Tuesday at a hearing titled “Protecting Kids Online”, concerning the company’s research into the impact of Instagram on young users’ mental health.
Credit: www.independent.co.uk