Facebook’s success was built on algorithms. Can they also fix it?


Now, hours of testimony and thousands of pages of documents from Facebook whistleblower Frances Haugen have prompted a re-examination of the impact Facebook and its algorithms have on teenagers, democracy, and society at large. The fallout has raised questions about whether Facebook, and perhaps platforms like it, should use a bevy of algorithms to determine what images, videos, and news users see.

Haugen, a former Facebook product manager with a background in “algorithmic product management”, has focused primarily on the company’s algorithms designed to show users the content they are most likely to engage with. She has said these algorithms are responsible for many of Facebook’s problems, including polarization, misinformation, and other toxic content. Facebook, she said in a “60 Minutes” appearance, understands that if it makes the algorithm safer, “people will spend less time on the site, they will click on fewer ads, they will make less money.” (Facebook CEO Mark Zuckerberg has pushed back on the idea that the company prioritizes profit over the safety and well-being of users.)
Facebook’s head of global policy management, Monika Bickert, said in an interview with Granthshala after Haugen’s Senate hearing on Tuesday that it is “not true” that the company’s algorithms are designed to promote inflammatory content, and that Facebook actually does the “opposite,” demoting so-called click-bait.
At times in her testimony, Haugen suggested a radical rethinking of how News Feed should operate to address the issues she presented through extensive documentation from within the company. “I am a strong proponent of chronological ranking, ordered by time,” she said in her testimony before a Senate subcommittee last week. “Because I think we don’t want computers deciding what we focus on.”
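The contrast Haugen draws between chronological and engagement-based ranking can be sketched in a few lines of Python. This is an illustrative toy, not Facebook’s actual system: the `Post` fields and the predicted-engagement score are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since epoch (hypothetical field)
    predicted_engagement: float  # a model's guess at likes/clicks (hypothetical)

def chronological_feed(posts):
    # The ranking Haugen advocates: newest first, no model in the loop.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked_feed(posts):
    # The engagement-optimized status quo: highest predicted score first.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
```

The same posts can come out in very different orders: an older post that a model scores highly leads the engagement-ranked feed but sits far down the chronological one.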

But algorithms that pick and choose what we see are central not only to Facebook but to many social media platforms following in Facebook’s footsteps. For example, TikTok would be unrecognizable without a content-recommendation algorithm running the show. And the bigger the platform, the greater the need for algorithms for filtering and sorting content.

Algorithms are not going away. But there are ways for Facebook to improve them, experts in algorithms and artificial intelligence told Granthshala Business. Doing so, however, would require something Facebook has so far appeared reluctant to offer (despite executives’ talking points): greater transparency and control for users.


What’s in an algorithm?

The Facebook you experience today, with its constant flow of algorithmically selected information and ads, is a very different social network than it was in its early days. In 2004, when Facebook first launched as a site for college students, it was both simpler and more tedious to navigate: if you wanted to see what friends were posting, you had to go to their profiles one by one.
This began to change in a major way in 2006, when Facebook introduced News Feed, giving users a fire hose of updates from family, friends, and that guy they went on a few bad dates with. From the beginning, Facebook reportedly used algorithms to filter the content shown in News Feed. In a 2015 Time magazine story, Chris Cox, the company’s chief product officer, said curation was necessary even then because there was too much information to show all of it to every user. Over time, Facebook’s algorithms evolved, and users became accustomed to algorithms determining how Facebook content would be presented.

An algorithm is a set of mathematical steps or instructions, especially for a computer, that describes what to do with some input in order to produce some output. You can think of it roughly similar to a recipe, where the ingredients are the input and the final dish is the output. On Facebook and other social media sites, however, you and your actions – what you type or the pictures you post – are the input. What the social network shows you – whether it’s a post from your best friend or an ad for camping gear – is the output.
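To make the recipe analogy concrete, here is a deliberately tiny sketch in Python: a user’s activity goes in, a selection of posts comes out. The data shapes and field names are invented for illustration, not drawn from any real system.

```python
def recommend(user_interests, candidate_posts):
    # Input: a user's interests and a pool of candidate posts.
    # Output: the subset of posts the feed will show.
    return [post for post in candidate_posts
            if post["topic"] in user_interests]
```

A real feed-ranking pipeline layers many such steps (candidate selection, scoring, re-ranking), but each step is still a function turning inputs into outputs.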


At their best, these algorithms can help personalize feeds so users discover new people and content that matches their interests based on past activity. At their worst, as Haugen and others have pointed out, they risk sending people down harmful rabbit holes that can expose them to toxic content and misinformation. Either way, they keep people scrolling longer, potentially helping Facebook make more money by showing users more ads.

Several algorithms work together to create the experience you see online on Facebook, Instagram, and elsewhere. This can make it even more complicated to tease out what’s going on inside such systems, especially at a large company like Facebook, where multiple teams build different algorithms.

“If some higher power at Facebook were to go and say, ‘Fix the algorithm in X or Y way,’ it’s really hard because these have become really complex systems with many inputs, many weights, and multiple systems working together,” said Hilary Ross, a senior program manager at Harvard University’s Berkman Klein Center for Internet and Society and its Institute for Rebooting Social Media.

More transparency

There are, however, ways to make these processes clearer and give users more insight into how they work. Margaret Mitchell, who leads artificial intelligence ethics at AI model builder Hugging Face and formerly co-led Google’s ethical AI team, believes this could be done by allowing you to see details about why you’re seeing what you’re seeing on a social network, such as which posts, advertisements, and other content you viewed and interacted with led to it.

“You could even imagine having some say in it. You might be able to select preferences for the things you want optimized for you,” she said, “such as how often you see content from your immediate family, pictures of high school friends, or babies. All those things can change over time. Why not let users control them?”

Transparency is important, she said, because it encourages good behavior from social networks.

Another way social networks could be pushed toward greater transparency is through independent auditing of their algorithmic practices, according to Sasha Costanza-Chock, director of research and design at the Algorithmic Justice League. They envision this being done by fully independent researchers, investigative journalists, or people inside regulatory bodies – not the social media companies themselves, or companies they hire – who have the knowledge, skills, and legal authority to demand access to algorithmic systems in order to ensure laws are not violated and best practices are followed.

James Mickens, a computer science professor at Harvard and co-director of the Berkman Klein Center’s Institute for Rebooting Social Media, suggested looking at how elections can be audited without revealing private information about voters (such as whom each person voted for) for insights into how algorithms could be audited and improved. He believes this could inform the design of an auditing system that would let people outside Facebook provide oversight while protecting sensitive data.

Other Metrics for Success

Experts say a major barrier to making meaningful improvements is social networks’ focus on engagement, or the amount of time users spend scrolling, clicking, and otherwise interacting with social media posts and ads.

Internal documents revealed by Haugen show the social network knows that its “core product mechanics, such as optimization for virality, recommendations, and engagement, are an important part” of why hate speech and misinformation “flourish” on its platform.

This is difficult to change, experts said, though many agree it may involve considering how users feel when using social media, not just the amount of time they spend using it.

“Engagement is not synonymous with good mental health,” Mickens said.

Can algorithms really help fix Facebook’s problems? Mickens, at least, is hopeful the answer is yes. He believes they can be tailored more toward the public interest. “The question is: What will convince these companies to think this way?” he said.

In the past, some would have said that this would require pressure from advertisers, whose dollars back these platforms. But in her testimony, Haugen seemed to be betting on a different answer: pressure from Congress.


Credit : www.cnn.com
