NEW YORK — Facebook, following damaging testimony that its platforms harm children, is offering a slew of features, including prompting teens to take a break from its photo-sharing app Instagram and "nudging" teens who repeatedly view content that is not conducive to their well-being.


Menlo Park, Calif.-based Facebook also plans to introduce new optional controls so that a parent or guardian can monitor what their teen is doing online. The move comes after Facebook announced late last month that it was halting work on its Instagram for Kids project. But critics say the plan lacks detail and doubt the new features will be effective.

The new controls were outlined on Sunday by Nick Clegg, Facebook's vice president for global affairs, who made the rounds on various Sunday news shows, including CNN's "State of the Union" and ABC's "This Week with George Stephanopoulos," where he was questioned about Facebook's use of algorithms as well as its role in spreading harmful misinformation ahead of the January 6 Capitol riots.


"We're constantly iterating to improve our products," Clegg told Dana Bash on "State of the Union" on Sunday. "We cannot, with a wave of the wand, make everyone's life perfect. What we can do is improve our products, so that our products are as safe and enjoyable to use as possible."

Clegg said Facebook has invested $13 billion over the years to keep the platform safe and that the company has 40,000 people working on these issues. And while Clegg said Facebook has done its best to keep harmful content off its platforms, he said the company is open to more regulation and oversight.


"We need greater transparency," he told CNN's Bash, noting that the systems Facebook has in place should be held to account, if necessary, by regulation so that "people can match what actually happens according to our systems."

The flurry of interviews came after whistleblower Frances Haugen, a former data scientist with Facebook, went before Congress last week accusing the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens, and of being dishonest in its public fight against hate and misinformation. Haugen's allegations were supported by thousands of pages of internal research documents she secretly copied before leaving her job in the company's civic integrity unit.

Josh Golin, executive director of Fairplay, a watchdog for the children and media marketing industry, said he doesn't think introducing controls to help parents supervise teens would be effective, since many teens set up secret accounts anyway. He was also skeptical about how effective nudging teens to take a break or steering them away from harmful content would be. He said Facebook needs to show exactly how it would implement these features and offer research demonstrating that the tools are effective.

“There is tremendous reason to be skeptical,” he said. He added that regulators need to restrict what Facebook does with its algorithms.

He said he also believes Facebook should cancel its Instagram for Kids project altogether.

When Clegg was grilled by both Bash and Stephanopoulos in separate interviews about the use of algorithms in amplifying misinformation ahead of the January 6 riots, he responded that if Facebook removed the algorithm, people would see more, not less, hate speech and more, not less, misinformation.

Clegg told both hosts that the algorithm worked as a “giant spam filter.”

Democratic Sen. Amy Klobuchar of Minnesota, who chairs the Senate Commerce Subcommittee on Competition Policy, Antitrust and Consumer Rights, told Bash in a separate interview Sunday that it is time to update children's privacy laws and bring more transparency to the use of algorithms.

"I appreciate that he's ready to talk things out, but I believe the time for talk is over," Klobuchar said, referring to Clegg's plan. "Now is the time to take action."