A coroner’s conclusion that Molly Russell died after suffering from “the negative effects of online content” should “send shockwaves through Silicon Valley”, online safety campaigners have said.
In his ruling, senior coroner Andrew Walker said online content viewed by the 14-year-old on sites such as Instagram and Pinterest was “not safe” and “should not have been available for a child to view”.
In response, the Russell family said it was “time to replace the toxic corporate culture at the heart of the world’s largest social media platform”.
Children’s charity the NSPCC said the findings of the inquest “should be a turning point” and that tech companies must now be held to account through the proposed online safety bill, rather than being allowed to put children’s safety second to business decisions on social media sites.
In their statement, the Russell family said: “It is time for the government’s Online Safety Bill to urgently deliver its long-promised legislation.
“It is time to protect our innocent young people, rather than allowing platforms to prioritize their profits by monetizing their misery.
“For the first time today, tech platforms have been formally held responsible for the death of a child.
“In the future, we as a family hope that any other social media companies called upon to assist an inquiry will follow the example of Pinterest, which took steps to learn lessons and engaged with the inquiry process with honesty and respect.
“For Molly’s sake… let’s make the online world a place that prioritizes the safety and well-being of youth over the money that can be earned from them.”
Prime Minister Liz Truss has confirmed that the Online Safety Bill, which has been in development for more than five years, will soon return to Parliament after several delays.
The bill aims to introduce rules for social media and other sites based on user-generated content, compelling them to remove illegal content from their platforms, with a particular emphasis on protecting children from harmful material.
All platforms within scope will have a duty to find and remove illegal content, with the largest also expected to deal with what is termed “legal but harmful” content, as well as to provide clear and robust terms and conditions.
In the wake of the inquest into Molly’s death, the NSPCC urged the government to take prompt action to bring in the bill.
Sir Peter Wanless, chief executive of the NSPCC, said: “The ruling should send shockwaves through Silicon Valley – tech companies must expect to be held to account when they put the safety of children second to business decisions. The magnitude of this moment for children everywhere cannot be understated.
“Molly’s family will forever pay the price of Meta and Pinterest’s blatant failure to protect her from content that no child should see, but the Online Safety Bill is a once-in-a-generation opportunity to address this imbalance between families and big tech.
“This should be a turning point, and any further delay or watering down of legislation that addresses the preventable abuse of our children would be inconceivable to parents across the UK.”
Online safety campaigners have long called for increased regulation of social media and broader platforms, arguing that the current system of self-regulation, led by each site’s own terms of service and content moderation processes, is inadequate.
Earlier this month, new Culture Secretary Michelle Donelan said the government was planning to make changes to the Online Safety Bill, but would not reduce the protections it contains, especially for children.
After the coroner’s ruling, pressure on the government to bring the bill back to Parliament expeditiously is likely to increase.
Credit: www.independent.co.uk