Facebook unveils new Instagram safeguarding measures for children

Facebook announced several new features to keep young people safe after claims that its platforms harm children.

New features include prompting teenagers to take a break from the Instagram photo-sharing app and “nudging” them when they repeatedly view content that may not be good for their wellbeing.

Facebook is also planning to introduce new optional controls that will allow parents and guardians to monitor their children’s online activities.

The initiatives, announced on Sunday by Facebook vice president of global affairs Sir Nick Clegg, came after the tech company announced late last month that it was suspending work on its Instagram for Kids project.

However, critics say the company only acted under pressure from outside.

Others have said the plan lacks detail and are skeptical of the effectiveness of the new features.
Sir Nick told CNN’s State of the Union programme: “We are constantly working to improve our products.

“We cannot with a wave of a magic wand make everyone’s life perfect. What we can do is improve our products so that they are safe and enjoyable to use.”

Sir Nick said Facebook has invested $13 billion (£9.5 billion) over the past few years to keep the platform safe, and that the company has 40,000 people working on these issues.

Whistleblower Frances Haugen, a former data scientist at Facebook, went before the United States Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teenagers, and of being dishonest in its public fight against hate and misinformation.

Ms Haugen’s allegations were accompanied by tens of thousands of pages of internal research documents that she secretly copied before leaving her job in the company’s civic integrity unit.

Ian Russell, who founded the Molly Rose Foundation suicide prevention charity after his 14-year-old daughter Molly took her own life in 2017 having viewed distressing content on Instagram, described the announcement as “a crisis communications initiative”.

He said: “Since Molly’s death almost four years ago, I’ve grown used to hearing the social platforms’ reaction when they tragically fall short and their practices are inevitably challenged.

“Their usual PR response is to announce a small change to their policies or processes; speak fine words that display their understanding and empathy; and then carry on with business as usual as far as possible.

“Any positive change that makes the platforms safer is welcome, of course, but over time I ask myself, ‘Why do the platforms wait before they react? Why don’t they use their unique influence and wealth to innovate and lead?’”

Andy Burrows, director of online child safety policy at the NSPCC, said: “It is no surprise that Facebook has renewed its commitment to child safety measures only in response to media pressure, rather than ensuring its sites are safe by design in the first place.

“In this case, that announcement came a week after damning evidence revealed that Facebook was aware of the way it harms children and did not react.

“It’s time tech companies stopped treating child safety as a PR exercise, which is why we need an online safety law that will force companies to finally make children a priority.”
