Social media companies have been warned not to encrypt messages unless they can guarantee that platforms will remain free of illegal sexual content.
The all-party parliamentary group (APPG) on social media says encryption could cripple attempts to detect illegal images.
The MPs, who opened an investigation last November into the “worrying” increase in so-called “self-generated” child sexual abuse material, urge companies to get involved and do more to protect children from online grooming and sexual abuse.
They say many witnesses have raised “very serious concerns” about the impact of encryption on child protection.
In their report, Selfie Generation – What’s Behind The Rise of Self-Generated Indecent Images Of Children?, the MPs write: “The APPG considers it completely unacceptable that a company should encrypt a service used by many children when doing so would do such harm to child protection.
“We recommend that technology companies do not encrypt their services until a workable solution has been found that is equivalent to the current safeguards for detecting these images.”
Self-generated content can include material recorded with webcams, very often filmed in the child’s own room and then shared online. In some cases, children are groomed, deceived, or blackmailed into producing and sharing a sexual image or video of themselves.
The report states that the trend “seems to have been exacerbated by the Covid-19 crisis”, and experts believe an increase in the number of offenders sharing child sexual abuse material during lockdown could fuel demand beyond the pandemic.
Labour MP Chris Elmore, chair of the APPG, said social media companies need to be more proactive in removing abusive images and making it clear to young users how to complain about them.
He said, “It is high time we took meaningful action to resolve this unacceptable mess.
“Children are exposed to a real risk of unimaginable cruelty, abuse and, in some cases, death every day.
“Social media companies are fundamentally failing to meet their obligations and simply ignoring the obvious moral duty to protect young users.
“They have to get to grips with institutional change, including the introduction of a duty of care owed by companies to their young users.”
The term “self-generated” should “not be understood to mean that such children share moral responsibility for their abuse,” he added.
Susie Hargreaves, director of the UK Safer Internet Centre, said: “We see the consequences of abuse, and when children are targeted and coerced into abusing themselves by criminal adult predators, it has a heartbreaking effect on children and their families.
“There is hope and there are opportunities for children and young people to defend themselves.
“The report removal tool we launched with Childline this year enables young people to have illegal pictures and videos of themselves removed.”
She added: “New laws will also help make a difference, and the upcoming online safety legislation is a unique opportunity to make the UK a safer place to be online, especially for children.”