Several solutions were proposed in Dame Rachel’s report, including amending the Online Safety Act (OSA) to place a “clear duty of care” on social media platforms to stop showing adverts to children.
The Online Safety Act aims to make the internet safer for people in the UK, especially children, through a set of laws and duties that online platforms must follow. It is implemented and enforced by Ofcom.
The Act also requires firms to remove harmful material quickly once it is identified.
Dame Rachel also suggested adding changes to Ofcom’s Children’s Code of Practice to “explicitly protect children from body stigma content”.
But according to Ofcom, this is already part of the Code under “non-designated content”.
“Body stigma content can be incredibly harmful to children, which is why our codes require sites and apps to protect children from encountering it, and to act swiftly when they become aware of it,” a spokesperson for the regulator said.
And Dame Rachel called for stronger regulation and enforcement of online sales of age-restricted products – suggesting the government should consider restricting children’s access to some social media platforms.
“Urgent action is needed to create an online world that is truly safer by design,” said Dame Rachel.
“We cannot continue to accept an online world that profits from children’s insecurities and constantly tells them they need to change or must be better.”
A government spokesperson said it was “always clear” the OSA wasn’t the end of the conversation.
They added that the government had recently launched a national consultation on “bold measures to protect children online”, including potentially banning social media for under-16s.
