Further data from the NSPCC's FOI request to English and Welsh police forces showed that, where the platform used by the perpetrator was recorded, half of offences took place on Snapchat and a quarter on Meta products such as Instagram, Facebook and WhatsApp. This data suggests that private messaging sites are involved in more of these crimes than any other type of platform, meaning perpetrators can exploit the secrecy these spaces provide to abuse children undetected. Through these apps, criminals pressure and blackmail young people into sharing child abuse images. This is something our Childline counsellors hear about from children regularly.
Last year, Childline delivered 903 counselling sessions related to threats to publish sexual images online. A 13-year-old girl told us: "I think he's in his 30s. I told him I didn't want to send him any more photos and he started threatening me, saying he would post them online. I feel angry with myself and lonely. I want support from my friends, but I don't want to talk to them because I'm worried about being judged."
Tens of thousands of child sexual abuse image crimes continue to be recorded by police every year, and much of this material is shared repeatedly online. In 2025, we still see blatant disregard from tech companies that fail to stop this illegal content from spreading on their sites.

The NSPCC has joined other charities, including the Marie Collins Foundation, the Lucy Faithfull Foundation and Barnardo's, in writing to Home Secretary Yvette Cooper and Secretary of State for Science, Innovation and Technology Peter Kyle to raise concerns about the enforcement of the online safety legislation. The law, which the NSPCC fought to have introduced over five years ago, is designed to protect children online. However, Ofcom's recently published codes of practice for illegal harms fail to protect children from abuse on private messaging services. By requiring user-to-user services to remove illegal content only where it is "technically feasible", Ofcom's codes create a loophole that allows services to avoid providing the most basic protections. Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place.
The Government must hold tech companies to account for keeping children safe. Otherwise, it will fail the many who have spent years campaigning, leave these platforms as safe havens for child sexual abuse perpetrators, and effectively undermine the online safety laws.