The NSPCC is urging the UK Government to ensure children are better protected in private messaging environments, as recently published Home Office data reveals police forces in Wales recorded more than 2,000 child sexual abuse image offences last year (2023/24).

A total of 2,194 child sexual abuse image crimes were recorded by police in Wales last year, equating to around six per day.

Of these, 192 crimes were recorded in the Dyfed-Powys region and 535 in North Wales.

South Wales Police recorded the highest number, with 964 crimes, while 503 were recorded in Gwent.

A separate Freedom of Information request submitted by the NSPCC to police forces across England and Wales last year showed that of the offences where law enforcement recorded the platform used by perpetrators, exactly half (50%) took place on Snapchat and a quarter on Meta products – 11% on Instagram, 7% on Facebook and 6% on WhatsApp.

In response, a joint letter from charities, including the NSPCC, Marie Collins Foundation, Lucy Faithfull Foundation, Centre of expertise on child sexual abuse, and Barnardo’s, has been sent to Home Secretary Yvette Cooper and Secretary of State for Science, Innovation, and Technology Peter Kyle.

The letter expresses collective concern regarding Ofcom's final Illegal Harms Code of Practice published in December 2024. The charities argue that as it stands, children will not be protected from the worst forms of abuse on private messaging services under Ofcom’s plans, despite this being a core aim of the Online Safety Act.

Ofcom has stated that user-to-user services are only required to remove illegal content where it is ‘technically feasible’. The charities say this exception creates an unacceptable loophole, allowing some services to avoid delivering even the most basic protections for children.

Data from police forces on the number of recorded offences where the platform was known indicates private messaging sites are involved in more crimes than any other type of platform, with perpetrators exploiting the secrecy offered by these spaces to harm children and go undetected.

The NSPCC wants the UK Government to push Ofcom to review and strengthen its most recent codes of practice on tackling this threat to children's safety online.

The charity is also calling for private messaging services, including those using end-to-end encryption, to put robust safeguards in place so that their platforms do not act as a ‘safe haven’ for perpetrators of child sexual abuse.

End-to-end encryption is a secure communication method in which only the communicating users can read the messages exchanged. This means that service providers can be blinded to child sexual abuse material being shared through their platforms.

Insight from Childline provides further evidence of how young people are being targeted and blackmailed into sharing child abuse images through the calculated use of private messaging apps.

Last year, Childline delivered 903 counselling sessions to children and young people relating to blackmail or threats to expose or share sexual images online. This was a 7% increase compared to 2022/23.

One girl, aged 13 years, said: “I sent nude pics and videos to a stranger I met on Snapchat. I think he’s in his thirties. I don’t know what to do next. I told him I didn’t want to send him any more pictures and he started threatening me, telling me that he’ll post the pictures online. I’m feeling really angry with myself and lonely. I would like support from my friends, but I don’t want to talk to them about it as I’m worried about being judged.”

Chris Sherwood, NSPCC Chief Executive, said: "These offences cause tremendous harm and distress to children, with much of this illegal material being repeatedly shared and viewed online. It is an outrage that in 2025 we are still seeing a blatant disregard from tech companies to prevent this illegal content from proliferating on their sites.

“Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place. This enables crimes to continue to flourish on their platforms even though we now have the Online Safety Act.”

“The Government must set out how they will take a bold stand against abuse on private messaging services and hold tech companies accountable for keeping children safe, even if it requires changes to the platform’s design – there can be no excuse for inaction or delay.”