Online grooming crimes have hit record levels across London.
The figures, provided by the Metropolitan Police, show that 596 Sexual Communication with a Child offences were recorded last year*, more than quadrupling since the offence came into force in 2017/18, when 144 crimes were recorded.
Meanwhile, figures provided by 44 police forces across the UK show that 7,263 Sexual Communication with a Child offences were recorded last year*. Where police forces could be directly compared, the number of crimes had almost doubled (a 99% rise) since 2017/18, the first year the offence was in force.
Nationally, of the 2,111 offences where police could identify the platform used, 40% took place on Snapchat, 9% on WhatsApp and 9% on Facebook and Instagram.
Where gender was known, 80% of the children targeted were girls, and the youngest recorded victim of online grooming was a four-year-old boy.
The NSPCC highlights that while these are the offences recorded by police, the real number of crimes is likely to be much higher, because much of this abuse happens in private spaces where harm is harder to detect.
To tackle this issue, the charity is publishing new research setting out solutions that can be used to prevent, detect and disrupt grooming in private messaging spaces.
Online child sexual abuse crimes can have a long-term impact on a child, leaving them with feelings of guilt, shame, depression, confusion, anxiety and fear.
One 14-year-old who contacted Childline said: “I feel so insecure all the time, so, when this guy I’ve met online, who’s a few years older, started flirting with me, that made me feel so special. He seemed to care, but now he’s insisting I send him nudes, and I don’t know if he just gave me attention, so I’d send him nudes. I feel like I’ve been tricked but I’m afraid what he might do if I just block him. I can’t control how anxious this makes me feel.”
The charity’s new research identifies cycles of behaviour that perpetrators use, such as creating multiple profiles and manipulating young users into engaging with them across different platforms.
In response, the NSPCC is urging Ofcom and tech companies to take swift action on the recommendations set out in the report, so that they can better identify and prevent online grooming.
Recommendations include:
Implementing tools on a child’s phone that can scan for nude images and identify child sexual abuse material before it is shared.
Using metadata analysis, which draws on background information, such as when, where and how someone is using a platform, to spot suspicious patterns. It does not read private messages, but it can flag behaviours that suggest grooming, such as adults repeatedly contacting large numbers of children or creating fake profiles (a minimal illustration follows this list).
Creating barriers for adult profiles engaging with children on social media platforms, such as restrictions on who they can search for and how many people they can contact.
Securing commitments from tech platform leaders to deliver services that effectively support and balance user safety and privacy.
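To make the metadata-analysis recommendation more concrete, the sketch below is a minimal, hypothetical illustration rather than anything taken from the NSPCC report: the account names, fields and threshold are invented, and it simply shows how a platform could flag an account that initiates contact with an unusually large number of under-18 users while never inspecting message content.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical message-metadata records: sender, recipient, whether the
# recipient's account belongs to a minor, and a timestamp.
# No message content is inspected at any point.
events = [
    {"sender": "adult_account_1", "recipient": "user_a",
     "recipient_is_minor": True, "sent_at": datetime(2024, 5, 1, 22, 15)},
    {"sender": "adult_account_1", "recipient": "user_b",
     "recipient_is_minor": True, "sent_at": datetime(2024, 5, 1, 22, 40)},
    {"sender": "adult_account_2", "recipient": "user_c",
     "recipient_is_minor": False, "sent_at": datetime(2024, 5, 2, 9, 5)},
]

# Threshold chosen purely for illustration: an adult account that contacts
# this many distinct under-18 accounts is flagged for human review.
DISTINCT_MINOR_CONTACTS_THRESHOLD = 2

def flag_suspicious_senders(events, threshold=DISTINCT_MINOR_CONTACTS_THRESHOLD):
    """Return sender accounts whose metadata pattern (contacting many
    distinct minor accounts) warrants review. Uses metadata fields only."""
    minors_contacted = defaultdict(set)
    for event in events:
        if event["recipient_is_minor"]:
            minors_contacted[event["sender"]].add(event["recipient"])
    return [sender for sender, minors in minors_contacted.items()
            if len(minors) >= threshold]

print(flag_suspicious_senders(events))  # ['adult_account_1']
```

In practice a real system would combine several such signals (contact volume, account age, repeated profile creation) and route flagged accounts to human moderators; the sketch only illustrates the principle that suspicious patterns can be surfaced without reading private messages.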
The research shows that these safety measures must be introduced together to be effective, working in tandem to prevent harm across the whole grooming cycle.
The NSPCC is urging tech companies, Ofcom and Government to show leadership in addressing this devastating crime and commit to using every tool available to them to stop perpetrators in their tracks.
Chris Sherwood, NSPCC Chief Executive, said: “It’s deeply alarming that online grooming crimes have reached a record high across the UK, taking place on the very platforms children use every day.
“At Childline, we hear first-hand how grooming can devastate young lives. The trauma doesn’t end when the messages stop, it can leave children battling anxiety, depression, and shame for years.
“Tech companies must act now to prevent further escalation. The tools the NSPCC sets out to protect children are ready to use and urgently needed. Importantly, they mean that services can keep children safe while protecting all users’ privacy. Children’s safety must be built into platform design from the start, not treated as an afterthought.”