More than 70 million alerts have been sent to individuals attempting to access child sexual abuse material (CSAM) online over the past two years, according to the Lucy Faithfull Foundation. The alerts are part of Project Intercept, a collaboration between the child protection charity and technology companies including Google, TikTok, and Meta. They warn users that accessing such content is illegal and direct them towards support services aimed at changing harmful behaviour.
Project Intercept and its approach to tackling CSAM
Project Intercept operates in 131 countries and covers a variety of online platforms, including end-to-end encrypted services and AI chatbot platforms. Rather than simply blocking access to illegal content, the project sends warning messages that inform users of the illegality of viewing CSAM and provide links to confidential advice and self-help resources.
The Lucy Faithfull Foundation reported that nearly 700,000 people have accessed its Stop It Now resources after receiving these alerts. These resources offer confidential advice and tools designed to help individuals change their behaviour. While this number represents a small fraction of those who received alerts, experts note that engagement among those who do seek help is relatively high.
“Given that 70 million warning messages have been sent, the fact that only 700,000 people click through to get support seems low. This is disappointing, given that the scale of the problem of child sexual abuse imagery online is growing fast,” said Professor Sonia Livingstone, director of the Digital Futures for Children centre at London School of Economics. “On the other hand, since four in five of those people who seek support do engage with the resources provided, that suggests the system is working for those who are really motivated to get help.”
Impact and expert perspectives on the 70 million alerts
In 2024 and 2025, an average of 28,000 users per month were redirected to support materials, with more than 80% going on to interact with the content. However, the foundation has not published data on whether this engagement leads to longer-term behaviour change.
Deborah Denis, chief executive of the Lucy Faithfull Foundation, emphasized the importance of timely intervention: “By placing warnings at the moment harmful behaviour is happening, we can disrupt it and divert people towards help to change.” She also noted that the approach could be expanded further.
The NSPCC highlighted that while such interventions can disrupt harmful behaviour, they should be part of a broader strategy aimed at preventing the creation and sharing of illegal material. The charity called on technology companies to increase their efforts in tackling the spread of CSAM.
Emma Hardy, Communications Director at the Internet Watch Foundation, stressed the need for innovative solutions, especially on encrypted platforms. She said, “As it is, it is simply too easy to share and distribute child sexual abuse imagery online, and for children to become trapped in cycles of exploitation.” Hardy added that “safety by design” should be a guiding principle for new products and platforms, to stop such behaviour being hidden online.
Regulatory and industry responses
Ofcom, the UK communications regulator, stated that warning messages are part of its expectations under the Online Safety Act. Almudena Lara, Child Protection Policy Director at Ofcom, said the data shows both progress and the significant scale of the problem that remains.
Technology companies involved in Project Intercept view the initiative as complementary to existing content moderation systems. Griffin Hunt, a product manager at Google Search, reported that changes made in early 2025 have resulted in “greater engagement with therapeutic help services” and a reduction in follow-up searches for illegal material.
Mega, a company providing encrypted cloud storage, also participates in the project. Mega noted that the initiative challenges the assumption that encrypted services cannot intervene early to address harmful behaviour.