Guideline proposes design principles for technology to reduce child sexual abuse material viewing

Key Takeaway
Consider these qualitative design principles preliminary; their efficacy remains unproven.

This guideline presents findings from a qualitative study involving interviews with 31 at-risk individuals and four focus group discussions with service providers (therapists and managers) across Belgium, Germany, the Netherlands, and the United Kingdom. The research aimed to inform the user-centered design of the Salus prototype, a technological prevention tool for individuals concerned about viewing child sexual abuse material (CSAM).

Key qualitative findings include that privacy and security concerns (potential discovery of the app, data security, and legal consequences) are the main worries for potential users. There was consensus on the value of blocking CSAM, but opinions on an optional adult content filter were not unanimous. Potential users welcomed interactivity features such as a diary, statistics, resources, and feedback.

Based on these findings, the guideline proposes seven evidence-based design principles for user-centered harm-reduction technology. The authors do not report specific limitations; the study was funded by the European Commission. As a qualitative study, it provides no quantitative effect estimates, and no conclusions about efficacy or effectiveness should be drawn.

Clinicians should recognize these design principles as preliminary and grounded in user perspectives, but further research is needed to evaluate the prototype's impact on actual behavior change.

Study Details

Study type: Guideline
Evidence: Level 5
Published: May 2026
Original Abstract
Introduction
The volume of Child Sexual Abuse Material (CSAM) available online and the global demand for it have reached unprecedented levels. Increasing numbers of individuals concerned about their online behaviour are contacting therapeutic providers for help and support outside of the criminal justice system. Previous research asking individuals what would help them to stop viewing CSAM suggests that the availability of a technological solution to voluntarily self-manage access to CSAM could be an effective tool.

Aim
To explore the findings from the user-centered design (UCD) of the 'Salus' prototype, a technological prevention tool to support effective self-management of individuals at risk of committing a first or further CSAM offence(s).

Materials and methods
In this two-year, European Commission-funded project we conducted research in four European countries: Belgium, Germany, the Netherlands, and the United Kingdom (UK). For the UCD phase of the project we conducted semi-structured interviews with 31 at-risk individuals in Belgium (n=10), Germany (n=10) and the UK (n=11) to explore the specific needs, design features, deployment methods, and concerns and barriers for the design, functionality and deployment of Salus. Additionally, four focus group discussions (FGDs) were held in Belgium, the Netherlands, and the UK with service providers (primarily therapists and managers) with extensive experience of supporting individuals at risk of committing CSAM offences, to explore the same questions at the service level.

Results
In terms of privacy and security, the potential discovery of apps such as Salus, data security, and the legal consequences of app usage are the main concerns of potential app users. There was consensus on the value of blocking CSAM, but opinions on the inclusion of an optional adult sexual content (pornography) filter in the Salus design were not unanimous. Users should be able to switch a pornography filter on and off at their convenience. Blocking notifications should be quiet and subtle. Interactivity features are welcomed by potential users; these may include a diary function, a personal CSAM statistics page, a resources section, and a function to allow users to provide feedback to the app developers. Such features should be optional for users in order to prevent any unintended consequences of app usage. Finally, app deployment must be safe and secure.

Conclusion
Based on these findings, we propose seven evidence-based design principles for user-centered harm-reduction technology: privacy-by-default architecture; discretion through design ambiguity; adaptive notification systems; optional interactivity with user control; trusted-channel deployment; progressive trust building; and fail-safe harm prevention. These principles provide a framework for app developers and researchers working on similar technologies to develop interventions that reduce harmful behaviours.
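For app developers, the sketch below shows one way several of these principles (privacy-by-default architecture, adaptive notification systems, and optional interactivity with user control) might translate into application defaults. It is a minimal illustration in TypeScript; the interface, field names, and default values are assumptions inferred from the study findings, not the actual Salus implementation.

```typescript
// Hypothetical sketch only: the Salus codebase is not public, so every
// name and default below is an assumption based on the reported findings.

interface PreventionAppSettings {
  // Privacy-by-default architecture: nothing leaves the device unless
  // the user explicitly opts in.
  localOnlyStorage: boolean;
  shareFeedbackWithDevelopers: boolean;

  // The optional pornography filter can be toggled at the user's
  // convenience; the study reports consensus on the value of blocking
  // CSAM itself, so no off switch is modeled for that core function
  // here (fail-safe harm prevention).
  pornographyFilterEnabled: boolean;

  // Adaptive notification systems: blocking alerts stay quiet and subtle.
  notificationStyle: "silent" | "subtle" | "standard";

  // Optional interactivity with user control: each feature is off until
  // the user enables it, to avoid unintended consequences of app usage.
  diaryEnabled: boolean;
  statisticsPageEnabled: boolean;
  resourcesSectionEnabled: boolean;
}

const defaultSettings: PreventionAppSettings = {
  localOnlyStorage: true,
  shareFeedbackWithDevelopers: false,
  pornographyFilterEnabled: false,
  notificationStyle: "subtle",
  diaryEnabled: false,
  statisticsPageEnabled: false,
  resourcesSectionEnabled: false,
};
```

The defaults encode the study's recurring theme: every data-sharing or interactive feature starts disabled and is enabled only by the user, while the core blocking function is assumed to remain always active.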