Through Childline, the NSPCC runs a tool in partnership with the Internet Watch Foundation that allows children and young people to report inappropriate images and have them removed from the internet.
What we did
- Service Design
- User Testing
- Content Strategy
Launched at the start of 2020, the tool received positive coverage but little actual usage by young people. We worked with the NSPCC team to test and develop the tool so that it more closely aligned with what young people expected when they found it.
Some of the key changes that we worked with the team to uncover included:
- the language used needs to match young people’s expectations – they don’t want to report images, they just want to have them removed
- one size doesn’t fit all – different age groups need different messaging so they know what to expect when they’re asked to verify their age
- accessing identification to verify your age might be a problem for some, especially halfway through the process – adding a checklist before they start could help increase completion rates
- better content design can reassure users they’re in the right place and support them to complete the process – including instructional copy, and positioning it effectively, reduces drop-offs
- if you’re asking people to create a Childline account, they need to know why. Make sure the copy on the page describes the benefits to them.
The updates were launched in June 2021, and since then more children have been able to use Report Remove to self-report a nude or sexual image or video of themselves and see if it can be removed from the internet.
NSPCC & IWF
The Report Remove user research with children and young people offered valuable insights into how children may approach using the Report Remove tool, which parts of the user journey were proving challenging for them, and what tangible improvements could be made.