Internet Watch Foundation

Database Filters

Summary

A service with a website through which people can report online child sexual abuse images and videos, non-photographic child sexual abuse images and videos (such as computer-generated imagery) and criminally obscene adult content.

Type of intervention

Online

Target groups, level of prevention and subgroups

  • Situations/Places | Internet/online | Online | Internet-related only

Target population

The Internet Watch Foundation (IWF) is the UK Hotline to report:

  • Online child sexual abuse images and videos
  • Non-photographic child sexual abuse images and videos (such as those that are computer generated)
  • Criminally obscene adult content

The IWF works with others to remove child sexual abuse images and videos globally. For the latter two areas of the IWF remit, action can be taken to remove content only when it is hosted in the UK, because other countries do not share the same laws on this content, unlike photographic child sexual abuse imagery and videos.

Key target audiences:

  • Internet-using men aged 18-24 years, as this age group is most likely to stumble across child sexual abuse imagery and least likely to report it. Their reasons for not reporting include fear of prosecution and a preference for ignoring it.
  • Internet users in general. The IWF works to ensure that its reporting hotline is easily found online when searched for.
  • Police officers, in particular those working in paedophile or public protection teams, and call takers, who need to be able to direct people correctly to the IWF hotline to report content within its remit. The IWF can also assist police officers with investigations into online child sexual abuse imagery.
  • Online companies and organisations whose services are vulnerable to abuse. The IWF provides services to companies to prevent internet users from stumbling across child sexual abuse images and to enable the expeditious removal of these images.

Delivery organisation

The IWF is the key organisation; however, the work relies on partnerships with the police, the National Crime Agency's CEOP Command, INHOPE hotlines all over the world (hotlines similar to the IWF based in other countries) and internet companies that host or provide access to online content.

The IWF is a membership body with 117 members as of March 2014. It is also a self-regulatory body for the online industry and is a registered charity.

Mode and context of delivery

Internet users can report webpages to the IWF for assessment via the IWF’s website, www.iwf.org.uk. The website can be found via search engines using a range of keywords. Search engine optimisation is key for individuals who are searching for a reporting body without having heard of the IWF previously.

IWF also promotes its reporting facility through its Facebook page www.facebook.com/InternetWatchFoundation and its Twitter handle @IWFhotline.

Reports can be taken 24 hours a day, 7 days a week, through its online reporting facility on its website. The reports can be made anonymously if desired. All reports are confidential.

Level/nature of staff expertise required

The IWF Hotline is staffed by one Hotline Manager, one Services Administrator (who delivers the technical side of the services the IWF offers), two Senior Internet Content Analysts and 10 Internet Content Analysts (ICAs).

They work a rota between 8.30am and 5.30pm, Monday to Friday. Training for the ICA role takes at least six months, as it is a specialist and unique role. Previous experience among the ICAs varies greatly, as the role is about qualities within the individual rather than experience in other positions; however, ICAs need to be able to use technology. The interview process is as follows: a values-based interview with an independent assessor takes place to assess a candidate’s reasons for applying and psychological suitability for the post, and to identify anything which may impede the candidate from performing and feeling comfortable in the role. Candidates then undergo a ‘normal’ job interview, before being shown child sexual abuse images and being given time to consider whether they want the job. All staff at the IWF undergo a police background check.

Intensity/extent of engagement with target group(s)

There is little engagement with those who make reports to the IWF. It is an online process with no direct contact, unless the reporter has left their details and asks for feedback about their report. On the rare occasion that someone makes multiple reports in quick succession and has provided contact details, the IWF emails them to explain what the law says about repeatedly accessing such content and that they could be placing themselves at risk.

The IWF has direct contact with police officers when it is helping with an investigation.

The IWF has regular contact with its members (organisations whose services are vulnerable to child sexual abuse images), which might include updating them on its work, inviting them to an event or arranging to meet them.

Description of intervention

The IWF was established in 1996 by the internet industry to provide a UK Internet Hotline for the public and IT professionals, so that they can report criminal online content in a secure and confidential way.

The IWF is an independent, self-regulatory body, funded by the EU and the online industry, including Internet Service Providers (ISPs), mobile operators, social media companies, content providers, hosting providers, filtering companies, search providers, trade associations and the financial sector. Its self-regulatory partnership approach is widely recognised as a model of good practice in combating the abuse of technology for the dissemination of criminal content.

The IWF works with the UK government to influence initiatives developed to combat online abuse, and this dialogue goes beyond the UK and Europe to promote greater awareness of global issues, trends and responsibilities.

The IWF works internationally with INHOPE hotlines (https://www.inhope.org/EN) and other relevant organisations to encourage united global responses to the problem and wider adoption of good practice in combating child sexual abuse images on the internet.

There are a number of tactics used by the IWF on a national and, where relevant, international basis, to minimise the availability of child sexual abuse content online:

  • Reporting mechanism for the public to report any inadvertent exposure to potentially criminal child sexual abuse content
  • ‘Notice and takedown’ system to swiftly remove child sexual abuse content at source in the UK
  • Targeted assessment and monitoring system to remove child sexual abuse content in newsgroups
  • Provision of a child sexual abuse URL list to ISPs, mobile operators, search providers and filtering providers to help disrupt access to child sexual abuse content which is hosted outside the UK and not yet taken down (an illustrative sketch of how such a list might be applied follows this list)
  • Working with domain name registries and registrars to deregister domain names dedicated to the distribution of child sexual abuse content
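
The profile does not specify how member companies apply the URL list technically. As a purely illustrative sketch, and not a description of the IWF’s actual list format or distribution mechanism, a filtering provider might normalise each requested URL and compare a hash of it against the supplied entries, along these lines (the hashing scheme and all names below are assumptions):

```python
import hashlib
from urllib.parse import urlsplit


def normalise(url: str) -> str:
    """Normalise a URL so equivalent forms compare equal
    (scheme dropped, host and path lower-cased, fragment ignored)."""
    parts = urlsplit(url.strip())
    host = parts.hostname or ""
    path = parts.path or "/"
    query = f"?{parts.query}" if parts.query else ""
    return f"{host}{path}{query}".lower()


class UrlBlockList:
    """Holds SHA-256 digests of blocked URLs, so lookups never handle
    the underlying addresses in plain text."""

    def __init__(self, hashed_entries: set) -> None:
        self._entries = hashed_entries

    @classmethod
    def from_urls(cls, urls):
        return cls({hashlib.sha256(normalise(u).encode()).hexdigest() for u in urls})

    def is_blocked(self, url: str) -> bool:
        return hashlib.sha256(normalise(url).encode()).hexdigest() in self._entries


# Example: refuse to serve a request whose URL appears on the supplied list.
blocklist = UrlBlockList.from_urls(["http://example.invalid/abuse-page"])
print(blocklist.is_blocked("HTTP://EXAMPLE.INVALID/abuse-page"))   # True
print(blocklist.is_blocked("http://example.invalid/other-page"))   # False
```

In practice such a check would sit inside the provider’s own filtering infrastructure; the point of the sketch is only that the list enables a fast membership test, so access can be disrupted without the provider holding the addresses in readable form.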

After the IWF receives a report, an Internet Content Analyst assesses the image or video against UK law. The imagery must fall within the IWF’s remit (child sexual abuse, criminally obscene adult content or non-photographic images of child sexual abuse). For the latter two, it must also be hosted in the UK if the IWF is to take any action.

Child sexual abuse imagery is assessed against the UK Sentencing Council’s guidelines on child sexual abuse (soon to move to three levels rather than five). The host location of the content is then traced to establish where in the world the imagery is hosted.
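
The tooling behind this tracing step is not described in the profile. As a rough, hypothetical sketch only (the lookup table and function below are placeholders, not the IWF’s process), establishing where a page is hosted broadly amounts to resolving its hostname to an IP address and mapping that address to a country:

```python
import socket
from urllib.parse import urlsplit

# Placeholder IP-to-country table for illustration only; a real workflow
# would consult an IP geolocation dataset.
IP_TO_COUNTRY = {"203.0.113.7": "NL"}


def trace_host(url: str):
    """Resolve the reported URL's hostname to an IP address and map that
    address to a hosting country ('unknown' if not in the table)."""
    hostname = urlsplit(url).hostname
    ip = socket.gethostbyname(hostname)            # DNS lookup
    return ip, IP_TO_COUNTRY.get(ip, "unknown")


ip, country = trace_host("http://example.com/reported-page")
if country == "GB":
    print(f"{ip}: UK-hosted, send a takedown notice to the hosting provider")
else:
    print(f"{ip}: hosted in {country}, refer to the corresponding hotline")
```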

The host country is then notified of the existence of the content. Where it is within the UK, a takedown notice is sent to the hosting provider to remove the content. Hosting providers typically remove the content within one to two hours.

If the content is hosted outside the UK, the corresponding hotline in that country (where there is one) is notified; 95% of child sexual abuse imagery is hosted in a country with a hotline. That hotline works with the relevant company to remove the imagery. Europe is quickest to react: typically 86% of content is removed within 10 days. In North America 68% is removed within 10 days, and for the rest of the world (which represents just 4% of content) the figure is 44%.

Analysis of the child sexual abuse images shows that 81% of the child victims are aged 10 years or under, 3% are aged two years or under, and 51% of images show sexual activity between adults and children, including rape or sexual torture.

In 2018, the IWF found a record amount of child sexual abuse imagery (105,047 URLs), due to improvements in its technology that helped speed up the detection and assessment of the criminal images.

Evaluations

Each project under IWF and each awareness raising activity is individually evaluated. In broad terms, the following has been achieved by the IWF Hotline:

  • The IWF found a record amount of child sexual abuse imagery in 2018 (105,047 URLs), due to improvements in its technology that helped speed up the detection and assessment of the criminal images.
  • 2018’s figures also show that the amount of child sexual abuse imagery hosted in the UK is at its lowest level ever recorded – 41 URLs or 0.04% of the global total. In 1996, 18% was hosted in the UK.
  • Every five minutes IWF analysts find an image or video of a child being sexually abused, and 4 out of 5 times this is hosted in a European country.
  • Almost half (47%) of all the imagery found last year was discovered in the Netherlands.
  • IWF has offered support to the Dutch organisation dealing with child sexual abuse imagery. 

  " For 23 years we have been removing from the internet images and videos showing the sexual abuse of children,” said CEO Susie Hargreaves OBE.

References

See IWF Annual Charity Reports at www.iwf.org.uk.

Contact details

IWF Director of External Relations
Telephone: +44 (0) 1223 203030
Email: media@iwf.org.uk or admin@iwf.org.uk
Website: www.iwf.org.uk
Twitter: @IWFhotline


INFORMATION CORRECT AT JULY 2021

RATING: Pioneering