Digital Child Exploitation Filtering System Code of Practice - October 2024
A code for the operation of the Department of Internal Affairs’ website filtering system to prevent access to websites containing child sexual abuse material. Updated 31 October 2024.
Explanatory Statement
The expansion of the Internet has led to many positive developments. However, the fact remains that criminals, both individuals and organised groups, are also using this technology to produce, collect and distribute child sexual abuse material.
Child sexual abuse material is not ‘just images and stories’ but evidence of criminal activity. The possession and distribution of this material creates an international market that supports and encourages the portrayal of children as sexual objects and the abuse of children. Such images include films, pictures, photographs, drawings, and computer-generated images.
Where children are the victims of this activity, they can suffer the psychological effects of their abuse for many years after the physical offending has ended. Images distributed on the Internet never go away; with each download the victim is re-victimised.
The Digital Child Exploitation Filtering System (DCEFS) is designed to assist in combating the trade in child sexual abuse material by making it more difficult for persons with a sexual interest in children to access material of that nature.
The DCEFS is a key tool for reducing online harm and complements the enforcement activity undertaken by the Digital Safety Group of the Department of Internal Affairs (the Department). This activity includes online investigations into the trading of objectionable images on peer-to-peer networks, the prosecution of offenders and coordination with other enforcement agencies to have objectionable websites taken down.
The focus of international enforcement will continue to be the identification and rescue of victims and ensuring that these websites are quickly shut down and their owners prosecuted. However, not every legal system recognises the distribution of child sexual abuse material as a serious crime, and few enforcement agencies around the world have the resources and training to carry out online investigations and the forensic examination of computers.
New Zealand law contains no provision that specifically authorises the operation of a website filtering system or requires Internet Service Providers (ISPs) to connect to such a system. Participation in the DCEFS by ISPs is therefore voluntary. The DCEFS does not remove illegal content from its location on the Internet, nor does it prosecute the creators or intentional consumers of this material.
Contents
- Purpose
- Definitions
- Scope
- Independent Reference Group
- The Filtering List
- The Landing Page
- Review
- Data
- Code Development and Review
1. Purpose
1.1 The Digital Child Exploitation Filtering System (DCEFS) will contribute to the international effort to combat the trade in child sexual abuse material, both images and text. Reducing the market for such material will help ensure that fewer children are abused in support of that market.
1.2 The DCEFS will help reduce the number of New Zealanders who possess, distribute and make child sexual abuse material.
1.3 While the risk of inadvertent exposure to child sexual abuse images and text is low, the DCEFS will contribute to promoting a safer online environment for the New Zealand public.
2. Definitions
2.1 For the purposes of this Code of Practice:
child sexual abuse material is material determined to be such by:
(a) an Inspector of Publications;
(b) the Te Mana Whakaatu | New Zealand Classification Office; or
(c) the definitions applied by the Internet Watch Foundation (IWF) in its operation of the blocklist;
to the extent that the material meets the thresholds set out in the Films, Videos, and Publications Classification Act 1993, which include images and/or text that promote or support, or tend to promote or support, the exploitation of children, or young persons, or both, for sexual purposes and:
- clearly depict sexual conduct with or by children, or young persons, or both; or
- exploit the nudity of children, or young persons, or both.
an image includes any film, picture, photograph, drawing, or computer-generated image.
text includes any print or writing, including electronic text, images of text, and URLs promoting child sexual abuse or child sexual exploitation material.
3. Scope
3.1 The scope of the DCEFS is limited to preventing access to known websites that contain child sexual abuse material.
3.2 The DCEFS shall only prevent access to a website containing text, a drawing, or computer-generated image where the material clearly depicts child sexual abuse or child sexual exploitation.
3.3 The DCEFS will not prevent access to any website or impair any Internet traffic that is not clearly within the scope as defined in paragraph 3.1.
3.4 The DCEFS is a preventative measure that blocks access to websites depicting child sexual abuse material. It does not identify or track individuals attempting to access these sites.
3.5 The DCEFS does not remove illegal content from its location on the Internet, nor does it prosecute the creators or intentional consumers of this material.
3.6 In 2024, the DCEFS was updated to include a blocklist compiled by the Internet Watch Foundation (IWF). This international list contains a much greater number of URLs hosting child sexual abuse material than the previous New Zealand-focused version of the DCEFS.
4. Independent Reference Group
4.1 The Department has established an Independent Reference Group (IRG), the membership of which shall be representative of:
- enforcement agencies;
- the Classification Office;
- representatives of Internet Service Providers (ISPs);
- Internet safety agencies and groups;
- agencies and groups with an interest in the welfare of children and/or the prevention of sexual abuse;
- agencies and groups with an interest in the preservation of human rights.
4.2 The general function of the IRG is to maintain oversight of the operation of the DCEFS to ensure it is operated with integrity and adheres to the principles set down in this Code of Practice and described in the IRG Terms of Reference.
4.3 Meetings of the IRG will be held at least three times a year.
4.4 The IRG shall determine its own meeting procedures.
4.5 Members of the IRG shall meet their own costs for attendance at meetings of the Group.
4.6 The following information shall be made available to the members of the IRG:
- details of all appeal applications and the resulting action taken;
- details of any technical issues with the filter or connections to any ISP;
- such other information that may lawfully be provided to assist the IRG in fulfilling its function.
4.7 The IRG shall consider regular reports on the operation of the DCEFS.
4.8 Minutes of meetings of the IRG shall be published on the Department’s website. Operational information, e.g., URLs, will be omitted from the minutes and reports published on the Department’s website.
4.9 Agencies or groups who are interested in joining the IRG should contact: dcet@dia.govt.nz
5. The Filtering List
5.1 The Digital Safety Group in the Department of Internal Affairs maintains a record of sites (the filtering list) known to host child sexual abuse material. The filtering list is compiled from the following sources:
- reports of sites hosting child sexual abuse material made to the Department by members of the public, either directly or through partner organisations;
- intelligence obtained through partnerships with domestic and overseas enforcement agencies and approved non-governmental organisations that work on combatting the trade in child sexual abuse material;
- the blocklist compiled by the Internet Watch Foundation;
- proactive identification of sites hosting child sexual abuse material through investigation and/or forensic examination.
5.2 Where clarification is needed as to whether a website contains child sexual abuse material, the images or text in question shall be submitted to the Classification Office for a classification.
5.3 Additions to the filtering list from the IWF are made regularly by Department staff on standard business days. Websites are only uploaded to the IWF filtering list after the determination process has confirmed that the website meets the criterion of containing images or text that promote or support, or tend to promote or support, the exploitation of children, or young persons, or both, for sexual purposes.
5.4 Inspectors of Publications conduct periodic quality assurance checks to ensure confidence that the filtering system is blocking only child sexual abuse material. The IWF also conducts regular checks to ensure that websites that no longer contain child sexual exploitation material are removed from the blocklist.
5.5 A summary of the average number of URLs blocked by DCEFS per month will be provided to the IRG at each meeting. Full details can be requested by the IRG if more information is required.
6. The Landing Page
6.1 When a person requests a site that is on the filtering list, they shall be presented with a landing page.
6.2 The landing page is designed to:
- inform the requester that they have been prevented from accessing the requested website and why;
- provide the requester with a link to the Department of Internal Affairs website, where additional information about the operation of the DCEFS and help-seeking information can be found;
- provide the requester with links to specialist treatment services;
- provide the requester with a method to appeal the action.
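The routing behaviour described above can be sketched in a few lines. This is an illustrative sketch only, not the Department's or any ISP's actual implementation: the names `FILTERING_LIST`, `LANDING_PAGE_URL` and `route_request`, and the example hosts, are all hypothetical.

```python
# Hypothetical sketch of how a participating ISP might answer a request:
# hosts on the filtering list (section 5) are served the landing page
# (section 6); all other traffic passes through unchanged (section 3.3).

LANDING_PAGE_URL = "https://landing.example.govt.nz/"  # hypothetical address

# A hypothetical filtering list of known hosts.
FILTERING_LIST = {"blocked.example.org"}

def route_request(host: str) -> str:
    """Return the URL the requester should actually be served."""
    if host in FILTERING_LIST:
        # Blocked: show the landing page, which explains the block and
        # offers help-seeking links and an appeal mechanism (section 6.2).
        return LANDING_PAGE_URL
    # Not on the list: the request is not impaired (section 3.3).
    return f"https://{host}/"

print(route_request("blocked.example.org"))  # the landing page URL
print(route_request("example.com"))          # passes through unchanged
```

The key property the Code requires is visible in the sketch: the decision depends only on whether the requested host is on the list, never on who is asking.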
7. Review
7.1 A person who considers that they have been wrongly blocked from visiting a legitimate website may appeal the inclusion of the blocked webpage on the filtering list.
7.2 The process for the submission of an appeal shall:
- be expressed and presented in a clear and conspicuous manner;
- ensure the privacy of the requester is maintained by allowing an appeal to be lodged anonymously.
7.3 Each appeal will be considered by an inspector, who shall re-examine the website concerned to determine whether it still meets the criterion for inclusion on the filtering list.
- where the website in question was provided by one of the Department's partner agencies and an Inspector of Publications determines that the website should not be blocked, the Department will act on behalf of the appellant to seek the removal of the website from the blocklist.
7.4 Where the information supplied by an appellant is inadequate, a reasonable effort will be made to correctly identify the website.
7.5 Each appeal and the resulting action shall be recorded appropriately. Where the appellant has provided contact information, they will be informed of the decision.
7.6 A summary of the appeals received by the Department and the actions taken will be provided to the IRG at each meeting.
8. Data
8.1 During the course of the filtering process the DCEFS will log data related to the website requested, the identity of the ISP that the request was directed from, and the requester’s IP address.
8.2 The system will anonymise the IP address of each person requesting a website on the filtering list and no information enabling the identification of an individual will be stored.
8.3 The collection of this data is necessary so that the system can be monitored to ensure 24-hour, 365-day availability for participating ISPs, with no loss of business due to a technical glitch or fault.
8.4 The logs will be used to troubleshoot the connections between the Department’s system and the ISP.
8.5 Data shall not be used in support of any investigation or enforcement activity undertaken by the Department.
8.6 Data may be used for statistical or reporting purposes; for example, to inform the Department of the level of demand in New Zealand for child sexual abuse material.
8.7 The Digital Safety Group may:
- use this data to identify other websites containing objectionable content;
- share website data and insights gathered during the assessment phase with other agencies for potential inclusion on their blocklists, e.g. the INTERPOL ‘Worst of’ list or the Internet Watch Foundation blocklist.
8.8 The logs will be kept for 30 days, which is the standard period for retaining troubleshooting logs and is consistent with Rule 9 of the Telecommunications Information Privacy Code 2020. At the end of this period the logs are manually deleted, and a record is made of the deletion.
8.9 Statistical data will be reported at the next meeting of the IRG.
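The data-handling principles in paragraphs 8.1, 8.2 and 8.8 can be illustrated with a minimal sketch. This is not the Department's implementation: the functions `anonymise_ip`, `make_entry` and `purge`, the salted-hash approach, and all field names are assumptions chosen purely to show one way the stated properties (no identifiable individual stored; 30-day retention) could be met.

```python
# Hypothetical sketch: log entries that record the requested site and ISP
# (8.1) with the requester's IP anonymised by a one-way salted hash (8.2),
# and a purge step that enforces the 30-day retention period (8.8).

import hashlib
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)

def anonymise_ip(ip: str, salt: str) -> str:
    """One-way hash of the IP address; the raw address is never stored."""
    return hashlib.sha256((salt + ip).encode()).hexdigest()[:16]

def make_entry(ip: str, isp: str, url: str, salt: str) -> dict:
    """Build a log entry holding only anonymised, troubleshooting data."""
    return {
        "when": datetime.now(timezone.utc),
        "ip": anonymise_ip(ip, salt),   # anonymised per 8.2
        "isp": isp,                     # used for troubleshooting (8.4)
        "url": url,                     # the requested site (8.1)
    }

def purge(log: list[dict], now: datetime) -> list[dict]:
    """Keep only entries younger than the 30-day retention period (8.8)."""
    return [entry for entry in log if now - entry["when"] < RETENTION]
```

Because the hash is one-way, the stored value can still be used for aggregate statistics (8.6) while preventing the identification of any individual, consistent with 8.2 and 8.5.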
9. Code Development and Review
9.1 The Code of Practice will be reviewed every three years.
9.2 Anyone may request a review or amendment to the full Code or parts of the Code at any time, by emailing dcet@dia.govt.nz and including, if applicable, particular sections of the Code that require attention.
9.3 Members of the IRG may raise requests to review the Code at IRG meetings for discussion with the wider group.
9.4 ISPs must be made aware of any significant variations to the Code referenced in their existing Agreement with the Digital Safety Group.