General policy statement
The objective of this Bill is to update the Films, Videos, and Publications Classification Act 1993 (the Act) to allow for urgent prevention and mitigation of harms caused by objectionable publications.
The Bill provides additional regulatory tools to manage harms caused by content that is livestreamed or hosted by online content hosts. The Bill mostly relates to online publications; however, one aspect of the Bill (urgent interim classification assessments) will apply to all publications covered by the Act.
The Bill makes the key changes outlined below.
Bill makes livestreaming of objectionable content criminal offence
The Bill makes livestreaming of objectionable content a criminal offence under new section 124AB of the Act. Livestreaming is the online transmission of events in real time. Any digital reproduction of a livestream is a recording and is therefore subject to existing provisions in the Act relating to publications.
The criminal offence of livestreaming objectionable content only applies to the individual or group livestreaming the content. It does not apply to the online content hosts that provide the online infrastructure or platform for the livestream.
Bill confers additional authority on Chief Censor
The Bill enables the Chief Censor to make swift, time-limited interim classification assessments of any publication in situations where the sudden appearance and (in the case of online publications) viral distribution of objectionable content is injurious to the public good.
The Act sets out procedures for submitting publications to the Chief Censor and for how the Chief Censor examines a submitted publication to determine its classification status. Following these procedures takes time and does not suit the urgency of situations in which the availability and uncontrolled dissemination of objectionable content will be injurious to the public good. The need to notify the public as quickly as possible of the harm that this objectionable content can cause, and to limit that harm, has the highest priority.
This mechanism provides the public, anyone who provides publications to the public (including, in the case of online publications, service providers and online content hosts), and enforcement agencies with immediate clarity on a probable classification status. This will minimise the rate at which objectionable content spreads and reduce its potential to cause harm. The interim assessment has the same effect as a classification decision and would be in place for a maximum of 20 working days before the Chief Censor would be required to issue a final written decision.
Under Bill take-down notices can be issued requiring removal of objectionable online content
The Bill authorises an Inspector of Publications to issue a take-down notice for objectionable online content. The take-down powers are aligned with current powers of seizure of objectionable publications under the Act.
Take-down notices will be issued to an online content host, directing the removal of a specific link, so that the relevant objectionable material is no longer viewable in New Zealand. An online content host that does not comply with a notice to take down content as soon as is reasonably practicable (without reasonable justification for delays) will be subject to civil pecuniary penalties. It is intended (but not required by the Bill) that the authority to issue a take-down notice will only be exercised in situations where other options for seeking the removal of objectionable content online have proven ineffective. The current collaborative practice of requesting online content hosts to voluntarily remove identified objectionable content will continue to be the first and preferred approach.
Online content hosts will be subject to civil pecuniary penalty if they do not comply with issued take-down notice
Although it is an offence against the Act to make, possess, supply, or distribute an objectionable (analogue) publication, the Act’s provisions are not explicit in how they may apply to online publications and the hosting of objectionable content online. In addition, the financial penalties for non-compliance are small relative to the revenue of large multinational corporations, and are therefore ineffective as deterrents.
Under the Bill, a civil pecuniary penalty will be imposed on online content hosts that do not comply with an issued take-down notice in relation to objectionable online content. This change will bring online content hosts into line with the expectations placed on businesses operating in New Zealand in relation to physical (analogue) content classified as objectionable.
Under Bill safe harbour provisions in Harmful Digital Communications Act 2015 will not apply to objectionable online content
Section 24 of the Harmful Digital Communications Act 2015 (the HDC Act) provides that online content hosts cannot be held liable under New Zealand law for hosting harmful content on their platforms if they follow certain steps when a complaint is made. This creates the potential for online content hosts to be exempt from any criminal or civil liability under the Act (which is concerned with more serious content) provided they follow the steps outlined in the HDC Act. Under the Bill, section 24 of the HDC Act will not apply to the operation of the Act. No amendments will need to be made to the HDC Act. This means that enforcement of the new offence or modified offences in the Act will not be limited by the HDC Act safe harbour provisions for online content hosts. It will ensure that online content hosts can be prosecuted for hosting objectionable content if they are liable for doing so.
Bill facilitates setting up of future mechanisms for blocking or filtering objectionable online content
In New Zealand, the only current government-backed web filter is designed to block child sexual exploitation material (the Digital Child Exploitation Filtering System). This filter is voluntary and operates at the Internet service provider (ISP) level. It currently applies to about 85% of New Zealand’s ISP connections.
The Bill facilitates the establishment of a government-backed (either mandatory or voluntary) web filter if one is desired in the future. It provides the Government with explicit statutory authority to explore and implement such mechanisms through regulations, following consultation.
Filters can be circumvented by those actively seeking content through tools such as a virtual private network. These limitations mean that there is a risk that a web filter could provide only limited benefits while also impacting on freedom of expression.
Accordingly, the regulation-making powers under the Bill require that any regulations to establish a filter must—
clarify the criteria for identifying and preventing access to objectionable content that the filter would block:
clarify governance arrangements for the system:
specify reporting arrangements for the system:
clarify the review process and right of appeal should an ISP, online content host, or other individual or entity dispute a decision to prevent access to a website, part of a website, or an online application:
clarify the obligations of ISPs in relation to the operation of the system:
provide detail of how data security and privacy provisions would be addressed.
Any filter established will be limited to addressing a specific form of objectionable content and would focus on web page filtering. The filter would not extend to messaging applications and other online services. It is intended to be designed so that the impact on freedom of expression would not extend beyond the existing justified limits.
Existing appeal pathways in the Act would apply, with necessary modifications, when challenging decisions relating to take-down notices. Review and appeal processes set out in regulations would apply to decisions relating to the blocking of websites, online applications, or similar by the electronic system. Decisions relating to the blocking of websites can also be challenged through judicial review.