Well done to the people who worked hard behind the scenes to get safe harbour provisions included in the Harmful Digital Communications (HDC) Bill. Well done to the government for including this positive step forward.
But, as things stand, I can almost guarantee that people will abuse the provisions. We are heading for a new environment of mass censorship. It might even make the approved agency redundant.
“The purpose of this bill is to mitigate the harm caused to individuals by digital communications and to provide victims of harmful digital communications with a quick and efficient means of redress.” I intend to do a few blog posts looking at the Bill from different perspectives. This is the first of those.
“The purpose of this [safe harbour provision] is to ensure that a content host cannot be held liable for content they host that is posted by another person, but which the host does not know about. This bill contains a safe harbour provision stating that a content host is not liable for content they host, unless the content host has received a notice of complaint about the content and fails to take reasonable steps to remove it.”
A conditional safe harbour for internet intermediaries like online content hosts is a great idea. They should not be held liable for content not authored by them or beyond their control. This shielding from liability comes with responsibility to provide an easy way for people to notify them and then take reasonable steps as soon as practicable to remove access to that content.
Safe harbours - the hosts’ perspective
Clearly, online content hosts will want to make sure they are always protected by the safe harbour provisions.
What are the requirements of a valid notice? Clause 20(3) specifies them as the complainant’s name, specific location details of the content, and why the content is considered unlawful, harmful, or otherwise objectionable.
As soon as they get a notice apparently complying with the requirements, online content hosts will remove access to that content very quickly. There is great risk, and no upside, for a host in questioning, delaying, or challenging a notice.
At the same time, there are no grounds in the HDC Bill for a content host to reject a notice. It must act as soon as it receives a notice. So, a ‘better safe than sorry’ behaviour is logical.
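That incentive structure can be made concrete with a short sketch. This is purely hypothetical - the Bill prescribes no implementation, and all field names and functions below are invented for illustration. It shows how a risk-averse host might process notices: check only that the three clause 20(3) items are present, then remove the content, no questions asked.

```python
# Hypothetical sketch of a risk-averse host's notice handler.
# The HDC Bill prescribes no implementation; field names and logic
# here are invented for illustration only.

REQUIRED_FIELDS = {"complainant_name", "content_location", "reason"}


def remove_content(location: str) -> None:
    """Stand-in for whatever the host does to pull content offline."""
    print(f"Access to {location} removed")


def handle_notice(notice: dict) -> str:
    """Accept any notice that merely names the three clause 20(3) items.

    Note what is *not* checked: the complainant's identity, whether the
    stated reason has any merit, or whether the content is actually
    harmful. Questioning a notice risks losing the safe harbour;
    removing the content costs the host nothing.
    """
    if not REQUIRED_FIELDS <= notice.keys():
        return "invalid: missing required field(s)"
    # 'Better safe than sorry': remove first, never ask questions.
    remove_content(notice["content_location"])
    return "removed"


result = handle_notice({
    "complainant_name": "Anyone At All",        # identity never verified
    "content_location": "example.com/post/123",
    "reason": "I don't like it",                # merit never assessed
})
```

The point of the sketch is the asymmetry: the only branch that rejects a notice is a missing field, because that is the only rejection the Bill's text clearly supports.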
See the problem?
Safe harbours - the complainants’ perspective
From the perspective of a complainant, sending a notice to an online content host will be easy. For genuine complaints, the notice mechanism provides a zero cost way to get harmful or otherwise objectionable content removed quickly.
Now see the problem?
Lessons from DMCA-style safe harbours
In the US, Title II of the Digital Millennium Copyright Act (DMCA) creates a safe harbour shielding internet intermediaries from copyright infringement liability, provided they meet specific requirements. Notably, the intermediaries are also shielded from liability to their own users should those users claim that the removed material is in fact not infringing.
Our own Copyright Act provides somewhat similar but weaker safe harbour provisions for internet intermediaries in relation to copyright infringement, under sections 92B and 92C.
In DMCA-style safe harbours, there is no cost for sending a notice alleging copyright infringement. There is virtually no penalty for getting it wrong. There are no limits on the number of notices.
Zero-cost notices, in terms of both money and adverse consequences, have resulted in errors, significant abuses, and false positives. In a submission to the TCF, Google said that over one-third (37%) of the notices it received were not valid copyright claims.
There are many reported abuses of DMCA takedown notices. For example, the recent forced removal of an interview with an anti-gay group, by the group itself. Earlier this year, a San Francisco television station used the DMCA to take down portions of its own broadcast, to spare itself the embarrassment of getting it wrong. And there is always politics - even US presidential campaigns are fair game for DMCA abusers. EFF’s Takedown Hall of Shame has many more examples.
Excluded above are DMCA abuses that aren’t highly relevant to the HDC Bill. These include the use of automated bots, huge volumes, and businesses using it for competitive purposes (although I still need to check if and how the bill restricts complainants to natural persons).
Abusing the HDC Bill safe harbours
My intention is not to bracket HDC complainants with copyright maximalists. Rather, it is to show that zero-cost notices (with no adverse consequences for getting them wrong), combined with the incentive for online content hosts to remove content hastily, will inevitably lead people to abuse the safe harbour provisions.
All the carefully crafted balances in the bill between harm and the Bill of Rights will be drowned by a torrent of content removal notices. Don’t like a blog post? Send a notice to the content host. Don’t think you look good in a photograph? Send a notice and get it removed. Don’t like the views of a person? You know the drill by now - send a notice and get the tweet or wall post or comment or whatever removed.
And, just because it’s possible, why not send a notice in somebody else’s name? The online content host has no way of verifying identity or even that the named person is in fact the person harmed by the specified content. In any case, the online content host is only interested in removing the content unquestioningly to maintain its liability shield.
This makes the “negotiation, mediation, and persuasion” function of the approved agency totally redundant. When there is a direct and faster way to get online content removed (the usual goal of a complainant), why bother going via the approved agency?
Drawing from the experience of DMCA-style safe harbours, it is conceivable that the approved agency will ask or be offered direct access to the online content hosts’ systems to remove allegedly offending content itself. That’s a whole new level of censorship power.
More work required
I am most definitely in favour of the safe harbour concept in the HDC Bill. It is good for complainants and good for online content hosts. In fact, I suggest going further and looking at additional shielding from liability for online content hosts such as from the author of the content taken down.
Not every DMCA-style safe harbour provision is required, but there should be a comprehensive review, with each provision explicitly included or excluded. Counter-notices, for example, deserve an in-depth look. They may or may not be a good idea (I see both pros and cons), but they should be evaluated and the reasoning for the final position detailed.
The main change required is to effectively prevent, or at least minimise, the abuse of the HDC Bill’s safe harbour provisions to create a new environment of mass censorship. That will not be easy but it is, in my opinion, necessary. I will be giving some thought to how to achieve this in future posts.
Hopefully, such reservations are not swept away by the public support for what is essentially a good Bill. It’s not about being a moron but trying to get the best possible laws for New Zealand.
Former State Services Commission strategy & innovation manager and InternetNZ chief executive Vikram Kumar is CEO of Mega. He posts at Internet Ganesha.