
Big tech companies were open to internet safety regulations – why did the New Zealand government abandon the idea?

Article – The Conversation

The government says the proposed online safety framework threatens free speech. But some of the world’s largest technology companies have said they are not opposed to some form of regulation.

The coalition government has abandoned efforts to modernize New Zealand’s outdated online safety laws, despite some support for the changes from social media and tech giants.

The Safer Online Services and Media Platforms project, led by the Department of Internal Affairs, aimed to develop a new framework for what can be published on online platforms and other forms of media (such as news) in New Zealand.

It addressed the sharing of harmful content online, such as child sexual exploitation, age-inappropriate material, bullying and harassment, and the promotion of self-harm. It also aimed to improve the regulation of online services and media platforms more generally.

Announcing the halt to the project in May, Internal Affairs Minister Brooke van Velden argued that illegal content was already subject to scrutiny, that the concepts of “harm” and “emotional well-being” were subjective and open to interpretation, and that the issue was a matter of free speech:

The principle of free speech is important to this coalition government and is an important consideration in the digital world. On this basis, the Department will not proceed with the regulation of online content.

But when we looked at tech and social media companies’ submissions on the proposed framework, we found that companies like Facebook, Reddit, and X (formerly Twitter) generally supported regulation, within limits.

Freedom of speech is important: Internal Affairs Minister Brooke van Velden. Getty Images

Regulation of online media

The Safer Online Services and Media Platforms project had been running since 2021. Last year, the Department of Internal Affairs invited public submissions on the proposed framework.

The proposed regulations would have created a new, more streamlined model for regulating the industry: codes of practice, overseen by an independent regulator, to control online harm and protect public safety. Safety standards would have applied to online platforms and other media.

Currently, at least ten different government organizations have some level of responsibility for managing online services and responding to harmful content, and their remits often overlap. Some areas are barely regulated at all. Social media companies, for example, are not required under New Zealand law to meet safety standards.

Other countries are also considering how to regulate harmful digital content, online services and media platforms. Ireland, Canada, the United Kingdom and Australia have all developed legislation to regulate online spaces.

Outdated regulations

We analyzed submissions from some of the dominant companies in the technology sector, including Google (which owns YouTube), Meta, Snap, Reddit, TikTok and X Corp. Our goal was to examine what these companies had to say about regulations that would directly affect their core businesses.

All of them agreed that the current system is outdated and needs modernization. Google, for example, argued:

Content regulation was developed for a different era of technology, focusing on media such as radio and television broadcasting. It is therefore appropriate that the regulatory framework is updated to be fit for purpose to reflect both technological and social changes.

These companies have already introduced their own protection policies and have signed up to the voluntary Aotearoa New Zealand Code of Practice for Online Safety and Harms.

Importantly, none of the companies claimed that its self-regulation efforts were sufficient.

According to the companies’ submissions, the preferred option was a code focused on objectives rather than rigid, overly prescriptive rules. The submissions stressed that the new system must be “proportionate” in its implementation and enforcement.

Snap stated:

Internet regulation is most effective when it is based on general principles that businesses of all sizes can follow and implement proportionately.

Proportionality is a common legal test used to decide whether a right, such as freedom of speech, can be limited in the interests of another public good. Notably, only Meta and X Corp mentioned protecting free speech in their submissions.

Most submissions favoured tasking an independent regulator with developing a single, overarching code, with the caveat that the regulator must be completely independent of all industry players as well as the government of the day.

Reddit stated:

We are also concerned about the proposal that codes of practice should be developed by industry rather than by government or an appropriate regulatory agency.

The submissions also indicated that consultation with industry stakeholders would be necessary throughout the design process.

A missed opportunity

In their submissions on the proposed regulatory framework, each company set out its views on how the codes should be designed, whether a regulatory code should cover legal but harmful content, who should bear the burden of implementation, and what the penalties should be.

It is worth noting, however, that all of them supported regulatory change.

The decision to abandon this framework represents a missed opportunity to protect future generations from some of the harms of online media.

Fiona Sing, Research Fellow in Population Health, University of Auckland, Waipapa Taumata Rau, and Antonia Lyons, Professor of Addiction Research, University of Auckland, Waipapa Taumata Rau

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Content sourced from Scoop.co.nz