
Big tech companies were open to internet safety regulations – why did the New Zealand government abandon the idea? — Education HQ

The ‘Safer Online Services and Media Platforms’ project, led by the Department of Internal Affairs, aimed to develop a new framework for regulating what can be published on online platforms and other forms of media (e.g. news media) in New Zealand.

It addressed the sharing of harmful content online, such as child sexual abuse, age-inappropriate material, bullying and harassment, promotion of self-harm, etc. It also aimed to improve the regulation of online services and media platforms generally.

Announcing the project’s suspension in May, Internal Affairs Minister Brooke van Velden argued that illegal content was already subject to scrutiny and that the concepts of “harm” and “emotional well-being” were subjective and open to interpretation. She also framed it as a matter of freedom of speech.

“The principle of freedom of speech is important to this coalition government and is an important factor to consider in the digital world. On this basis, the Department will not continue work to regulate online content.”

But when we looked at tech and social media companies’ submissions on the proposed framework, we found that companies like Facebook, Reddit, and X (formerly Twitter) generally supported regulation—within limits.

Regulation of online media

The “Safer Online Services and Media Platforms” project had been running since 2021. Last year, the department invited submissions from the public on the proposed framework.

The proposed legislation would create a new, more streamlined model for industry regulation. It proposed codes of conduct governed by an independent regulator to control online harm and protect public safety. Safety standards would apply to online and other media platforms.

At least ten different government organizations currently bear some responsibility for managing online services and responding to harmful content, with their remits often overlapping. Other areas are barely regulated at all: New Zealand law, for example, does not require social media companies to adhere to safety standards.

Other countries are also grappling with how to regulate harmful digital content, online services and media platforms. Ireland, Canada, the UK and Australia have each developed their own laws regulating the online space.

Outdated regulations

We analyzed submissions from some of the dominant companies in the technology sector, including Google (including YouTube), Meta, Snap, Reddit, TikTok and X Corp. Our goal was to look at what these companies had to say about regulations that would directly impact their core businesses.

All agreed that the current system is outdated and needs to be modernized. Google argued, for example:

“Content regulation was designed for a different era of technology, focusing on media such as radio and television broadcasting. It is therefore appropriate that the regulatory framework is updated to be fit for purpose to reflect both technological and social changes.”

These companies have already introduced their own safeguarding policies and signed up to the Aotearoa New Zealand voluntary code of conduct on online safety and harm.

Importantly, none of the companies claimed that its self-regulation efforts were sufficient.

According to the companies’ submissions, the preferred option was a code focused on objectives rather than rigid rules that would be too prescriptive. The submissions stressed that any new code must be “proportionate” in its implementation and enforcement.

Snap stated that:

“…internet regulation is most effective when it is based on general principles that businesses of all sizes can follow and implement proportionally.”

Proportionality is typically a legal test used to decide whether a right, such as freedom of speech, can be limited in the interest of another public matter. However, only Meta and X Corp mentioned freedom of speech protections in their submissions.

Most submissions stated that they would trust an independent regulator to develop one overarching code, with the caveat that the regulator must be truly independent of all industry players as well as the government of the day.

Reddit stated:

“…we are also concerned about the proposal for the industry to develop codes of practice, rather than for the government or the relevant regulatory agency.”

The submissions also highlighted the need for consultation with industry representatives throughout the design process.

A missed opportunity

In their submissions on the proposed regulatory framework, the companies differed on how the codes should be designed, whether they should cover legal-but-harmful content, who should bear the burden of implementation, and what the penalties should look like.

It is worth noting, however, that all of them were in favor of a thorough overhaul of the regulations.

The decision to abandon this legal framework is a missed opportunity to protect future generations from some of the dangers posed by online media.


This article is republished from The Conversation under a Creative Commons license. Read the original article here.