July 16-22  2023: Issue 590

New ACMA Powers To Combat Misinformation + Disinformation - Public Feedback Invited: Closes August 6 2023

The Australian Government's Department of Infrastructure, Transport, Regional Development, Communications and the Arts is seeking your feedback on an exposure draft of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023.

Misinformation and disinformation pose a threat to the safety and wellbeing of Australians, as well as to our democracy, society and economy.

In January 2023, the Minister for Communications announced that the Australian Government would introduce new laws to provide the independent regulator, the Australian Communications and Media Authority (ACMA), with new powers to combat online misinformation and disinformation.

The new powers will enable the ACMA to monitor efforts and require digital platforms to do more, placing Australia at the forefront in tackling harmful online misinformation and disinformation, while balancing freedom of speech.

The proposed powers would:

  • enable the ACMA to gather information from digital platform providers, or require them to keep certain records about matters regarding misinformation and disinformation
  • enable the ACMA to request industry develop a code of practice covering measures to combat misinformation and disinformation on digital platforms, which the ACMA could register and enforce
  • allow the ACMA to create and enforce an industry standard (a stronger form of regulation), should a code of practice be deemed ineffective in combatting misinformation and disinformation on digital platforms.

The ACMA will not have the power to request specific content or posts be removed from digital platform services.

The ACMA powers will strengthen and support the existing voluntary framework established by the Australian Code of Practice on Disinformation and Misinformation (the voluntary code), and will extend to non-signatories of the voluntary code.

These powers are consistent with the key recommendations in the ACMA's June 2021 Report to government on the adequacy of digital platforms' disinformation and news quality measures.

The Government wants to hear your feedback on the proposed legislation. It's easy to have your say—simply read the exposure draft Bill and the supporting documents and:

  • make a submission on the department's website by clicking the 'Have your say' button, or
  • email a submission in writing, as a video, or as an audio recording to information.integrity@infrastructure.gov.au.

Relevant documentation

The Guidance Note for the exposure draft Bill provides an explanation of the key parts of the Bill. For a short explanation of some of the Bill's key elements, please see the fact sheet.

Submissions close on 6 August 2023. 

All submissions (other than private submissions) will be published on the website. Comments will not be published but will inform the review process in the same manner as submissions.

More stick, less carrot: Australia’s new approach to tackling fake news on digital platforms

Andrea Carson, La Trobe University

An urgent problem for governments around the world in the digital age is how to tackle the harms caused by mis- and disinformation, and Australia is no exception.

Together, mis- and disinformation fall under the umbrella term of “fake news”. While this phenomenon isn’t new, the internet makes its rapid, vast spread unprecedented.

It’s a tricky problem and hard to police because of the sheer amount of misinformation online. But, left unchecked, public health and safety, electoral integrity, social cohesion and ultimately democracy are at risk. The COVID-19 pandemic taught us not to be complacent, as fake news about COVID treatments led to deadly consequences.

But what’s the best way to manage fake news spread? How can it be done without government overreach, which risks the freedom and diversity of expression necessary for deliberation in healthy democracies?

Last month, Minister for Communications Michelle Rowland released an exposure draft bill to step up Australia's fight against harmful online mis- and disinformation.

It offers more stick (hefty penalties) and less carrot (voluntary participation) than the current approach to managing online content.

If passed, the bill will see Australia shift from a voluntary to a mandatory co-regulatory model.

Following An EU Model

According to the draft, disinformation is spread intentionally, while misinformation is not.

But both can cause serious harms including hate speech, financial harm and disruption of public order, according to the Australian Communications and Media Authority (ACMA).

To date, research shows countries tend to approach this problem in three distinct ways:

  • non-regulatory “supporting activities” such as digital literacy campaigns and fact-checking units to debunk falsehoods

  • voluntary or mandatory co-regulatory measures involving digital platforms and existing media authorities

  • anti-fake news laws.

The Albanese government’s draft bill will bring us closer to the European Union-style model of mandatory co-regulation.

Platforms Remain Responsible, Not Government

Initial opinions about the bill are divided. Some commentators have called the proposed changes “censorship”, arguing it will have a chilling effect on free speech.

These comments are often unhelpful because they conflate co-regulation with more draconian measures such as the anti-fake news laws adopted in illiberal states like Russia, whereby governments arbitrarily rule what information is "fake".

For example, Russia amended its Criminal Code in 2022 to make the spread of “fake” information an offence punishable with jail terms of up to 15 years, to suppress the media and political dissent about its war in Ukraine.

To be clear, under the proposed Australian bill, platforms continue to be responsible for the content on their services – not governments.

The new powers allow ACMA to look under the platform’s hood to see how they deal with online mis- and disinformation that can cause serious harm, and to request changes to processes (not content). ACMA can set industry standards as a last resort.

The proposed changes don’t give ACMA arbitrary powers to determine what content is true or false, nor can it direct specific posts to be removed. Content of private messages, authorised electoral communications, parody and satire, and news media all remain outside the scope of the proposed changes.

None of this is new. Since 2021, Australia has had a voluntary Code of Practice on Disinformation and Misinformation, developed for digital platforms by their industry association (known as DIGI).

This followed government recommendations arising out of a lengthy Australian Competition and Consumer Commission (ACCC) inquiry into digital platforms. This first effort at online regulation was a good start to stem harmful content using an opt-in model.

But voluntary codes have shortfalls. The most obvious is that not all platforms choose to participate, and some cherry-pick which areas of the code they will respond to.

The Proposed Changes

The Australian government is now seeking to deliver on a bipartisan promise to strengthen the regulator's powers to tackle online mis- and disinformation by shifting to a mandatory co-regulatory model.

Under the proposed changes, ACMA will be given new information gathering powers and capacity to formally request an industry association (such as DIGI) vary or replace codes that aren’t up to scratch.

Platform compliance with registered codes will be compulsory, and noncompliance will attract warnings, fines and, if unresolved, hefty court-approved penalties.

These penalties are steep – as much as 5% of a platform’s annual global turnover if repeatedly in breach of industry standards.

The move from voluntary to mandatory regulation in Australia is logical given the EU has set the foundation for other countries to hold digital technology companies responsible for curbing mis- and disinformation on their platforms.

Questions Remain

But the draft bill raises important questions to address before it’s legislated as planned for later this year. Among them are:

  • how to best define mis- and disinformation? (at present the definitions are different to DIGI’s)

  • how to deal with the interrelationship between mis- and disinformation, especially regarding election content? There’s a potential issue because research shows the same content labelled “disinformation” can also be labelled “misinformation” depending on the online user’s motive, which can be hard to divine

  • and why exclude online news media content? Research has shown news media can also be a source of harmful misinformation (such as 2019 election stories about the “Death Tax”).

While aiming to mitigate harmful mis- and disinformation is noble, how it will work in practice remains to be seen.

An important guard against unintended consequences is to ensure ACMA’s powers are carefully defined along with terms and likely circumstances requiring action, with mechanisms for appeal.

Public submissions close August 6.

Andrea Carson, Professor of Political Communication, Department of Politics, Media and Philosophy, La Trobe University

This article is republished from The Conversation under a Creative Commons license. Read the original article.