Three key objectives in the ACMA position paper are:
- reduce the impact of harmful misinformation
- empower people to better judge the quality of news and information
- enhance the transparency and accountability of platforms’ practices.
According to the University of Canberra’s Digital News Report: Australia 2020, 48 per cent of Australians rely on online news or social media as their main source of news. But 64 per cent of Australians are concerned about what is real or fake on the internet.
“That should rightly be of immense community concern. False and misleading news and information online has the potential to cause serious harm to individuals, communities and society,” ACMA Chair Nerida O’Loughlin said.
“In developing this new code, digital platforms will need to balance the need to limit the spread and impact of harmful material on the internet while protecting Australians’ important rights to freedom of speech.
“Digital platforms should not be the arbiters of truth for online information. But they do have a responsibility to tackle misinformation disseminated on their platforms and to assist people to make sound decisions about the credibility of news and information.
“We know that major platforms have stepped up their processes during the COVID-19 pandemic due to the prevalence of information potentially harmful to health and property.
“It’s now time for digital platforms to codify and commit to permanent actions that are systematic, transparent, certain and accountable for their users in addressing such potentially harmful material.”

The ACMA is to oversee the platforms’ code development process and report to Government by June 2021. The ACMA anticipates the digital platforms will work together, including undertaking public consultation, to develop and have in place a single, industry-wide code by December 2020.
The position paper states:
Australians rely upon a range of indicators to assess the quality of their news and information, including the source or outlet of a news piece. On digital platforms, the widespread use of algorithms, the proliferation of sources and the dissociation of content from its source can make it challenging to assess quality and make informed decisions about which news and information to read and trust. Difficulty in discerning the quality of news and information can lead to the increased spread of harmful misinformation. This includes disinformation — false and misleading information distributed by malicious actors with the intent to cause harm to individual users and the broader community.
International regulatory approaches to date have largely focused on countering deliberate disinformation campaigns. Disinformation campaigns can engage ordinary users to inadvertently propagate misleading information. However, misleading information shared without intent to cause harm can still lead to significant harm. From the consumer perspective, all forms of false, misleading or deceptive information can have potentially harmful effects on users and the broader community.
This paper uses ‘misinformation’ as an umbrella term to cover all kinds of potentially harmful false, misleading or deceptive information, with deliberate disinformation campaigns considered a subset of misinformation.
The government has been considering responses appropriate for Australian users.
These concepts were canvassed as part of the Australian Competition and Consumer Commission (ACCC) Digital Platforms Inquiry (DPI). The ACCC recommended a mandatory code to address complaints about disinformation (Recommendation 15) and an oversight role for a regulator to monitor issues of misinformation and the quality of news and information (Recommendation 14).
In response to that inquiry, the government has asked major digital platforms to develop a voluntary code to cover both recommendations. The government’s response recognises that addressing the complex problem of misinformation requires a comprehensive and principled approach. Any such approach should balance interventions with the rights to freedom of speech and expression.
Australians are increasingly reliant on digital platforms to access, consume and share news and information. The ACMA considers that platforms bear considerable responsibility to provide users with a safe and user-friendly environment to engage with news and information and help users more easily discern the quality of this content.
The 2019–20 Australian bushfire season and the COVID-19 pandemic have reinforced the potential harms of false and misleading information.

The first half of 2020 has been marked for many Australians by two extraordinary events: the unprecedented summer bushfire season and the COVID-19 pandemic. Both events have provided fertile circumstances for the spread of false and misleading information, distributed with and without malicious intent.
The bushfires saw instances of false and misleading information about the cause of the fires, the use of old images purporting to be of current events, and conspiracy theories such as the fires having been purposely lit to make way for a Sydney to Melbourne train line. False and misleading information about the pandemic—such as how to prevent exposure, possible treatments, and the origins of the virus—has been shown to have real-world consequences, including personal illness and damage to property.
Recent Australian research found that nearly two-thirds (66 per cent) of people say they have encountered misinformation about COVID-19 on social media. The World Health Organization has labelled the crisis an ‘infodemic’, and platforms have implemented new measures to limit the spread of misinformation.
Both these events have highlighted the impact and potential harm of misinformation on both Australian users of digital platforms and the broader Australian community.

Voluntary codes should build on existing measures as part of a risk-based approach to harmful misinformation.
In recent years, most major platforms have implemented a range of measures and processes to address potentially harmful misinformation and news quality issues. This work has intensified during the COVID-19 pandemic, with platforms taking further steps to address potential harms, including:
- Greater signalling of credible, relevant and authentic information through new features and tools.
- Increased detection and monitoring of fake accounts, bots and trolls that engage in malicious and inauthentic activity with vulnerable users.
- Updating terms of service and community guidelines to allow for action to be taken against false and misleading news and information in relation to health and safety issues, where the scale and immediacy of potential harm are paramount.
In developing a voluntary code, the ACMA considers that platforms should codify their activities and commit to permanent actions that are systematic, transparent, certain and accountable for their users in addressing such potentially harmful misinformation.
A voluntary code needs to be fit for purpose for Australian users and the Australian community. Given the recent evidence of significant harm caused by false and misleading information shared online, and the practical difficulty of determining which information has been circulated with intent to harm, the ACMA considers platforms should implement measures to address all kinds of harmful misinformation circulating on their services. These measures should be graduated and proportionate to the risk of harm.
Adopting a graduated and flexible approach means platforms would also be free to draw the lines between different interventions in accordance with their own policies and to achieve an appropriate balance with rights to freedom of speech and expression.
The ACMA has outlined its expectations to guide code development
This paper includes a series of positions that outline the ACMA’s expectations on the development of the code. These positions cover threshold issues about the scope, design, and administration of the code, and are intended to assist platforms in the development of their code(s). These positions have been informed by existing international regulatory approaches, preliminary discussions with platforms and an examination of best-practice guidelines.
The ACMA considers that the code should cover misinformation across all types of news and information (including advertising and sponsored content) that:
- is of a public or semi-public nature
- is shared or distributed via a digital platform
- has the potential to cause harm to an individual, social group or the broader community.
To enable a consistent experience for Australians who use multiple platforms, the ACMA considers a single industry code would be the preferable approach. Any code should be consumer-centric, including providing a mechanism for users to easily access dispute resolution mechanisms.
As a voluntary code, it will be a matter for individual platforms to decide on whether they participate in the development of the code or choose to be bound by the code. The ACMA would, however, strongly encourage all digital platforms with a presence in Australia, regardless of their size, to sign up to an industry-wide code to demonstrate their commitment to addressing misinformation.
At a minimum, the code should apply to the full range of digital platforms that were outlined in the DPI terms of reference. This includes online search engines, social media platforms and other digital content aggregation services with at least one million monthly active users in Australia.
The ACMA considers that this will likely include widely used platforms such as Facebook, YouTube, Twitter, Google Search and Google News, Instagram, TikTok, LinkedIn, Apple News and Snapchat. The ACMA anticipates that code signatories will change over time to adjust to new entrants and other market changes.
The ACMA has developed a code model, using an outcomes-based approach, to assist platforms in composing their codes.

In developing a code, the ACMA considers that platforms should adopt an outcomes-based approach. This would provide signatories with a common set of aims while granting the flexibility to implement measures that are most suited to their business models and technologies. The ACMA has developed the code model below, which articulates potential objectives and outcomes for the code.