21 March 2022

Disinfo

Alongside a commitment - so heartfelt - to introduce disinfo legislation in the second half of this year, building on the 'Australian Code of Practice on Disinformation and Misinformation', the Communications Minister has released the ACMA's June 2021 report, A report to government on the adequacy of digital platforms' disinformation and news quality measures.

Unsurprisingly, there is no engagement with disinformation/misinfo spread by members of the Government, such as Craig Kelly.

The report states:

In December 2019, as part of its response to the Australian Competition and Consumer Commission's Digital Platforms Inquiry, the Australian Government requested that digital platforms in Australia develop a voluntary code of practice to address online disinformation and news quality. The Australian Code of Practice on Disinformation and Misinformation (the code) was launched by industry association Digital Industry Group Inc (DIGI) on 22 February 2021. The code has since been adopted by 8 digital platforms – Google, Facebook, Microsoft, Twitter, TikTok, Redbubble, Apple and Adobe.

The ACMA was tasked with overseeing the development of the code and reporting to the government on the adequacy of platform measures and the broader impacts of disinformation in Australia. Our report provides new consumer research on users’ experience of disinformation and misinformation on digital platforms and our assessment of the industry’s code. It also provides a range of findings and a number of recommendations for consideration by the government. 

The online propagation of disinformation and misinformation presents an increasing threat to Australians 

Over the previous 18 months, we have seen increasing concern within the community over the ‘infodemic’ of online disinformation and misinformation, particularly in relation to the real-world impacts of COVID-19. The propagation of these falsehoods and conspiracies undermines public health efforts, causes harm to individuals, businesses and democratic institutions, and in some cases, incites individuals to carry out acts of violence. 

To understand the scale and impacts of this issue in Australia, we undertook a mixed-methods study focused on COVID-19 misinformation. Key insights include:

> Most adult Australians (82%) report having experienced misinformation about COVID-19 over the past 18 months. Of these, 22% report experiencing ‘a lot’ or ‘a great deal’ of misinformation online.

> Belief in COVID-19 falsehoods or unproven claims appears to be related to high exposure to online misinformation and a lack of trust in news outlets or authoritative sources. Younger Australians are most at risk from misinformation; however, there is also evidence of susceptibility among other vulnerable groups in Australian society.

> Australians are most likely to see misinformation on larger digital platforms, like Facebook and Twitter. However, smaller private messaging apps and alternative social media services are also increasingly used to spread misinformation or conspiracies due to their less restrictive content moderation policies. 

> Misinformation typically spreads via highly emotive and engaging posts within small online conspiracy groups. These narratives are then amplified by international influencers, local public figures, and by coverage in the media. There is also some evidence of inorganic engagement and amplification, suggesting the presence of disinformation campaigns targeting Australians. 

> Many Australians are aware of platform measures to remove or label offending content but remain sceptical of platform motives and moderation decisions. There is widespread belief that addressing misinformation requires all parties – individuals, platforms and governments – to take greater responsibility to improve the online information environment and reduce potential harms. 

Digital platforms have introduced a range of measures in response to the growth of disinformation and misinformation on their services

In response largely to global concerns, digital platforms have introduced measures typically based on company-wide policies, including:

> supporting third-party fact-checking organisations 

> proactively updating their policies to specifically address unique events, such as the COVID-19 pandemic and the 2020 US presidential election 

> investing in means to signal credible, relevant and authentic information 

> providing financial assistance and grants to news outlets, government and not-for-profit organisations to bolster the spread of credible information and news

> increasing detection, monitoring and enforcement action against groups and networks who use their services to spread disinformation and misinformation.

Despite platforms’ mostly global approach to updating policies and implementing other actions, many measures have had an impact on Australian users.

> In 2020, Facebook removed more than 110,000 pieces of COVID-related misinformation generated by Australian accounts. 

> Between July and December 2020, Twitter removed 50 pieces of content authored by Australian accounts for contravening its COVID-19 misleading information policy. 

> In 2020, Google blocked 101 million advertisements globally for contravening its misrepresentation policies. 

> TikTok’s COVID-19 Information Hub was visited by over 292,000 Australians between November 2020 and March 2021. 

The above data shows that platforms are taking proactive steps to tackle disinformation and misinformation on their products and services. The introduction of an Australian industry code builds on these efforts to codify actions, improve transparency, enhance consumer protections, and implement mechanisms to monitor their effectiveness. It also provides a framework to promote stakeholder collaboration and incentivise further action by platforms in response to a rapidly evolving online environment.

Digital platforms have come together to develop a single outcomes-based code of practice with several important features 

It is extremely positive to see industry, steered by DIGI, come together to develop a single code of practice. A single code should promote a consistent approach by platforms and provide confidence in industry to manage the range of harms associated with disinformation and misinformation. 

DIGI ran a meaningful public consultation process in developing its draft code, which attracted a variety of submissions that clearly influenced subsequent changes. In particular, the scope of the code was expanded to cover misinformation as well as disinformation, a key piece of stakeholder feedback during the consultation process. The ACMA considers this an improvement on the EU Code of Practice on Disinformation.

The code adopts an outcomes-based regulatory approach that allows a range of platforms with different services and business models to sign up to the single code. Signatories are required to sign up to the objective of ‘providing safeguards against harms that may arise from disinformation and misinformation’ and may opt in to other code objectives, such as disrupting advertising incentives and supporting strategic research. The code also provides signatories flexibility to implement measures to counter disinformation and misinformation in proportion to the risk of potential harm. Signatories must also report annually on the range of measures they will implement to achieve the objectives and outcomes.

Importantly, the code also stresses the need to balance interventions against the protection of users’ freedom of expression, privacy, and other rights.

Our assessment identifies further improvements that should be made to the code’s scope and the clarity of commitments 

The ACMA has assessed the code to consider whether it has met the expectations set out by the government and has identified a range of improvements. 

In our view, the scope of the code is limited by its definitions. In particular, a threshold of both ‘serious’ and ‘imminent’ harm must be reached before action is required under the code. The effect of this is that signatories could comply with the code without having to take any action on the type of information which can, over time, contribute to a range of chronic harms, such as reductions in community cohesion and a lessening of trust in public institutions.

The code should also be strengthened through an opt-out rather than opt-in model. Signatories should only be permitted to opt out of outcomes where that outcome is not relevant to their service and be required to provide justification for the decision. 

The code is also limited in the types of services and products it covers. Private messaging is excluded, despite increasing concern about the propagation of disinformation and misinformation through these services, particularly when used to broadcast to large groups. Including messaging services within the code, with appropriate caveats to protect user privacy (including the content of private messages), would provide important consumer protections. 

We also consider improvements to the code should be made in relation to:

> its application to news aggregation services

> the treatment of professional news content and paid and sponsored content

> the weight given to news quality as a key aspect of the government’s request to industry.

The ACMA is also concerned that the code does not place an obligation on individual signatories to have robust internal complaints processes. This was an area of particular concern identified in the Digital Platforms Inquiry. 

The code includes commitments to establish administrative functions within 6 months of code commencement. As code administrator, DIGI will establish a compliance sub-committee, a detailed reporting guideline and a facility to address signatory non-compliance. However, these functions remained under development at the time of finalising this report, and as a result the ACMA has not been able to assess their effectiveness. DIGI and code signatories should consider changes to address the matters identified by the ACMA when the code is reviewed in February 2022.

A clear and transparent measurement framework is critical to the effectiveness of a voluntary, outcomes-based regulatory model

Signatories were required to nominate their code commitments and deliver an initial report under the code, providing information and data on the measures they have adopted.

Signatories’ reports provide a large range of information on the actions they have taken to address disinformation, misinformation and news quality, and their investments in collaborative initiatives. 

However, reports are heavily focused on platform outputs and lack systematic data or key performance indicators (KPIs) that would establish a baseline and enable the tracking of platform and industry performance against code outcomes over time. Reports also show inconsistencies in the interpretations of key code terms and in reporting formats. 

Platforms should move quickly to identify KPIs specific to their services and work together to establish industry-wide KPIs to demonstrate the effectiveness of the code as an industry-wide initiative. 

The ACMA recommends a number of actions by government to bolster industry self-regulatory arrangements 

The ACMA considers that it is still too early to draw concrete conclusions on the overall impact or effectiveness of the code. The code administration framework – including a detailed reporting guideline and mechanism to handle complaints – is not due for completion until late August 2021. The design and implementation of these elements will be key to the overall effectiveness of the code. 

Given these circumstances, continued monitoring is required and the ACMA recommends it provide government with another report on the code by the end of the 2022–23 financial year. This will provide sufficient time to assess the operation of the code administration framework and the impact of any changes arising from the February 2022 review of the code. To inform this report, the ACMA recommends it continue to undertake focused research on these issues.

Initial signatory reports identify challenges in obtaining relevant data on platform actions in Australia. Providing the ACMA with formal information-gathering powers (including powers to make record-keeping rules) would incentivise greater platform transparency and improve access to Australia-specific data on the effectiveness of measures to address disinformation and misinformation. Information collected could also be used to identify systemic issues across the digital platform industry and inform future ACMA research. 

More formal regulatory options could be considered, particularly for platforms that choose not to participate in the code or reject the emerging consensus on the need to address disinformation and misinformation. The ACMA recommends that government provide the ACMA with reserve regulatory powers in relation to digital platforms – such as code registration powers and the ability to set standards. This would provide the government with the option to act quickly to address potential harms if platform responses are not adequate or timely.

There are also opportunities for improved collaboration between government agencies, platforms, researchers and non-government organisations on issues relating to disinformation and misinformation. The ACMA recommends that the government consider establishing a Misinformation and Disinformation Action Group to provide a mechanism to support future information sharing, cooperation and collaboration.

The ACMA makes 5 recommendations to the government in its report. 

Recommendation 1: The government should encourage DIGI to consider the findings in this report when reviewing the code in February 2022. 

Recommendation 2: The ACMA will continue to oversee the operation of the code and should report to government on its effectiveness no later than the end of the 2022–23 financial year. The ACMA should also continue to undertake relevant research to inform government on the state of disinformation and misinformation in Australia.

Recommendation 3: To incentivise greater transparency, the ACMA should be provided with formal information-gathering powers (including powers to make record-keeping rules) to oversee digital platforms, including the ability to request Australia-specific data on the effectiveness of measures to address disinformation and misinformation.

Recommendation 4: The government should provide the ACMA with reserve powers to register industry codes, enforce industry code compliance, and make standards relating to the activities of digital platform corporations. These powers would provide a mechanism for further intervention if code administration arrangements prove inadequate, or the voluntary industry code fails.

Recommendation 5: In addition to existing monitoring capabilities, the government should consider establishing a Misinformation and Disinformation Action Group to support collaboration and information-sharing between digital platforms, government agencies, researchers and NGOs on issues relating to disinformation and misinformation.