
22 February 2024

Fake News

'Tackling online false information in the United Kingdom: The Online Safety Act 2023 and its disconnection from free speech law and theory' by Peter Coe in (2024) Journal of Media Law comments  

In the UK, there has been consistent recognition from a variety of actors, including the UK government, that the dissemination of false information can be harmful to individuals and the public sphere. It has also been acknowledged that this problem is being exacerbated by the role played in our lives by the likes of Google, Facebook, Instagram, and X, and because the systems that were in place for dealing with this type of content (and other illegal and/or harmful content), prior to the introduction of the Online Safety Act 2023 (OSA), were designed for the offline world, and were (and in some cases, still are) outdated and no longer fit for purpose. 

The UK’s online harms regime has intensified this debate. The regime began life in April 2019 as the Online Harms White Paper, morphing into multiple iterations of the Online Safety Bill (OSB), published in its original form in May 2021, and finally crystallising as the OSA, which was enacted on the 26th of October 2023. On the one hand, it is acknowledged that legislation placing statutory responsibilities on internet services to prevent the publication of false information (and other illegal and harmful content) may benefit society and public discourse. This is because, in theory at least, by helping to decrease the volume of false information we are exposed to, such laws should reduce the opportunities for the public sphere to become distorted. As citizens we should be able to assess, with greater confidence, the veracity of information available to us, and in turn, use this information, and the trust we have in it, to make positive contributions to public discourse. 

But, on the other hand, the OSA has been (and before it, the OSB was) met with significant resistance from a variety of actors because of the potential threats to free speech that it presents.  Indeed, since the publication of the White Paper, and the initial draft of the OSB, the regime has been shrouded in controversy. The OSB was subject to numerous amendments, and at one stage, it looked as though it would be scrapped altogether. Yet despite this, at the time of writing, the OSA has recently been enacted, albeit the overall shape of the regime remains unclear, because much of the legal detail will be contained in secondary legislation. Therefore, debates on the efficacy of the OSA will continue, and only time will tell what its ultimate impact on free speech will be. 

Notwithstanding this uncertainty, the purpose of this article is to interrogate the regime’s compatibility with free speech law and theory. In doing so, it begins with an explanation of what is meant by false information, and how the phenomenon has been exacerbated by the internet. This is followed by analysis of the pre-OSA system for dealing with this content, and an explanation of why it did not work, as aspects of it have a bearing upon the OSA regime. Next, the contours of the free speech framework are sketched, including relevant jurisprudence of the European Court of Human Rights (ECtHR), and the theories underpinning it that are particularly relevant to online false information. In this section I explain why these theories are flawed in this context, and therefore how these flaws could justify the creation of laws to tackle online false information. Yet, as I go on to suggest in my analysis of the OSA, which follows, this creates a paradoxical disconnect between theory and law, in that although the flaws in the theories may justify the creation of such laws – which manifests as the OSA – its creation arguably conflicts with the ECtHR’s jurisprudence, and the spirit of its theoretical foundations, and could inadvertently interfere with free speech. Finally, the article concludes with some potential solutions for meeting this challenge that do not erode one of the core fundamental human rights.

30 March 2023

Censorship

The report of the Review of Australian classification regulation, provided to the government by former Communications Department Secretary Neville Stevens in May 2020, has now been released.


The Review's Terms of Reference were

Classification plays a crucial role in helping Australians make informed decisions about content they or those in their care watch, read and play. 
 
The current National Classification Scheme (the Scheme) exists to provide a framework by which films, video games and certain publications made available in Australia receive a rating and consumer advice that provides a safeguard to the Australian public that content is consumed by the appropriate audience. 
 
It is a joint scheme between the Commonwealth and the states and territories and was established in 1995. The Commonwealth Classification (Publications, Films and Computer Games) Act 1995 establishes the framework for classification of content, and state and territory classification legislation regulates the advertising, availability and sale of classifiable content. 
 
The Scheme applies to online and physical video games, films and episodic series on all platforms including in cinemas, on DVD and online (such as streaming services and subscription video on demand) but not to programs broadcast on television. Classification of television programs is regulated under separate codes of practice covering free to air broadcasters, subscription television broadcasters, the ABC and the SBS. In 2012, the Australian Law Reform Commission’s (ALRC) report ‘Classification – Content Regulation and Convergent Media’ found that classification legislation ‘does not deal adequately with the challenges of media convergence and the volume of media available to Australians’. The Convergence Review Committee’s report in 2012 endorsed the findings of the ALRC review. Consistent with these reviews, the Australian Competition and Consumer Commission’s (ACCC) Digital Platforms Inquiry final report recommended that ‘a new platform-neutral regulatory framework be developed,’ including ‘creating a nationally uniform classification scheme to classify or restrict access to content consistently across delivery formats’ (Recommendation 6). 
The ALRC review was conducted before the popularisation of online streaming and video on demand services and the significant increase in online and mobile games available in Australia. This review will build on the ALRC report in the context of today’s diverse media content market. Consistent with the agreement of the Council of Attorneys-General, a review of the National Classification Code, the Guidelines for the Classification of Films (Films Guidelines) and the Guidelines for the Classification of Computer Games (Computer Games Guidelines) will also be undertaken to ascertain whether they continue to reflect contemporary community standards. The National Classification Code and the Films Guidelines were last reviewed in 2002, and the Computer Games Guidelines were last reviewed prior to the introduction of the R 18+ category for games in 2013. 
Scope 
 
An independent expert will be appointed to conduct the review, supported by the Department of Communications and the Arts (now the Department of Infrastructure, Transport, Regional Development and Communications).   
 
The review will cover: 
1. Opportunities to harmonise the classification of, or restriction of access to, content across different delivery platforms including broadcasting services (commercial free to air, national broadcasting and subscription television), online stores and services, cinema releases, and physical product (e.g. boxed video games and DVDs). 
2. The design of a contemporary Australian classification framework, including: 
a. What content requires classification; 
b. Consistency of classification categories, standards and access restrictions across media formats; 
c. Classification decision-making processes, including mechanisms for review; and 
d. Governance arrangements, including the suitability of the current cooperative scheme. 
3. Opportunities to update classification decision-making standards, including a comprehensive review to update the National Classification Code, the Films Guidelines, and the Computer Games Guidelines. 
 
The following issues are out of scope: 
• Broader content regulation issues outlined in Recommendation 6 of the ACCC’s Digital Platforms Inquiry. Content regulation reform is a significant undertaking that needs to be broken down into interrelated processes. 
• Regulation of sexually explicit content online, which will be considered in possible reforms to the Online Content Scheme in Schedules 5 and 7 of the Broadcasting Services Act 1992.

The Report states

Australia’s classification system has existed since the early 1900s and has evolved over the decades. Departmental research consistently shows classification is wanted and valued by Australians. From an early focus on censorship, the system has shifted to providing information and guidance to help parents make decisions about the suitability of content for children of varying ages and to provide all consumers with information to make informed choices. 
There have been a number of reviews of classification arrangements including the Australian Law Reform Commission’s report in 2012 and the Australian Competition and Consumer Commission’s Digital Platforms Inquiry report in 2019. These reports highlighted deficiencies with current classification arrangements and recommended significant changes to take into account the increase in content available online and the convergence of media platforms. 
Areas of concern raised by these reviews and reinforced by submissions to this review include: 
• The high cost of the processes of the Classification Board (the Board), especially given the volume of content now requiring classification; 
• Timeframes to use the Board which are too long to be compatible with current media practices; 
• Separate regulatory systems and regulators for broadcast and for other content providers; 
• Lack of clarity on what content should require classification due to the very wide and outdated definitions in current legislation; 
• Lack of compliance with existing legislation among some content providers, including a number of video on demand providers and online games storefronts, partly as a result of the high cost and long timeframes of existing classification practices; 
• Governance arrangements between the Australian Government and the states and territories, which could better define roles and responsibilities of the various parties in an online environment, and which are not seen as sufficiently timely or flexible; and 
• Lack of a regular approach to updating classification guidelines to reflect contemporary community concerns and research into relevant matters, including child development issues. 
 
My analysis of these issues and my recommendations for change are informed by the need for a future classification regulatory framework that: 
1. Is able to adapt to new technologies, market developments and emerging issues of community concern; 
2. Provides clear, useful and easily accessible information to enable consumers to make informed media choices for both themselves and for their children; 
3. Has evidence-based classification guidelines that are regularly updated, taking into account both expert knowledge and Australian community standards; 
4. Enables classification arrangements that are efficient and cost-effective for industry, that are consistent across content platforms and which have the confidence of the community; 
5. Provides appropriate content restriction and enforcement for both physical and online content; and 
6. Enables timely decision-making on changes to the classification scheme. 
 
National Classification Code and standards 
 
Clause 1 of the National Classification Code and section 11 of the Classification (Publications, Films and Computer Games) Act 1995 contain a range of underpinning principles and matters to be taken into account in classification. Although formulated in 1995, many aspects of these overarching principles retain value, in particular the balancing of protecting children from harmful content while preserving the right of adults to “read, hear, see and play what they want.” However, other concepts and language contained in these provisions, which have roots in the history of classification, are in need of an update. Such amendments would reflect the evolution of classification from its historical origins in censorship and concerns for public morals to a more objective, harms-based system focussed on informing consumers (particularly parents) and protecting children. 
 
I recommend that key principles set out in the National Classification Code be updated to provide that: 
• Adults should be able to read, hear, see and play what they want, with limited exception; 
• Minors should be protected from content likely to harm or disturb them; and 
• Everyone should be protected from exposure to content of serious concern to the wellbeing of the community. 
 
Content to be classified  
 
There is a need to clarify what content should be classified, as current definitions in the Classification (Publications, Films and Computer Games) Act 1995 were designed for the content market of the 1990s and technically capture all streaming services and user-generated content uploaded to sites such as YouTube. 
 
The focus of classification should be on content that is most relevant and important to Australian consumers. I therefore recommend that the following three principles should be used to define content that should be classified: 
• Professionally produced – content with higher quality production values; 
• Distributed on a commercial basis – capturing organisations or individuals that distribute media content as part of their business, as opposed to individuals or community groups whose main purpose is not to distribute media content for commercial gain; and 
• Directed at an Australian audience – a selection of content is specifically made available for Australia or marketing is specifically directed at Australians. 
 
Narrowing the definition of ‘classifiable content’ will capture online video on demand providers and online games stores directed at Australian consumers but exclude user-generated content. Classification should continue to be the responsibility of the organisation that makes the content available first in Australia, regardless of who originally made the content. 
 
The eSafety Commissioner would continue to have responsibility for responding to online content that is illegal, including content that would be Refused Classification under the National Classification Scheme. 
 
As part of the classification of films, sexually explicit (X 18+) films in physical formats should continue to be classified. Sexually explicit content online is regulated by the Online Content Scheme which is currently being reviewed. 
 
Current classification exemptions for films, computer games and publications should be maintained.   
 
Processes to classify content 
 
A range of different classification processes currently exist under the National Classification Scheme and broadcasting laws. While some content providers submit content to the Board, others use classification tools, and others self-classify content. These varying processes mean that classification can be more expensive and time-consuming for some parts of industry than for others, and this uneven playing field can have an impact on compliance with classification laws. 
 
Classification decisions need to be consistent, accurate, accessible and easily understood by consumers. The community must have confidence that the right classification outcome is reached, regardless of the process that is used to achieve that classification. 
 
I recommend harmonising processes across platforms so that industry is given greater responsibility for undertaking classification, with the flexibility to choose the classification process that best suits them. These processes would be: 
• Self-classification by people trained and accredited by the regulator, who could be either in-house staff or third-party classifiers; or 
• Self-classification using classification tools approved by the Australian Government Minister; or 
• Submitting content to the regulator for classification. 
 
Many computer games online show Australian classifications using the International Age Rating Coalition (IARC) tool. However, Apple’s App Store uses its own international age-rating system where games are classified 4+, 9+, 12+ or 17+. The Apple App Store’s own system is working well – there are few complaints to the Department, and the Department’s research with the community indicates there is general consumer acceptance. I therefore recommend that the relevant Australian Government Minister should have the power to authorise the use of alternative classification systems for computer games where they provide the necessary classification information for the Australian community. 
 
The games storefront Steam, operated by the company Valve, does not display Australian classification information for all games and does not provide Australian consumers and parents with adequate information to help them make informed choices. This needs to change. If Valve does not participate in IARC in the near future, I recommend that the Department further discuss with Valve the implementation of a separate tool to generate Australian ratings for computer games sold to Australian consumers on Steam. 
 
Currently, the same content is required to be classified separately for release across different platforms and in different formats. To avoid this double handling, I recommend that once content has been classified, it should not need to be classified again, unless it is modified and the modification is likely to change the classification. However, content providers should be able to give additional consumer advice where necessary. 
 
The only exceptions to this would be to: 
• Allow content providers to reclassify content after 10 years to reflect changing community standards; and 
• Provide a limited provision for content providers to apply to the regulator for approval to reclassify where they consider the original classification category (e.g. G, PG, M, etc.) requires reassessment. 
 
Classification decisions should continue to be uploaded and published on the National Classification Database at www.classification.gov.au, and this database should also include content classified by the broadcasters. This will provide transparent information to Australian consumers and help content providers find the classification of content that has previously been classified. 
 
The review of classification decisions should be transferred from the Classification Review Board to the Australian Government regulator. In the infrequent cases where the regulator was the original decision-maker, alternative staff would review the decision to manage any conflict of interest issues. 
 
The community must have confidence that the move to greater industry self-classification will not undermine the integrity of the classification system. To continue high levels of community confidence in classification, industry self-classification must be underpinned by a robust accreditation, audit, review and timely complaints mechanism overseen by the Australian Government regulator. 
 
Classification categories and consumer advice 
 
A variety of suggestions were made about changes to the classification categories, including adding a category between PG and M, or introducing entirely new age-based categories. Although I see merit in providing more guidance on age suitability for parents, I do not recommend changes to classification categories at this time. 
 
The current scheme, while it may not be perfect, is well known to the community and a clear case would need to be made for any changes. There is no consensus amongst stakeholders, or arising from the Department’s consumer research, for any particular alternative system and changes are strongly opposed by some stakeholders on commercial and technical grounds. However, this matter should be kept under review. 
 
I recommend that the Refused Classification category should continue to include both illegal content and content which is abhorrent to the community but that it be renamed Prohibited to make the meaning of this category clearer. 
 
I also recommend that the current categories for submittable publications be replaced with equivalent categories currently in use for films and computer games: Unrestricted would be replaced with M, Category 1 restricted replaced with R 18+ and Category 2 restricted replaced with X 18+. This change would be clearer for consumers and bring greater uniformity to the classification system. 
 
There are various views in relation to consumer advice and how it is currently applied by classification tools, by broadcast classifiers and by the Board. With a move to greater industry self-classification, there needs to be more detailed guidance given to industry so that consistent consumer advice is provided. 
 
To be useful, consumer advice should be specific, direct and consistent. In this vein, I recommend that generic consumer advice, such as ‘strong themes’, be avoided wherever possible and instead, more descriptive consumer advice be provided. 
 
In updating guidelines for consumer advice, greater recognition should be given to current and emerging community concerns such as suicide, incitement of racial hatred and domestic violence. 
 
Legal restrictions 
 
Currently, the categories MA 15+ and R 18+ are legally restricted under the National Classification Scheme. However, MA 15+ content is not legally restricted on free to air television, where broadcasters are instead required not to broadcast it before 8.30pm. Moreover, this content is readily available at any time through broadcasters’ video on demand (catch-up TV) services. Reflecting this, the MA 15+ category stands for Mature Audience on free to air television compared with Mature Accompanied for content classified under the National Classification Scheme. 
 
Despite MA 15+ and R 18+ both being legally restricted categories, an important distinction lies in the provisions relating to adult accompaniment or consent that apply to MA 15+. This means that the age restriction for this category is conditional on the physical accompaniment (for example, during the duration of a film screened in a cinema) or consent (for example, when purchasing a product in store) by a responsible adult. In contrast, the restriction of R 18+ is unconditional and only individuals 18 years and older can access this content. 
 
In the online world, where the concept of another person’s accompaniment or consent is difficult to monitor or enforce, the full conditions of MA 15+ arguably lose their validity. The fact that the accompaniment or consent caveat does not have application in a home setting is reflected in the different conditions that apply to the MA 15+ category for broadcast content. 
 
I consider that arrangements should be consistent across all online platforms and I am recommending that MA 15+ content accessed online no longer be legally restricted. Legal restriction of this category is not enforceable via available technology and this change would harmonise arrangements between broadcasters and other content providers. There are an increasing number of parental controls available online that enable parents to restrict access to particular content and I recommend that these be more widely available and better promoted. 
 
I recommend that the MA 15+ category should remain restricted in the physical world as there are readily available means of enforcing this restriction and in its absence, there would be no alternative mechanism for parents to prevent their children accessing this material. The R 18+ and X 18+ categories should remain restricted on all formats and the best available technology should be employed to restrict access. 
 
Classification guidelines 
 
There are different but similar guidelines for the classification of films applying to online content providers and free to air and subscription broadcasters. It would be preferable to have a single set of guidelines for films applying across all delivery platforms. 
 
The Films Guidelines use an impact hierarchy for classification, which is inherently subjective and relies heavily on the capacity of the Board to interpret in a consistent manner. The guidelines used by television broadcasters, by comparison, are more detailed in their description of what is allowable in each category. As classification increasingly becomes the responsibility of industry, there is a need for guidelines to be as detailed and as specific as possible to enable the provision of consistent classification decisions and information. This would provide the public with a transparent set of classification criteria and engender confidence in the system. 
 
I therefore recommend the development of more detailed and consistent guidelines across all delivery platforms. 
 
Currently, there is no mechanism for regular reviewing and updating of guidelines to reflect community standards, empirical research on child development issues or developments in content or modes of delivery. I recommend that a Classification Advisory Panel comprising experts in child development and other relevant fields, as well as representatives of community groups and those with industry experience, be established to provide advice on updates of the classification categories, National Classification Code, classification guidelines and matters to be taken into account in decision-making in the Classification Act. The panel would draw on both the empirical evidence in relation to harmful impacts of media content, especially on children, and research and consultation with the community. It would report at least every four years on possible updates to classification guidelines and as necessary to respond to issues that may be referred to it or on which it considers attention needs to be given. 
 
There are separate guidelines used to classify films, computer games and publications. A number of submissions called for the merging of the Films Guidelines and Computer Games Guidelines. Many adult gamers were concerned that the differences in these guidelines were unnecessary and resulted in a number of games being Refused Classification when they are both readily available internationally and would not be Refused Classification under the Films Guidelines. 
 
While there was considerable support for eliminating inconsistencies between the Computer Games Guidelines and Films Guidelines, other submitters were concerned that simply combining these Guidelines would not adequately capture certain interactive game features or provide adequate safeguards for children. 
 
I consider that there is a need to address concerns about the impact of interactive content on children and about violence in computer games, and for this reason do not recommend the merging of the Films Guidelines and Computer Games Guidelines. 
 
However, there are provisions in the Computer Games Guidelines that are more restrictive than the Films Guidelines and have led to a number of games being Refused Classification in Australia. Consistent with the principle in the National Classification Code that “adults should be able to read, hear, see and play what they want,” I recommend that the Films Guidelines and Computer Games Guidelines should be aligned at the R 18+ level and that corresponding changes are made to the Refused Classification provisions in the Computer Games Guidelines. Existing protections would continue to be applied, particularly relating to interactivity, for content below that level that may be accessed by children. 
 
Films Guidelines 
 
Some specific issues were raised in respect of the Films Guidelines. Concern about sexualised depictions of minors in films is one such issue. While context, artistic merit and intended audience should be taken into consideration when assessing a film generally, sexualised depictions of minors (whether real or animated) that are gratuitous, exploitative or offensive, and which sexually objectify children, should never be permitted. 
 
I recommend that the Films Guidelines should be amended to make reference to the need to give greater weight to the possibility that sexualised depictions of children are gratuitous, exploitative or offensive. While the current classification system provides for child abuse material to be Refused Classification, the provisions in the Commonwealth Criminal Code Act 1995 (the Criminal Code) in relation to child abuse material are much more detailed than those in the National Classification Code and Guidelines, and I recommend that the National Classification Code and Guidelines should be aligned with the Criminal Code in this regard. 
 
There is also a need for clear warnings for consumers and specific guidance for classifiers about matters such as violence against women and sexual violence, suicide, dangerous imitable behaviour and scary content. 
 
I recommend that the Classification Advisory Panel should address these issues in providing advice on the development of revised and more detailed guidelines. It should also review evidence of impacts on children of lower levels of violence. While current treatment of language in classification is considered generally acceptable, there would be value in including racist and other discriminatory language in this element. I also recommend that the use of alcohol, prescription medications and smoking should be considered under the element ‘drugs’. 
 
For X 18+ films, I recommend that the absolute prohibitions on fetishes, which are not illegal, and violence (where it is unrelated to sex) should be removed. 
 
Computer Games Guidelines 
 
Issues relating specifically to the Computer Games Guidelines that have emerged during this review include simulated gambling, loot boxes and other micro-transactions. The main issue with loot boxes is the combination of expenditure with chance and concerns about gambling-like impacts on players, including children. To address this, I recommend that loot boxes that can be purchased are given consumer advice addressing both expenditure and chance aspects, and are given a minimum classification of PG. 
 
Simulated gambling games, which replicate casino games, require a stronger response to prevent children’s access to such games. I recommend that games which are purely based on simulated gambling should be given a minimum classification of MA 15+ and continue to be given consumer advice of ‘simulated gambling’. However, games which incorporate simulated gambling in a less prominent way (e.g. as part of a broader, narrative-based game), and where simulated gambling can be avoided, may not need such a high rating. Appropriate consumer advice would include ‘simulated gambling’ where it is interactive and clearly replicates casino games. 
 
Publications Guidelines 
 
There were few suggested changes to the Publications Guidelines. While I recommend maintaining separate Guidelines for Films, Computer Games and Publications, the Publications Guidelines should incorporate definitions of classifiable elements which are consistent with those used in the Films Guidelines and Computer Games Guidelines. Clarity is also needed in relation to allowable detail in depictions of nudity. 
 
Concerns were raised by two stakeholders about sexualised depictions of children in publications. As with the Films Guidelines, the Publications Guidelines should include the need to give greater weight to the possibility that sexualised depictions of children are gratuitous, exploitative or offensive. 
 
Advertising of films, games and publications 
 
I recommend no changes to classification regulation for the advertising of films, computer games and submittable publications, although responsibilities for advertising assessments that currently lie with the Board should be the role of the Australian Government regulator. 
 
Advertising for films and computer games on television should continue to be regulated through broadcasting codes of practice and the Australian Association of National Advertisers codes. Complaints about the placement of advertising should continue to be referred to the broadcaster in the first instance, with escalated complaints being dealt with by the regulator. Complaints about the substance of advertising should continue to be referred to Ad Standards. 
 
I looked closely at the film industry’s proposal to change Commonwealth laws for advertising unclassified films where the content of the trailer is assessed rather than the likely classification of the film being advertised. However, I recognise that parts of the Australian community may have concerns about potentially unsuitable films being marketed to children, in cinemas in particular, and on balance recommend no change. 
 
Classification governance 
 
Role of the Australian Government and the states and territories 
 
Under the National Classification Scheme, the Australian Government is responsible for classifying content and the states and territories are responsible for regulating the sale, exhibition, advertising and hire of classifiable content. Under the intergovernmental agreement signed in 1995, decisions made by Ministers must be effected through the Council of Attorneys-General (CAG). Any changes to the National Classification Code and the classification Guidelines must be unanimously agreed by Ministers from all jurisdictions. Many submitters were concerned that these long-standing arrangements were no longer working well in the digital age. 
 
To clarify classification responsibilities and to make classification decision-making more responsive to changes in the content market, I recommend that the 1995 intergovernmental agreement be revised so that:

• The Australian Government retains responsibility for establishing the mechanisms to classify content, however a range of different classification processes can be used.

• The Australian Government is responsible for enforcement of online classifiable content, with states and territories responsible for enforcement of offline (physical) classifiable content.

• CAG decision-making should generally be made on the basis of consensus but where consensus cannot be reached, decisions should be made on the basis of a majority of the members. 
 
The Australian Government regulator 
 
Currently, classification regulation is split amongst a number of Federal bodies, including the Board, the Classification Review Board, Australian Communications and Media Authority (ACMA) and the Department. Consistent with the recommendations to harmonise content regulation across all delivery platforms, I consider that most of these functions should be consolidated in one body. Given its existing role in regulation of broadcasters and online content more generally, I recommend that this body be ACMA.

21 March 2022

Disinfo

Alongside a commitment - so heartfelt - to introduce disinformation legislation in the second half of this year, building on the 'Australian Code of Practice on Disinformation and Misinformation', the Communications Minister has released the June 2021 A report to government on the adequacy of digital platforms’ disinformation and news quality measures.

Unsurprisingly, there is no engagement with disinformation/misinformation from members of the Government such as Craig Kelly. 

 The report states 

In December 2019, as part of its response to the Australian Competition and Consumer Commission’s Digital Platforms Inquiry, the Australian Government requested that digital platforms in Australia develop a voluntary code of practice to address online disinformation and news quality. The Australian Code of Practice on Disinformation and Misinformation (the code) was launched by industry association Digital Industry Group Inc (DIGI) on 22 February 2021. The code has since been adopted by 8 digital platforms – Google, Facebook, Microsoft, Twitter, TikTok, Redbubble, Apple and Adobe. 

The ACMA was tasked with overseeing the development of the code and reporting to the government on the adequacy of platform measures and the broader impacts of disinformation in Australia. Our report provides new consumer research on users’ experience of disinformation and misinformation on digital platforms and our assessment of the industry’s code. It also provides a range of findings and a number of recommendations for consideration by the government. 

The online propagation of disinformation and misinformation presents an increasing threat to Australians 

Over the previous 18 months, we have seen increasing concern within the community over the ‘infodemic’ of online disinformation and misinformation, particularly in relation to the real-world impacts of COVID-19. The propagation of these falsehoods and conspiracies undermines public health efforts, causes harm to individuals, businesses and democratic institutions, and in some cases, incites individuals to carry out acts of violence. 

To understand the scale and impacts of this issue in Australia, we undertook a mixed-methods study focused on COVID-19 misinformation. Key insights include:

> Most adult Australians (82%) report having experienced misinformation about COVID-19 over the past 18 months. Of these, 22% of Australians report experiencing ‘a lot’ or ‘a great deal’ of misinformation online. 

> Belief in COVID-19 falsehoods or unproven claims appears to be related to high exposure to online misinformation and a lack of trust in news outlets or authoritative sources. Younger Australians are most at risk from misinformation, however there is also evidence of susceptibility among other vulnerable groups in Australian society. 

> Australians are most likely to see misinformation on larger digital platforms, like Facebook and Twitter. However, smaller private messaging apps and alternative social media services are also increasingly used to spread misinformation or conspiracies due to their less restrictive content moderation policies. 

> Misinformation typically spreads via highly emotive and engaging posts within small online conspiracy groups. These narratives are then amplified by international influencers, local public figures, and by coverage in the media. There is also some evidence of inorganic engagement and amplification, suggesting the presence of disinformation campaigns targeting Australians. 

> Many Australians are aware of platform measures to remove or label offending content but remain sceptical of platform motives and moderation decisions. There is widespread belief that addressing misinformation requires all parties – individuals, platforms and governments – to take greater responsibility to improve the online information environment and reduce potential harms. 

Digital platforms have introduced a range of measures in response to the growth of disinformation and misinformation on their services 

In response largely to global concerns, digital platforms have introduced measures typically based on company-wide policies including:

> supporting third-party fact-checking organisations 

> proactively updating their policies to specifically address unique events, such as the COVID-19 pandemic and the 2020 US presidential election 

> investing in means to signal credible, relevant and authentic information 

> providing financial assistance and grants to news outlets, government and not-for-profit organisations to bolster the spread of credible information and news 

> increased detection, monitoring and enforcement action against groups and networks who use their services to spread disinformation and misinformation. 

Despite platforms’ mostly global approach to updating policies and implementing other actions, many measures have had an impact on Australian users.

> In 2020, Facebook removed more than 110,000 pieces of COVID-related misinformation generated by Australian accounts. 

> Between July and December 2020, Twitter removed 50 pieces of content authored by Australian accounts for contravening its COVID-19 misleading information policy. 

> In 2020, Google blocked 101 million advertisements globally for contravening its misrepresentation policies. 

> TikTok’s COVID-19 Information Hub was visited by over 292,000 Australians between November 2020 and March 2021. 

The above data shows that platforms are taking proactive steps to tackle disinformation and misinformation on their products and services. The introduction of an Australian industry code builds on these actions to codify actions, improve transparency, enhance consumer protections, and implement mechanisms to monitor their effectiveness. It also provides a framework to promote stakeholder collaboration and incentivise further actions by platforms to respond to a rapidly evolving online environment. 

Digital platforms have come together to develop a single outcomes-based code of practice with several important features 

It is extremely positive to see industry, steered by DIGI, come together to develop a single code of practice. A single code should promote a consistent approach by platforms and provide confidence in industry to manage the range of harms associated with disinformation and misinformation. 

DIGI ran a meaningful public consultation process in developing its draft code, which attracted a variety of submissions that clearly influenced subsequent changes. In particular, the scope of the code was expanded to cover misinformation as well as disinformation, a key piece of stakeholder feedback during the consultation process. The ACMA considers this is an improvement on the EU Code of Practice on Disinformation. 

The code adopts an outcomes-based regulatory approach that allows a range of platforms with different services and business models to sign up to the single code. Signatories are required to sign up to the objective of ‘providing safeguards against harms that may arise from disinformation and misinformation’ and may opt-in to other code objectives, such as disrupting advertising incentives and supporting strategic research. The code also provides signatories flexibility to implement measures to counter disinformation and misinformation in proportion to the risk of potential harm. Signatories must also report annually on the range of measures they will implement to achieve the objectives and outcomes. Importantly, the code also stresses the need to balance interventions with the need to protect users’ freedom of expression, privacy, and other rights. 

Our assessment identifies further improvements that should be made to the code’s scope and the clarity of commitments 

The ACMA has assessed the code to consider whether it has met the expectations set out by the government and has identified a range of improvements. 

In our view, the scope of the code is limited by its definitions. In particular, a threshold of both ‘serious’ and ‘imminent’ harm must be reached before action is required under the code. The effect of this is that signatories could comply with the code without having to take any action on the type of information which can, over time, contribute to a range of chronic harms, such as reductions in community cohesion and a lessening of trust in public institutions. 

The code should also be strengthened through an opt-out rather than opt-in model. Signatories should only be permitted to opt out of outcomes where that outcome is not relevant to their service and be required to provide justification for the decision. 

The code is also limited in the types of services and products it covers. Private messaging is excluded, despite increasing concern about the propagation of disinformation and misinformation through these services, particularly when used to broadcast to large groups. Including messaging services within the code, with appropriate caveats to protect user privacy (including the content of private messages), would provide important consumer protections. 

We also consider improvements to the code should be made in relation to: 

> its application to news aggregation services 

> the treatment of professional news content and paid and sponsored content 

> the weight given to news quality as a key aspect of the government’s request to industry. 

The ACMA is also concerned that the code does not place an obligation on individual signatories to have robust internal complaints processes. This was an area of particular concern identified in the Digital Platforms Inquiry. 

The code includes commitments to establish administrative functions within 6 months of code commencement. As code administrator, DIGI will establish a compliance sub-committee, a detailed reporting guideline and a facility to address signatory non-compliance. However, these functions remain under development at the time of finalising this report. As a result, the ACMA has not been able to assess their effectiveness. DIGI and code signatories should consider changes to the code to address the matters identified by the ACMA in its review in February 2022. 

A clear and transparent measurement framework is critical to the effectiveness of a voluntary, outcomes-based regulatory model 

Signatories were required to nominate their code commitments and deliver an initial report providing information and data on the measures they have adopted under the code. 

Signatories’ reports provide a large range of information on the actions they have taken to address disinformation, misinformation and news quality, and their investments in collaborative initiatives. 

However, reports are heavily focused on platform outputs and lack systematic data or key performance indicators (KPIs) that would establish a baseline and enable the tracking of platform and industry performance against code outcomes over time. Reports also show inconsistencies in the interpretations of key code terms and in reporting formats. 

Platforms should move quickly to identify KPIs specific to their services and work together to establish industry-wide KPIs to demonstrate the effectiveness of the code as an industry-wide initiative. 

The ACMA recommends a number of actions by government to bolster industry self-regulatory arrangements 

The ACMA considers that it is still too early to draw concrete conclusions on the overall impact or effectiveness of the code. The code administration framework – including a detailed reporting guideline and mechanism to handle complaints – is not due for completion until late August 2021. The design and implementation of these elements will be key to the overall effectiveness of the code. 

Given these circumstances, continued monitoring is required and the ACMA recommends it provide government with another report on the code by the end of the 2022–23 financial year. This will provide sufficient time to assess the operation of the code administration framework and assess the impact of any changes arising from the February 2022 review of the code. As part of this report, the ACMA recommends it continues to undertake focused research on these issues. 

Initial signatory reports identify challenges in obtaining relevant data on platform actions in Australia. Providing the ACMA with formal information-gathering powers (including powers to make record-keeping rules) would incentivise greater platform transparency and improve access to Australia-specific data on the effectiveness of measures to address disinformation and misinformation. Information collected could also be used to identify systemic issues across the digital platform industry and inform future ACMA research. 

More formal regulatory options could be considered, particularly for platforms that choose not to participate in the code or reject the emerging consensus on the need to address disinformation and misinformation. The ACMA recommends that government provides the ACMA with reserve regulatory powers in relation to digital platforms – such as code registration powers and the ability to set standards. This would provide the government with the option to act quickly to address potential harms if platform responses are not adequate or timely. 

There are also opportunities for improved collaboration between government agencies, platforms, researchers and non-government organisations on issues relating to disinformation and misinformation. The ACMA recommends that the government should consider establishing a Misinformation and Disinformation Action Group to provide a mechanism to support future information sharing, cooperation and collaboration. 

The ACMA makes 5 recommendations to the government in its report. 

Recommendation 1: The government should encourage DIGI to consider the findings in this report when reviewing the code in February 2022. 

Recommendation 2: The ACMA will continue to oversee the operation of the code and should report to government on its effectiveness no later than the end of the 2022–23 financial year. The ACMA should also continue to undertake relevant research to inform government on the state of disinformation and misinformation in Australia. 

Recommendation 3: To incentivise greater transparency, the ACMA should be provided with formal information-gathering powers (including powers to make record-keeping rules) to oversee digital platforms, including the ability to request Australia-specific data on the effectiveness of measures to address disinformation and misinformation. 

Recommendation 4: The government should provide the ACMA with reserve powers to register industry codes, enforce industry code compliance, and make standards relating to the activities of digital platform corporations. These powers would provide a mechanism for further intervention if code administration arrangements prove inadequate, or the voluntary industry code fails. 

Recommendation 5: In addition to existing monitoring capabilities, the government should consider establishing a Misinformation and Disinformation Action Group to support collaboration and information-sharing between digital platforms, government agencies, researchers and NGOs on issues relating to disinformation and misinformation.

17 December 2020

Open Justice in NSW

The NSW Law Reform Commission has released a consultation paper regarding its Open Justice - Court and tribunal information: access, disclosure and publication inquiry.

The Commission's Terms of reference are to review and report on the operation of: 

1. legislative prohibitions on the disclosure or publication of NSW court and tribunal information, 

2. NSW court suppression and non-publication orders, and tribunal orders restricting disclosure of information, and 

3. access to information in NSW courts and tribunals; 

In particular, the Commission is to consider:

a) Any NSW legislation that affects access to, and disclosure and publication of, court and tribunal information, including: 

- The Court Suppression and Non-Publication Orders Act 2010 (NSW); 

- The Court Information Act 2010 (NSW); and 

- The Children (Criminal Proceedings) Act 1987 (NSW). 

b) Whether the current arrangements strike the right balance between the proper administration of justice, the rights of victims and witnesses, privacy, confidentiality, public safety, the right to a fair trial, national security, commercial/business interests, and the public interest in open justice. 

c) The effectiveness of current enforcement provisions in achieving the right balance, including appeal rights. 

d) The appropriateness of legislative provisions prohibiting the identification of children and young people involved in civil and criminal proceedings, including prohibitions on the identification of adults convicted of offences committed as children and on the identification of deceased children associated with criminal proceedings. 

e) Whether, and to what extent, suppression and non-publication orders can remain effective in the digital environment, and whether there are any appropriate alternatives. 

f) The impact of any information access regime on the operation of NSW courts and tribunals.  

g) Whether, and to what extent, technology can be used to facilitate access to court and tribunal information. 

h) The findings of the Royal Commission into Institutional Responses to Child Sexual Abuse regarding the public interest in exposing child sexual abuse offending. 

i) Comparable legal and practical arrangements elsewhere in Australia and overseas. 

j) Any other relevant matters. 

The consultation questions are 

 The open court principle and its exceptions

Q 2.1: Statutory requirements to hold proceedings in private 

(1) Are the current laws that require certain proceedings to be closed to the public appropriate? Why or why not? (2) What changes, if any, should be made to these laws? (3) Are the current statutory exceptions to the requirement to hold proceedings in private appropriate? Why or why not? (4) Should there be standard exceptions that apply in all (or most) circumstances? If so, what should they be, and in what circumstances should they apply? 

Q 2.2: Statutory powers to hold proceedings in private 

(1) Are the existing laws that give courts discretionary powers to make exclusion orders appropriate? Why or why not? (2) What changes, if any, should be made to these existing laws? (3) Should there be standard grounds that need to be satisfied before a court can make a discretionary exclusion order in all (or most) circumstances? If so, what should they be and in what circumstances should they apply? (4) Should there be standard procedures by which an exclusion order could be made in all (or most) circumstances? If so, what should they be and in what circumstances should they apply? (5) Should there be a standard offence for breaching an exclusion order in most (or all) circumstances? If so: (a) what should be the elements of the offence and in what circumstances should it apply, and (b) what should be the penalty? 

Non-disclosure and suppression: statutory prohibitions 

Q 3.1: Statutory prohibitions on publishing or disclosing certain information 

As a matter of principle, should there ever be automatic statutory prohibitions on publishing or disclosing certain information? Why or why not? 

Q 3.2: Current statutory prohibitions on publishing or disclosing information 

(1) Are the current statutory prohibitions on publishing or disclosing certain information appropriate? Why or why not? (2) What changes, if any, should be made to the current statutory prohibitions? 

Q 3.3: Additional statutory prohibitions that may be needed 

What further information, if any, should be protected by automatic statutory prohibitions on publication or disclosure? 

Q 3.4: Types of action a statute may prohibit 

(1) Is the existing variety of types of action that a statute may prohibit justified? Why or why not? (2) What changes, if any, should be made? (3) Should a standard provision setting out the types of action that a statute may prohibit be developed? If so: (a) what should the provision say (b) how should key terms be defined, and (c) when should it apply? 

Q 3.5: Duration of the statutory prohibition 

(1) Should the statutory prohibitions on publishing or disclosing certain information always specify the duration of the prohibition? Why or why not? (2) What changes, if any, should be made to the existing duration provisions attached to statutory prohibitions on publishing or disclosing information? (3) What prohibitions, if any, should include a duration provision that do not already? What should these duration provisions say? 

Q 3.6: Application of the statutory prohibition to related proceedings 

In what circumstances, if any, should statutory prohibitions that protect the identities of people involved in proceedings apply in appeal or other related proceedings? 

Q 3.7: When publication or disclosure of information should be permitted 

(1) Are the existing exceptions attached to statutory prohibitions on publishing or disclosing information appropriate? Why or why not? (2) What changes, if any, should be made to the existing exceptions? (3) What prohibitions, if any, should include exceptions that do not already? What should these be? (4) Should standard exceptions apply to all (or most) statutory prohibitions on publishing or disclosing information? If so, what should they be and in what circumstances should they apply? (5) Where exceptions allow a court to permit disclosure of protected information, what criteria, if any, should guide that court? 

Non-disclosure and suppression: discretionary orders 

Q 4.1: Actions targeted by an order 

(1) Are the existing definitions of “suppression order” and “non-publication order” in the Court Suppression and Non-publication Orders Act 2010 (NSW) appropriate? Why or why not? (2) What changes, if any, should be made to these definitions? (3) What other statutes should these definitions (with or without amendment) apply to? (4) What other changes (if any) should be made to these statutes in relation to the types of action an order may prevent? 

Q 4.2: Types of information that may be subject to an order 

(1) Are the current provisions that identify the types of information that may be the subject of a suppression or non-publication order, adequate? Why or why not? (2) What changes, if any, should be made to these provisions? 

Q 4.3: Consent to publication or disclosure 

What provision, if any, should be made about making an order where a person consents to the publication of information that would reveal their identity? 

Q 4.4: Limits to orders 

(1) Are the existing provisions relating to the scope of suppression and non-publication orders appropriate? Why or why not? (2) What changes, if any, should be made to existing provisions in relation to: (a) the exceptions and conditions that apply (b) the geographic limits of such orders (c) the duration of such orders, and (d) any other aspects of the scope of such orders? 

Q 4.5: Service and notice requirements 

(1) Are the existing procedures (under the Court Suppression and Non-publication Orders Act 2010 (NSW), or any other statute) for making suppression and non-publication orders adequate? Why or why not? (2) What changes, if any, should be made to existing procedures in relation to: (a) who may make an application for an order (b) when an order can be made (c) who can appear and be heard in an application for an order (d) the service and notice requirements for an order, or (e) any other matter? 

Q 4.6: Costs in proceedings for orders 

What provision, if any, should be made for cost orders in relation to applications for suppression or non-publication orders? 

Q 4.7: The public interest in open justice 

(1) Does the Court Suppression and Non-publication Orders Act 2010 (NSW) deal with the consideration of the public interest in open justice appropriately? Why or why not? (2) What changes, if any, should be made to the existing provision? (3) What provision, if any, should be made in other statutes that grant power to make suppression or non-publication orders for recognising the public interest in open justice? (4) What other considerations should be taken into account before an order is made? 

Q 4.8: The “necessary” test for making orders 

(1) What changes, if any, should be made to the “necessary” test? (2) Should a definition of “necessary” be included in the Court Suppression and Non-publication Orders Act 2010 (NSW) or any other statute? If so, what should it be? 

Q 4.9: Grounds for making orders 

(1) Are the grounds for making suppression and non-publication orders under the Court Suppression and Non-publication Orders Act 2010 (NSW) and other NSW statutes appropriate? Why or why not? (2) What changes, if any, should be made to them? 

Q 4.10: A requirement to give reasons 

(1) Should courts be required to give reasons for a decision to make or refuse to make a suppression or non-publication order in some or all circumstances? Why or why not? In what circumstances should this requirement apply? (2) If there was to be a requirement, how should it be expressed? 

Q 4.11: Interim orders 

(1) Is the current provision in the Court Suppression and Non-publication Orders Act 2010 (NSW) for interim orders appropriate and effective? Why or why not? (2) What changes, if any, should be made to the existing provision? (3) What provision, if any, should be made for interim orders in other statutes that grant powers to make suppression or non-publication orders? 

Q 4.12: Review and appeal of orders 

(1) Are the existing provisions relating to the review and appeal of suppression and non-publication orders appropriate? Why or why not? (2) What changes, if any, should be made to these provisions? (3) To what extent should review and appeal provisions be available for suppression and non-publication orders that are not covered by the Court Suppression and Non-publication Orders Act 2010 (NSW)? 

Q 4.13: Framing effective orders 

How could the Court Suppression and Non-publication Orders Act 2010 (NSW) provisions be amended to assist courts in framing more effective orders? 

Q 4.14: Interaction between the Court Suppression and Non-publication Orders Act 2010 (NSW) and other statutes 

(1) Should the Court Suppression and Non-publication Orders Act 2010 (NSW) only apply to situations that are not subject to other automatic prohibitions or provisions that allow suppression and non-publication orders to be made? Why or why not? (2) Which provisions for suppression and non-publication, if any, should be consolidated or standardised? 

Monitoring and enforcing prohibitions on publication and disclosure 

Q 5.1: Sources of sanctions for breaches of prohibitions 

(1) Is the current regime, in which some breaches of prohibitions on publication or disclosure of information are enforced through statutory offences and others are enforced by contempt proceedings, satisfactory? Why or why not? (2) What changes, if any, should be made to the existing arrangements? To what extent should there be greater consistency in the statutory offences? (3) In particular, what changes, if any, should be made in relation to: (a) a mental element for any offence (b) the definition of terms used for publication or disclosure (c) exceptions to any of the statutory offences, or (d) the current maximum penalties for any statutory offences? (4) What changes, if any, should be made to the current arrangements for enforcing contempt of court in relation to breaches of prohibitions on publication or disclosure? 

Q 5.2: Monitoring prohibitions on publication and disclosure 

(1) How should prohibitions on publication and disclosure of information be monitored? (2) Is public transparency about the number of people who are proceeded against for offences involving breaches of the prohibitions necessary or desirable? Why or why not? How could public transparency about these numbers be improved? 

Q 5.3: Enforcing prohibitions on publication and disclosure 

(1) Are the existing arrangements for managing breaches of prohibitions on publication and disclosure of information effective? Why or why not? (2) If not, what changes should be made? 

Q 5.4: Challenges in enforcing prohibitions on publication or disclosure 

(1) What changes, if any, could make it easier for justice agencies to identify and prosecute people who breach prohibitions on publication or disclosure of information? (2) Should there be a scheme for mutual recognition and enforcement of suppression and non-publication orders across Australia? If so, what would the scheme entail? (3) How should the law and/or justice agencies deal with situations where prohibitions on the publication or disclosure of information under NSW law are breached outside Australia? (4) Should the time limits for enforcing the statutory offences considered in this Chapter be extended? Why or why not? 

Access to information 

Q 6.1: Consolidation of the court information access regimes in NSW 

(1) Should the regimes governing access to court information be consolidated? Why or why not? (2) If so, how should the regimes be consolidated? (3) What principles and rules should underpin a consolidated regime? 

Q 6.2: Discretion to permit or deny access to information 

(1) In what circumstances, if any, should courts have discretion to permit or deny access to court information? (2) In what circumstances, if any, should information be available as of right? 

Q 6.3: Considerations in determining access requests 

(1) What, if any, standard considerations or principles should all (or most) courts apply when determining an access request? (2) Are there any circumstances that would warrant different considerations to the standard considerations being applied? If so: (a) what circumstances, and (b) what should the considerations be? 

Q 6.4: Types of court information available for access 

(1) What types of court information should be available for access? (2) Should different access rules apply to different types of information? 

Q 6.5: Prohibiting access to court information 

Should access to court information be prohibited in certain circumstances? If so, when? 

Q 6.6: Who can access court information? 

Who should be able to access what types of court information and on what conditions? 

Q 6.7: Privacy protections for personal information 

How should the privacy of personal identification information contained in court information be protected? 

Q 6.8: Applying for access to court information 

(1) What procedures, if any, should apply when a person seeks access to court information? (2) What guidance, if any, should be given in relation to these procedures? 

Q 6.9: How access to court information should be provided 

(1) By what methods should courts provide a person with access to court information? (2) Should the available methods be different depending on the applicant and the situation? If so, how? 

Q 6.10: Fees for accessing information 

(1) In what circumstances should a person be charged a fee to access court information? (2) In what circumstances should any fees for accessing information be waived or reduced? 

Q 6.11: A national access regime 

Should there be a national regime governing access to documents? Why or why not? 

Q 6.12: Public availability of judgments and decisions 

How could NSW courts and tribunals improve access to judgments and decisions? 

Protections for children and young people 

Q 7.1: Criminal proceedings – prohibition on the publication and disclosure of identifying information 

(1) Should there continue to be a general prohibition on publishing or broadcasting the identities of children involved in criminal proceedings in NSW? Why or why not? (2) What changes, if any, should be made to the existing prohibition and the exceptions to it? 

Q 7.2: Criminal proceedings – closed court orders 

(1) Should criminal proceedings involving children continue to be held in closed court as a rule? Why or why not? (2) Are the current exceptions to the rule appropriate? If not, what changes should be made? 

Q 7.3: Criminal diversion processes 

(1) Is the prohibition on publishing or broadcasting the identities of young offenders who take part in criminal diversion processes appropriate? Why or why not? (2) What changes, if any, should be made to the existing prohibition? 

Q 7.4: Proceedings for apprehended domestic violence orders 

(1) Is the prohibition on publishing the identities of children involved in apprehended domestic violence order proceedings appropriate? Why or why not? (2) What changes, if any, should be made to the existing prohibition? 

Q 7.5: Care and protection proceedings – prohibition on the publication and disclosure of identifying information 

(1) Is the prohibition on publishing or broadcasting the identities of children involved in care and protection proceedings appropriate? Why or why not? (2) What changes, if any, should be made to the existing prohibition and exceptions? 

Q 7.6: Care and protection proceedings – closed court orders 

(1) Are the existing provisions relating to the exclusion of people (including the child or young person themselves) from court and non-court proceedings under the Children and Young Persons (Care and Protection) Act 1998 (NSW) appropriate? Why or why not? (2) What changes, if any, should be made to these provisions? 

Q 7.7: Adoption proceedings 

(1) Should there continue to be restrictions on the publication or disclosure of material that identifies people involved in adoption proceedings? Why or why not? (2) What changes, if any, should be made to the existing restrictions and exceptions? (3) Should adoption proceedings continue to be held in closed court? Why or why not? (4) What changes, if any, should be made to the existing closed court provisions? 

Q 7.8: Parentage and surrogacy proceedings 

(1) Should there continue to be prohibitions on the publication or disclosure of material relating to parentage and surrogacy proceedings? Why or why not? (2) What changes should be made to the existing restrictions? (3) Should parentage and surrogacy proceedings continue to be held in closed court? Why or why not? (4) What changes, if any, should be made to the existing closed court provisions? 

Q 7.9: Other proceedings 

What further protections, if any, should there be against the publication and disclosure of, or public access to, types of legal proceedings involving children other than those to which protections already apply? 

Victims and witnesses: privacy protections and access to information 

Q 8.1: General protections for victims and witnesses 

(1) Are the general privacy protections for victims and witnesses in NSW appropriate? Why or why not? (2) What changes, if any, should be made? 

Q 8.2: Current protections for specific types of victims and witnesses 

(1) Are the privacy protections for specific types of victims and witnesses in NSW appropriate? Why or why not? (2) What changes, if any, should be made? 

Q 8.3: Protections for other types of victims and witnesses 

What privacy protections, if any, are needed for other types of victims and witnesses? 

Q 8.4: Access to court information by victims 

(1) Are the current arrangements governing access to court information by victims appropriate? Why or why not? (2) What changes, if any, should be made? 

Protections for sexual offence complainants 

Q 9.1: The prohibition on publishing the identities of sexual offence complainants 

(1) Is the prohibition on publishing the identities of complainants in sexual offence proceedings and the exceptions to the prohibition appropriate? Why or why not? (2) What changes, if any, should be made? 

Q 9.2: Closing courts during sexual offence proceedings 

(1) Are the situations in which courts may be closed during sexual offence proceedings appropriate? Why or why not? (2) What changes, if any, should be made?  

Media access to information 

Q 10.1: Media access to court information in NSW 

(1) Are the current arrangements for the media to access court information in relation to both civil and criminal proceedings appropriate? Why or why not? (2) Should the media have special privileges to access court information in relation to civil and/or criminal proceedings? Why or why not? (3) What changes, if any, should be made to the current arrangements, including in relation to: (a) the nature of the access provided (b) the types of documents that may be accessed (c) time limits on access, and (d) application procedures? 

Q 10.2: Media access to court proceedings 

(1) Is the current regime governing media access to proceedings appropriate and workable? Why or why not? (2) What changes, if any, should be made to the current regime, including in relation to: (a) prescribed sexual offence proceedings (b) proceedings involving children (c) accessing “virtual courtrooms”, and (d) orders excluding people under the Court Security Act 2005 (NSW)? 

Q 10.3: Broadcasting court proceedings 

(1) Are the rules that apply to media recording and broadcasting of court proceedings appropriate? Why or why not? (2) What changes, if any, should be made? 

Q 10.4: Impact of publication restrictions on the media 

(1) Are the laws that restrict the media from publishing or broadcasting information relating to court proceedings appropriate? Why or why not? (2) What changes, if any, should be made? (3) In relation to suppression and non-publication orders: (a) are the interests of the media adequately reflected in the grounds for making such orders? (b) is the list of people with standing to be heard in applications for suppression or non-publication orders appropriate? (c) are the current arrangements for communicating the existence of suppression and non-publication orders adequate? (4) What changes, if any, should be made to the laws and procedures relating to the media and suppression and non-publication orders? 

Q 10.5: Contemporary media 

(1) Are the current definitions and use of the terms “media” and “news media organisation” appropriate? Why or why not? (2) What changes, if any, should be made to these terms and their definitions? (3) How else could members of the media be identified for the purposes of the laws dealing with media access to court information and proceedings? 

Researcher access to information 

Q 11.1: Researcher access to information 

(1) What changes, if any, should be made to the existing arrangements for providing researchers with access to court information? (2) In particular, what changes, if any, should be made in relation to: (a) a centralised scheme for giving researchers access to court information, including a research committee (b) the kinds of researchers who should be able to access court information (c) the kinds of research that court information should be available for (d) the other considerations that may be relevant to granting a researcher access to court information (e) the type of court information researchers should be able to access (f) the types of conditions that should be placed on researchers who are given access to court information (g) applicable fees and arrangements for fee waiver (h) access to archived court records, and (i) requests to collate data and/or statistics? 

Digital technology and open justice 

Q 12.1: Online courts 

If virtual courtrooms are to be available, what provision, if any, should be made to ensure that: (a) open justice principles are given effect to, where possible, and (b) risks of prohibited disclosure or publication are managed effectively? 

Q 12.2: Electronic access to court information 

(1) What arrangements, if any, should be made for electronic access to court information? (2) In particular, what should the arrangements be in relation to: (a) the type of information that can be accessed (b) who can access the information, and (c) any necessary protections against unauthorised disclosure or publication of such information? 

Q 12.3: Suppression and non-publication orders in the digital environment 

(1) What, if anything, can be done to deal with situations where suppression and non-publication orders under NSW law are breached outside Australia? (2) In particular, what, if anything, can be done to minimise the risk of offending content affecting the fairness of a trial? 

Q 12.4: Tweeting and posting in court 

(1) Are current provisions regulating use of social media by the media and public in court adequate? Why or why not? (2) What changes, if any, should be made to the existing provisions? 

Other proposals for change 

Q 13.1: A register of orders 

(1) Should there be a publicly accessible register of suppression and non-publication orders made by NSW courts? Why or why not? (2) If so: (a) who should be able to access the register, (b) what details should be included in the register, and (c) who should build and maintain the register? 

Q 13.2: An open justice advocate 

(1) Is there a need for an advocate to appear and be heard in applications for suppression and non-publication orders? Why or why not? (2) If so, what responsibilities should the advocate have? 

Q 13.3: Education initiatives 

(1) What education initiatives could be implemented to improve people’s understanding of open justice and associated restrictions? (2) Who should be responsible for delivering those initiatives? 

Q 13.4: Other ways to avoid juror prejudice 

(1) Could the juror oath and affirmation be amended to better ensure jurors appreciate, and take seriously, the obligation not to seek or rely on potentially prejudicial information? If so, how could they be improved? (2) Is the current Jury Act 1977 (NSW) offence of making inquiries effective? If not, how could it be improved? (3) Are the current jury directions about avoiding media publicity and making inquiries about the case appropriate? If not, what reforms are required? (4) Could improving the way that juror questions are managed better ensure jurors do not conduct their own inquiries? If so, what improvements could be made? (5) Could more educational guidance be provided to jurors about avoiding media publicity and making inquiries prior to the trial? If so, what should this guidance say? (6) Could pre-trial questioning of jurors be used more effectively to determine which potential jurors have been exposed to prejudicial information? If so, how? (7) Should NSW adopt the Queensland approach of allowing judge alone trials where there has been significant pre-trial publicity that may affect jury deliberations? Why or why not? (8) Are there any other ways in which current law or practice can be improved to prevent jurors from being influenced by potentially prejudicial information?