26 June 2020

Potemkin Regulation

The ANAO Management of the Australian Government’s Lobbying Code of Conduct — Follow-up Audit report finds, alas unsurprisingly, that the Attorney-General's Department (AGD) has not engaged with implementation of the national lobbying regime, meaning that we have a Potemkin code regarding influence in policymaking.

The report states
Lobbying activities refer to communications with government representatives in an effort to influence government decision-making. To help safeguard decision-making processes from factors such as undue influence or unfair competition, governments around the world, including the Australian Government, have introduced lobbying regulatory regimes. 
The Australian Government’s regime was established with the introduction of the Lobbying Code of Conduct (Code) in 2008. The policy objective of the regime is expressed in the Code as: … to promote trust in the integrity of government processes and ensure that contact between lobbyists and Government representatives is conducted in accordance with public expectations of transparency, integrity and honesty. 
The regime specifies that this objective will be achieved through lobbyist and Government representative compliance with the Code’s various provisions and its main administrative mechanism, the Register of Lobbyists (Register). The Register is a publicly available database of registered lobbyist organisations and lobbyists, and their clients. As at March 2020, the Register listed 257 lobbyist organisations, 590 individual lobbyists, and 1,792 clients. 
Lobbyist organisations have administrative responsibilities associated with keeping the Register up to date, and lobbyist organisations and individual lobbyists must also comply with a number of lobbying principles and prohibitions under the Code. Government representatives are required to check the Register prior to meeting with a lobbyist, and to report any known breaches of the Code. The Attorney-General’s Department (AGD) became responsible for administering the Code following a machinery of government change that transferred accountability from the Department of the Prime Minister and Cabinet (PM&C) in May 2018. 
Auditor-General Report No.27 of 2017–18, Management of the Australian Government’s Register of Lobbyists, assessed the effectiveness of PM&C’s management of the Code, and concluded that:
While the Department of the Prime Minister and Cabinet’s arrangements to manage the Australian Government’s Register of Lobbyists are consistent with the framework agreed by Government, improvements could be made to communications, compliance management and evaluation for the Code and the Register. It would also be timely to review the appropriateness of the current arrangements and Code requirements in supporting the achievement of the objectives established for the Code.
The Auditor-General recommended that the department review the appropriateness of current arrangements in supporting the achievement of the Code’s objectives. This included: implementing a strategy to raise lobbyists’ and Government representatives’ awareness of the Code and their responsibilities; assessing risks to compliance with the Code and providing advice on the ongoing sufficiency of the current compliance management framework; and developing a set of performance measures and establishing an evaluation framework to inform stakeholders about the extent to which outcomes and broader policy objectives are being achieved. 
PM&C partly agreed with the recommendation, indicating that it would implement the recommendation where it was consistent with a non-legislation based scheme.
ANAO comments
To form a conclusion against the audit objective, the following high-level criteria were adopted:
  • Does AGD have effective governance arrangements to oversee the implementation of the recommendation from Auditor-General Report No.27 of 2017–18? 
  • Has a strategy been implemented to raise awareness of the Lobbying Code of Conduct among lobbyists and Government representatives? 
  • Has AGD assessed risk to Lobbying Code of Conduct compliance and provided advice to the Australian Government on the sufficiency of the current compliance management framework? 
  • Have performance measures and an evaluation framework for the Lobbying Code of Conduct and Register of Lobbyists been developed?
The bad news is that
AGD did not implement the recommendation from Auditor-General Report No.27 of 2017–18, Management of the Australian Government’s Register of Lobbyists
Governance arrangements to oversee the implementation of the ANAO recommendation were limited in effectiveness. There was no implementation planning at any stage in the transition of accountability for the Code and ANAO recommendation from PM&C to AGD. ... 
AGD did not develop a strategy to raise awareness of the Code. Registered lobbyists received information about some of their administrative responsibilities. Limited activities were undertaken to inform lobbyists and Government representatives of their compliance obligations under the Code. 
AGD did not systematically assess risks to compliance with the Code and did not advise Government about the sufficiency of the current compliance framework in meeting the Code’s objectives. 
AGD did not develop an evaluation framework for the Code and did not develop performance measures. It did not assess or inform others about whether the current regime is achieving the regulatory objectives.
The report's 'Supporting findings' in summary are
Governance structures and processes 
There was no plan for the implementation of the ANAO recommendation, or for the implementation of the machinery of government transfer of accountability for the Code from PM&C to AGD. The ANAO recommendation was broadly considered when designing and building a proposed IT system for the Register, but no attempt was made to map IT functionality to the specific components of the ANAO recommendation. 
Arrangements for senior management and audit committee oversight of implementation for the ANAO recommendation were partly effective. Divisional responsibility for the Code within AGD was clearly established. The Executive Board and Senior Management Committee had visibility of the Code; however, this was focused on technological issues associated with the transfer of the Register rather than the implementation of the ANAO recommendation. Progress against the recommendation was reported to the Audit and Risk Management Committee (ARMC), but the commencement of this process was delayed. 
Communications to raise awareness 
AGD did not develop a communications or stakeholder engagement strategy for the Code. 
AGD’s effectiveness in communicating regulatory requirements to lobbyists cannot be assessed in the absence of a communications strategy. Communication primarily occurred through a dedicated website and through correspondence with registered lobbyist organisations, with limited public information and stakeholder engagement. Communications focused on administrative responsibilities rather than broader compliance obligations, with no communication activities targeted at unregistered lobbyists. 
Communications to Government representatives to raise their awareness of the Code and regulatory obligations were partly effective. AGD used the lobbying website to provide some information to Government representatives about compliance obligations, but did not undertake any broader communications activities with Government representatives, including with the Australian Government entities that employ them or the entities that have a responsibility to provide guidance to the Australian public sector. 
Assessment and management of compliance risks 
AGD did not systematically consider or manage risks that impact the ability or willingness of regulated entities and individuals to comply with the Code. Risks in relation to AGD’s ability to administer the Code were assessed at a basic level and only after actual risks associated with data accuracy were realised. There was no strategy to ensure that administrative risks, or risks to compliance with the Code, are effectively managed. In practice, activities and procedures such as email communications with lobbyist organisations, compliance dashboards and draft standard operating procedures aimed to manage some administrative risks. 
AGD did not advise Government about the sufficiency of the compliance framework in meeting the Code’s objectives. 
Performance measurement and evaluation 
AGD did not develop an evaluation framework for assessing the regime’s success in meeting objectives and did not develop performance measures. 
AGD did not develop a monitoring program for the Code and did not establish any performance measures for AGD processes in administering the Code. A service standard was developed for timely updates to the Register, but performance against this standard is not yet measured or assessed. 
No performance information was provided to the Parliament or the public about work undertaken in relation to the Code, and whether intended regulatory objectives are being achieved.

ACMA Fake News Position Paper

Catching up with the ACCC, the Australian Communications and Media Authority (ACMA) position paper Misinformation and news quality on digital platforms, released today, outlines the ACMA's expectations for a voluntary code or codes to be developed by digital platforms regarding fake news.

Work by the ACCC was noted for example here, here, here, here and here.

Three key objectives in the ACMA position paper are
  •  reduce the impact of harmful misinformation 
  • empower people to better judge the quality of news and information 
  • enhance the transparency and accountability of platforms’ practices.
ACMA states
 According to the University of Canberra’s Digital News Report: Australia 2020, 48 per cent of Australians rely on online news or social media as their main source of news. But 64 per cent of Australians are concerned about what is real or fake on the internet. 
“That should rightly be of immense community concern. False and misleading news and information online has the potential to cause serious harm to individuals, communities and society,” ACMA Chair Nerida O’Loughlin said. 
“In developing this new code, digital platforms will need to balance the need to limit the spread and impact of harmful material on the internet while protecting Australians’ important rights to freedom of speech. 
“Digital platforms should not be the arbiters of truth for online information. But they do have a responsibility to tackle misinformation disseminated on their platforms and to assist people to make sound decisions about the credibility of news and information. 
“We know that major platforms have stepped up their processes during the COVID-19 pandemic due to the prevalence of information potentially harmful to health and property. 
“It’s now time for digital platforms to codify and commit to permanent actions that are systematic, transparent, certain and accountable for their users in addressing such potentially harmful material.”
ACMA is to oversee the platforms’ code development process and report to Government by June 2021. ACMA anticipates the digital platforms will work together, including undertaking public consultation, to develop and have in place a single, industry-wide code by December 2020.

The position paper states
Australians rely upon a range of indicators to assess the quality of their news and information, including the source or outlet of a news piece. On digital platforms, the widespread use of algorithms, the proliferation of sources and the dissociation of content from its source can make it challenging to assess quality and make informed decisions about which news and information to read and trust. Difficulty in discerning the quality of news and information can lead to the increased spread of harmful misinformation. This includes disinformation — false and misleading information distributed by malicious actors with the intent to cause harm to individual users and the broader community. 
International regulatory approaches to date have largely focused on countering deliberate disinformation campaigns. Disinformation campaigns can engage ordinary users to inadvertently propagate misleading information. However, misleading information shared without intent to cause harm can still lead to significant harm. From the consumer perspective, all forms of false, misleading or deceptive information can have potentially harmful effects on users and the broader community. 
This paper uses ‘misinformation’ as an umbrella term to cover all kinds of potentially harmful false, misleading or deceptive information, with deliberate disinformation campaigns considered a subset of misinformation. 
The government has been considering responses appropriate for Australian users 
These concepts were canvassed as part of the Australian Competition and Consumer Commission (ACCC) Digital Platforms Inquiry (DPI). The ACCC recommended a mandatory code to address complaints about disinformation (Recommendation 15) and an oversight role for a regulator to monitor issues of misinformation and the quality of news and information (Recommendation 14). 
In response to that inquiry, the government has asked major digital platforms to develop a voluntary code to cover both recommendations. The government’s response recognises that addressing the complex problem of misinformation requires a comprehensive and principled approach. Any such approach should balance interventions with the rights to freedom of speech and expression. 
Australians are increasingly reliant on digital platforms to access, consume and share news and information. The ACMA considers that platforms bear considerable responsibility to provide users with a safe and user-friendly environment to engage with news and information and help users more easily discern the quality of this content. 
The 2019–20 Australian bushfire season and the COVID-19 pandemic have reinforced the potential harms of false and misleading information 
The first half of 2020 has been marked for many Australians by two extraordinary events: the unprecedented summer bushfire season and the COVID-19 pandemic. Both events have provided fertile circumstances for the spread of false and misleading information, distributed with and without malicious intent. 
The bushfires saw instances of false and misleading information about the cause of the fires, the use of old images purporting to be of current events and conspiracy theories such as the fires having been purposely lit to make way for a Sydney to Melbourne train line. False and misleading information about the pandemic—such as how to prevent exposure, possible treatments, and the origins of the virus—have been shown to have real-world consequences, including personal illness and damage to property. 
Recent Australian research found that nearly two-thirds (66 per cent) of people say they have encountered misinformation about COVID-19 on social media. The World Health Organisation has labelled the crisis an ‘infodemic’ and platforms have implemented new measures to limit the spread of misinformation. 
Both these events have highlighted the impact and potential harm of misinformation on both Australian users of digital platforms and the broader Australian community. 
Voluntary codes should build on existing measures as part of a risk-based approach to harmful misinformation 
In recent years, most major platforms have implemented a range of measures and processes to address potentially harmful misinformation and news quality issues. This work has intensified during the COVID-19 pandemic, with platforms taking further steps to address potential harms, including:
  • Greater signalling of credible, relevant and authentic information through new features and tools. 
  • Increased detection and monitoring of fake accounts, bots and trolls who engage in malicious and inauthentic activity with vulnerable users. 
  • Updating terms of service and community guidelines to allow for action to be taken against false and misleading news and information in relation to health and safety issues where the scale and immediacy of potential harm is paramount. 
In developing a voluntary code, the ACMA considers that platforms should codify their activities and commit to permanent actions that are systematic, transparent, certain and accountable for their users in addressing such potentially harmful misinformation. 
A voluntary code needs to be fit for purpose for Australian users and the Australian community. Given the recent evidence of significant harm caused by false and misleading information shared online, and the practical difficulty of determining which information has been circulated with intent to harm, the ACMA considers platforms should implement measures to address all kinds of harmful misinformation circulating on their services. These measures should be graduated and proportionate to the risk of harm.   
Adopting a graduated and flexible approach means platforms would also be free to draw the lines between different interventions in accordance with their own policies and to achieve an appropriate balance with rights to freedom of speech and expression. 
The ACMA has outlined its expectations to guide code development 
This paper includes a series of positions that outline the ACMA’s expectations on the development of the code. These positions cover threshold issues about the scope, design, and administration of the code, and are intended to assist platforms in the development of their code(s). These positions have been informed by existing international regulatory approaches, preliminary discussions with platforms and an examination of best-practice guidelines. 
The ACMA considers that the code should cover misinformation across all types of news and information (including advertising and sponsored content) that:
  •  is of a public or semi-public nature 
  • is shared or distributed via a digital platform 
  • has the potential to cause harm to an individual, social group or the broader community.
To enable a consistent experience for Australians who use multiple platforms, the ACMA considers a single industry code would be the preferable approach. Any code should be consumer-centric, including providing a mechanism for users to easily access dispute resolution mechanisms. 
As a voluntary code, it will be a matter for individual platforms to decide on whether they participate in the development of the code or choose to be bound by the code. The ACMA would, however, strongly encourage all digital platforms with a presence in Australia, regardless of their size, to sign up to an industry-wide code to demonstrate their commitment to addressing misinformation. 
At a minimum, the code should apply to the full range of digital platforms that were outlined in the DPI terms of reference. This includes online search engines, social media platforms and other digital content aggregation services with at least one million monthly active users in Australia. 
The ACMA considers that this will likely include widely used platforms such as Facebook, YouTube, Twitter, Google Search and Google News, Instagram, TikTok, LinkedIn, Apple News and Snapchat. The ACMA anticipates that code signatories will change over time to adjust to new entrants and other market changes. 
The ACMA has developed a code model, using an outcomes-based approach, to assist platforms in composing their codes 
In developing a code, the ACMA considers that platforms should adopt an outcomes-based approach. This would provide signatories with a common set of aims while granting the flexibility to implement measures that are most suited to their business models and technologies. The ACMA has developed the code model below which articulates potential objectives and outcomes for the code.

25 June 2020

Environment

This week's damning ANAO report Referrals, Assessments and Approvals of Controlled Actions under the Environment Protection and Biodiversity Conservation Act 1999 comments
Despite being subject to multiple reviews, audits and parliamentary inquiries since the commencement of the Act, the Department of Agriculture, Water and the Environment’s administration of referrals, assessments and approvals of controlled actions under the EPBC Act is not effective. 
Governance arrangements to support the administration of referrals, assessments and approvals of controlled actions are not sound. The department has not established a risk-based approach to its regulation, implemented effective oversight arrangements, or established appropriate performance measures. 
Referrals and assessments are not administered effectively or efficiently. Regulation is not supported by appropriate systems and processes, including an appropriate quality assurance framework. The department has not implemented arrangements to measure or improve its efficiency. 
The department is unable to demonstrate that conditions of approval are appropriate. The implementation of conditions is not assessed with rigour. The absence of effective monitoring, reporting and evaluation arrangements limits the department’s ability to measure its contribution to the objectives of the EPBC Act.
It gets worse
Arrangements for collecting and managing information on compliance with the EPBC Act are not appropriate. The department does not have an appropriate strategy to manage its compliance intelligence, limiting its access to the regulatory information necessary for complete and accurate compliance risk assessments. Key limitations include poor linkages between sources of regulatory information and a lack of formal relationships to receive external information. 
The regulatory approach to referrals, assessments and approvals has not been informed by an assessment of compliance risk. Strategic compliance risk assessments do not inform regulatory plans. In one instance, the department’s activities to promote voluntary compliance were aligned with an identified risk of inadvertent non-compliance in the New South Wales agriculture sector. The approach to individual referrals, assessments and approvals is not tailored to compliance risk. 
While the department has established sound oversight structures, they have not been effectively implemented. Procedures for oversight of referrals, assessments and approvals by governance committees are not consistently implemented. Conflicts of interest are not managed. 
The department has not established appropriate performance measures relating to the effectiveness or efficiency of its administration of referrals, assessments and approvals. All relevant performance measures in the department’s corporate plan were removed in 2019–20, and no internal performance measures relating to effectiveness or efficiency have been established. The department’s reporting under the regulator performance framework in 2017–18 was largely reliable. 
Referrals and assessments 
Systems and processes for referrals and assessments do not fully support the achievement of requirements under the EPBC Act. Procedural guidance does not fully represent the requirements of the EPBC Act and lacks appropriate arrangements for review and update. Information systems do not meet business needs and contain inaccurate data. Staff training is not supported by arrangements to ensure completion of mandatory requirements. There is no framework to prioritise work. 
Referrals and assessments are not undertaken in full accordance with procedural guidance. Decisions have been overturned in court due to non-compliance with the EPBC Act and key documentation for decisions is not consistently stored on file. There is no quality assurance framework to assure the department that procedural guidance is implemented. 
Proxy efficiency indicators developed by the ANAO indicate the efficiency of referrals and assessments has not improved over recent years. The department has no arrangements to measure its efficiency and the implementation of proposed efficiency improvement measures has not been appropriately tracked. Most referral, assessment method and approval decisions are not made within statutory timeframes. 
Conditions of approval 
Departmental documentation does not demonstrate that conditions of approval are aligned with risk to the environment. Of the approvals examined, 79 per cent contained conditions that were non-compliant with procedural guidance or contained clerical or administrative errors, reducing the department’s ability to monitor the condition or achieve the intended environmental outcome. 
The department has not established appropriate arrangements to monitor the implementation of pre-commencement conditions of approval. The department’s systems for monitoring commencement of actions are inaccurate. The absence of procedural guidance for reviewing documents submitted as part of pre-commencement conditions leaves the department poorly positioned to prevent adverse environmental outcomes. 
Appropriate monitoring, evaluation and reporting arrangements have not been established. Performance measurement and evaluation activities do not assess the contribution of referrals, assessments and approvals to the objectives of the EPBC Act.

Fake Meat Patent Study

The IP Australia Meat expectations - an analytics report on substitute meats study analyses trends in meat substitute technologies, an emerging market as conscious consumerism and sustainability awareness grows. The report shows imitation meat technology is an area of growth and investment across the world, with a strong increase in patent filings since 2013. China dominates this sector, with the United States, Europe and Japan other major players. The report was prepared for the Department of Industry, Science, Energy and Resources to help support Australia’s meat and food industry.

The document states
This report analyses trends in meat substitute technologies, an emerging market as conscious consumerism and sustainability awareness grows. It considers two technologies; imitation meat and lab-grown meat. Recent patent data shows imitation meat technology is an area of growth and investment across the world. Between 2000 and 2012, patenting of imitation meat technologies maintained a low and relatively steady rate, with a significant increase since 2013. Large food companies are among the top applicants. 
China dominates this sector as both the largest source of innovation and the largest patent filing destination in imitation meat. It is also responsible for much of the recent growth in patenting in this technology with Guizhou Bezon Food Industry filing the largest number of patents in this sector. The United States, Europe and Japan are the other major players in this sector, with Australia ranking equal fifth, alongside Canada, as a patent filing destination. 
Lab-grown meat is a technology in its infancy, with only 10 patent families filed globally. This report outlines findings from an investigation of patent families filed since 2000, analysing trends, markets and commercial players in meat substitute technologies. .... 
  • 258 patent families have been filed in imitation meat. 
  • Patent family filings for imitation meat increased from 2013 onwards. 
  • 85 per cent of imitation meat patent families are in an active state (in force or seeking patent protection). 
  • Guizhou Bezon Food Industry Company is the top global patent filer, with 18 patent families. 
  • China is the largest filing destination in the world for imitation meat patent families. 
  • Australia is equal fifth filing destination for imitation meat. 
  • This report did not identify any patents filed by Australian applicants.
The study comments
As the basis for this study, worldwide patent databases were searched for all products and processes for producing foodstuffs that were intended to imitate animal meat in appearance, texture and/or flavour. The search returned a total of 258 relevant INPADOC patent families (Appendix A) filed from 2000 onwards. A majority of the patents were directed to imitation meat products derived from plants, such as soy, but some were also directed to technologies that used non-meat animal-derived products, such as milk and eggs. These were not specifically excluded from the search. The search strategy used a combination of keywords, International Patent Classification (IPC) symbols and Cooperative Patent Classification (CPC) symbols (Appendix B: Search Strategy). 
Analysing patent family filing across time can indicate growth or declines in innovation or interest in a technology. Figure 1 shows the number of patent families filed each year, both in total and by their legal status. The number of patents that are in an active state (i.e. that are either in force or for which protection is being sought, and are not lapsed, expired or withdrawn) provides an indication of whether applicants are continuing to protect their inventions. It is clear from Figure 1 that imitation meat is an area of growing interest. Patenting activity was low but regular up until 2012, after which the number of applications increased significantly. In 2016, 55 new families were filed. The data is not complete from 2017 onwards due to a lag in patent publication; the dip in 2017–18 reflects incomplete data rather than a trend decline. Much of the recent activity has been due to a few Chinese companies, which have submitted many applications in a short space of time. These companies will be discussed further below in the Markets and Top Applicants sections. ...
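The timeline analysis described above amounts to counting patent families per filing year, split by legal status. A minimal sketch of that breakdown, using a few hypothetical family records rather than the report's actual dataset:

```python
from collections import Counter

# Hypothetical patent family records: (filing_year, legal_status).
# Following the report's grouping, a family is "active" if it is either
# in force or protection is still being sought.
families = [
    (2010, "in force"), (2013, "pending"), (2016, "lapsed"),
    (2016, "in force"), (2016, "pending"), (2018, "withdrawn"),
]

ACTIVE = {"in force", "pending"}

# Families filed per year, in total and counting active families only
# (the two series plotted in the report's Figure 1).
per_year = Counter(year for year, _ in families)
active_per_year = Counter(year for year, status in families if status in ACTIVE)

for year in sorted(per_year):
    print(year, per_year[year], active_per_year[year])

# Overall share of families still active; the report puts this at 85 per cent
# for the real imitation meat dataset.
active_share = sum(active_per_year.values()) / len(families)
```

The Counter-based tally is only illustrative; the report's figures come from INPADOC family data, not from records like these.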
In relation to Lab Grown Meat the report states
The search strategy used a combination of keywords, International Patent Classification (IPC) symbols and Cooperative Patent Classification (CPC) symbols (Appendix B: Search Strategy). 
Timeline 
We identified a total of 10 patent families filed for lab-grown meat, five in the early 2000s and five between 2011 and 2017 (see Figure 5). The data is incomplete from 2017 onwards due to a lag in patent publications. The search for lab-grown meat focuses on technologies for cell culture or tissue engineering of animal cells to create tissue similar to animal-derived meat. Culturing techniques for human cells, and culturing techniques directed at modifying the growth of cells in living animals, were not included in the analysis. With these restrictions in place, there is only a small number of patent families filed since 2000 that are directed to production of lab-grown meat for human consumption.

24 June 2020

Responsibility

As I head into testimony to a Victorian parliamentary committee regarding social platforms this morning I note 'Corporate Fundamental Responsibility: What Do Technology Companies Owe the World?' by Haochen Sun in (2020) 74 University of Miami Law Review 898.

Sun comments
 In this digital age, technology companies reign supreme. However, the power gained by these companies far exceeds the responsibilities they have assumed. The ongoing privacy protection and fake news scandals swirling around Facebook clearly demonstrate this shocking asymmetry of power and responsibility. 
Legal reforms taking place in the United States in the past twenty years or so have failed to correct this asymmetry. Indeed, the U.S. Congress has enacted major statutes minimizing the legal liabilities of technology companies with respect to online infringing acts, privacy protection, and payment of taxes. While these statutes have promoted innovation, they have also had the unintended effect of breeding irresponsibility among technology companies. 
Against this backdrop, this Article offers a new lens through which we can deal with the ethical crisis surrounding technology companies. It puts forward the concept of corporate fundamental responsibility as the ethical and legal foundation for imposing three distinct responsibilities upon technology companies: to reciprocate users’ contributions, play their role positively, and confront injustices created by technological development. The Article further considers how these responsibilities could be applied to improve protection of private data and to encourage responsible exercise of intellectual property rights by technology companies. 
The tripartite conception of corporate fundamental responsibility, this Article shows, is built upon the ethical theories of reciprocity, role responsibility, and social justice. Therefore, corporate fundamental responsibility paves the way for technology law to embrace ethics whole-heartedly, creating new legal and ethical guidance for the benevolent behavior of technology companies. In developing technologies, collecting data, and regulating speech, technology company leaders must act responsibly for the future of humanity.

22 June 2020

ACT Government IT Potemkin village

An IT Security Potemkin village? The damning ACT Auditor‐General's report on Data Security in the ACT public sector states that ‘ACT Government agencies have not clearly understood the risks and requirements of securing sensitive data, and are not well placed to respond to a data breach or loss of critical business systems’.

Overall the 'comprehensive ICT Security Policy, which all agencies must comply with under the ACT Protective Security Policy Framework' has been a fizzer: agencies do not need to demonstrate their compliance and compliance with ICT Security Policy requirements is often lacking. The audit found:
  • 89% of critical ICT systems did not have a current system security risk management plan that demonstrated and documented data security risks and controls. 
  • there are significant delays in completing security plans. On average it took Shared Services over three months to commence a critical ICT system security assessment and it would then take Shared Services and ACT Government agencies on average almost eight months to complete a critical ICT system security risk management plan. 
  • agencies have not notified Shared Services of the security classification of 65% of ACT Government agency ICT systems. This makes it difficult to prioritise security protection activities. 
  • it is not known for most critical ICT systems if there is a recovery plan in place. 
  • there is widespread use of high‐risk cloud services by agency users. This can expose sensitive or personal data to unauthorised external parties, often with little recourse available.
  • there is a low level of data security awareness among staff in most agencies examined in the audit. This increases the likelihood of a data breach and its potential impact. 
The full report should be chastening for ACT ministers. A summary states
Providing secure means of handling data, both in transit and at rest, is a necessary requirement for providing online services to the community. Government agencies are held to a high standard of accountability for securing sensitive data on behalf of the community. Within the Territory, there is a data security accountability framework set in place by legislation, policies and oversight functions to monitor compliance. ACT Government agencies need to securely manage the receipt, storage, transmission and destruction of data within this framework. This audit has sought to examine whether this accountability framework is designed to provide security to agencies when managing data. Agency efforts to comply with this framework have then been examined to determine if data security risks are being managed in a way that is consistent with mandatory requirements and better practice.
The Auditor-General concludes
DATA SECURITY GOVERNANCE AND STRATEGY 
The ACT Protective Security Policy Framework and ICT Security Policy define the minimum standards with which ACT Government agencies must comply to achieve confidentiality and availability of their data and systems. Under its CYBERSEC obligations, the Framework requires agencies to comply with the ICT Security Policy. The ICT Security Policy and its related subordinate policies give agencies mandatory requirements and guidance for most aspects of the management and operation of their ICT business systems recommended by better practice. While some of these subordinate policies need to be reviewed and additional guidance should be given for agencies to manage ICT service vendors, the ICT Security Policy provides clear guidance for agencies to manage data security. 
The mandatory status of the ICT Security Policy is not supported by effective agency monitoring arrangements. The ACT Protective Security Policy Framework has annual compliance reporting from agencies on their efforts to manage protective security to the Security and Emergency Management Senior Officials Group. But its reportable CYBERSEC compliance requirements do not provide reasonable assurance that agencies have effectively protected the data for which they are responsible. These obligations focus on the role of Shared Services to document and implement the controls contained in the ICT Security Policy, and for agencies to consult Shared Services when implementing and maintaining their ICT business systems. These obligations do not recognise the scope of agency responsibility for the security of the systems they are responsible for. These reporting arrangements are also not used to inform a whole of government data security risk assessment to determine if agencies are exposed to unacceptable data security risks. 
While there are governance committees with responsibility for oversighting and improving ACT Government agencies’ data security, they are not effectively focussed towards a common strategy that sets the priorities, resourcing and responsibilities for securing data across government. This reduces the effectiveness of these bodies to communicate to agency executives what the expectations across government are for data security, and which risks and systems should be prioritised across government to reduce the likelihood and impact of a serious data breach. 
DATA SECURITY MANAGEMENT 
ACT Government agencies have not implemented effective governance and administrative arrangements to comply with the ICT Security Policy and the ACT Protective Security Policy Framework. By not complying with ICT Security Policy requirements, the ACT Public Service is not well placed to understand what data agencies are responsible for, the risks of this data being breached, and controls to be implemented across government to manage this risk. 
Shared Services has effective tools and processes to help agencies manage data security risks by using system risk management plans and security assessments. However, as agencies have not effectively managed the security status of their systems, and Shared Services is experiencing a significant backlog of security assessments, Shared Services and agencies are not presently well placed to address gaps in data security risk management in a timely manner. 
Agencies have not clearly understood their data security risks and requirements. While one agency reviewed in this audit had documented its system security risks for one system, most agencies have not done this effectively. Agencies have not controlled the usage of cloud‐based ICT services, or determined how business needs can be met through the use of sanctioned ICT services. A particular area of risk noted is a lack of user education on how to use data securely. A lack of awareness has been demonstrated in a lack of understanding on how to share data securely, as well as to recognise when a data breach has occurred and needs to be reported. This increases the likelihood of a data breach and its potential impact. More education is needed that is targeted at the needs of agencies, and specific groups of users such as privileged and senior executive users. 
There is no whole‐of‐government data breach response plan to manage and coordinate resources and stakeholders in the event of a major data breach. The Security and Emergency Management Senior Officials Group agreed to implement improvements to government’s capability to respond to these events, but these have not yet been completed. Furthermore, individual agencies are not well placed to respond to a data breach or loss of system availability, and need to invest more effort in documenting and testing how to restore functionality of critical business systems. 
However, there are initiatives underway to manage the risk of legacy systems which is another area of risk for agency data security. More work is needed to realise the benefits of these initiatives, including: decommissioning old systems when new ones are implemented; upgrading systems to use supported technology; and securing ones that cannot be upgraded through protective controls that shield these systems from data security attacks.
Key findings are
DATA SECURITY GOVERNANCE AND STRATEGY 
The ACT Protective Security Policy Framework (December 2019) and ACT Protective Security Policy Framework Operational Procedures Manual (July 2017) and supporting policies such as the ICT Security Policy (August 2019) provide a framework for data security for ACT Government agencies. Annual directorate and agency compliance reporting, and the resulting reporting to the Security and Emergency Management Senior Officials Group, seeks to provide the leadership of the ACT Public Service with reasonable assurance that data security risks are being effectively managed. However, the suite of policy and its associated reporting does not provide: 
  • a clear picture of the status of ICT system security across government, including common data security risks, possible treatments for as many of these risks as possible within a given resource allocation, and prioritisation of where treatment efforts should be directed based on the impact of a data breach or loss; 
  • expected minimum standards for the management of ACT Government agency ICT systems such as for information security documentation and monitoring, vulnerability management, access control, administrator rights, secure data transfers and system recovery ‐ particularly where directorates and agencies do not use Shared Services to manage system security; 
  • a shared understanding of the risk tolerance for data security risks across government and how this will be translated into acceptable risk management approaches for individual systems; 
  • causes of common data security risks, issues and breaches; and 
  • current data security management capabilities, along with activities and projects underway to extend this capability. 
GOVSEC 4 of the ACT Protective Security Policy Framework (December 2019) includes annual compliance reporting requirements for all directorates. Through this process, directorates provide assurance on aspects of their compliance with data security and other protective security requirements. The GOVSEC 4 compliance and annual reporting arrangements do not provide reasonable assurance that whole of government data security risks are being effectively managed. Agency compliance with CYBERSEC requirements and their reported efforts to address data security risks are not captured in a whole of government data security risk assessment. 
The ACT Protective Security Policy Framework (December 2019) requires directorates to follow the ICT Security Policy (August 2019), which is developed and maintained by Shared Services. The ICT Security Policy is a comprehensive policy that provides instructions for complying with most whole of government security requirements. It outlines responsibilities for data security and includes references to relevant legislation and better practice. A review of the ICT Security Policy against the requirements of the NIST Cybersecurity Framework shows that guidance is provided on most areas, but there is a gap in the guidance with respect to the management and monitoring of ICT service vendors. A small number of subordinate policy documents to the ICT Security Policy are either no longer in existence or have not been recently reviewed. 
The ACT Protective Security Policy Framework Operational Guidelines (July 2017),   which support the ACT Protective Security Policy Framework (December 2019), specifically require agencies to comply with the ICT Security Policy (August 2019). However, the annual compliance reporting obligation of directorates under GOVSEC 4 only requires them to report against the mandatory requirements of the Framework, including CYBERSEC 2 which requires that they consult with Shared Services when implementing or improving their ICT systems. There is no information or assurance in the annual directorate reporting under GOVSEC 4 as to whether and how directorates have complied with the ICT Security Policy. A requirement to consult Shared Services is not effective in providing an acceptable level of data security and the annual compliance reporting process does not provide reasonable assurance that data security risks are being effectively managed. 
There are several separate and distinct governance bodies that have a role in  influencing and determining how data security is managed by ACT Government agencies. These bodies include the Strategic Board, the Data Steering Committee, the Digital Services Governance Committee (including its Strategic IT Digital Capability Sub‐Committee) and the Security and Emergency Management Senior Officials Group. These bodies have broad and senior representation across ACT Government agencies, and are actively seeking to improve data security across government through their oversight of a series of initiatives and activities. There are a series of strategies and plans relating to data security that have been documented or are being developed across ACT Government agencies. These include Shared Services‐specific documents and whole‐of‐government documents. While the various governance bodies that have responsibility for managing and improving ACT Government data security have identified activities and improvements to implement, there is a risk that these are not connected and coordinated in an efficient manner that is driven by an overarching strategy. None of these documents presently fulfil the role of an overarching strategy or plan for ACT Government agencies to manage and improve data security. None of the strategies and plans that have been developed to date have:
  • recognised the role of the various governance bodies and stakeholders who have a responsibility for managing and improving ACT Government data security; 
  • identified interactions with legislative compliance obligations such as the Information Privacy Act 2014; 
  • a single identified executive responsible for leading, monitoring and reporting on the implementation of the strategy. This role could be fulfilled by the Chief Digital Officer, who is currently responsible for leading improvements to IT investment to address data security and for public relations when significant data breaches occur in ACT Government; 
  • coordinated governance efforts across government to ensure a shared vision for improving data security. This may identify relevant cross‐jurisdictional coordination needs, such as considering the future implementation of the Australian Government’s Cyber Security Strategy 2020; 
  • recognised the current state of data security for ACT Government; 
  • identified a desired state for data security based on a clearly stated risk appetite; and 
  • recognised the resources and activities required to manage and improve data security, and been approved by the Strategic Board and Cabinet. 
DATA SECURITY MANAGEMENT 
The ICT Security Policy (August 2019) requires agencies to register their ICT systems, including cloud services, with Shared Services. The policy also requires Shared Services to maintain an inventory of the systems, including a range of information that is useful for identifying the systems’ risks. Over time Shared Services has attempted to maintain such an inventory but this has been unsuccessful. Accordingly, there is no complete and current inventory of ICT systems in use across ACT Government agencies. New functionality is being implemented into Shared Services’ ServiceNow system, which is expected to automatically discover ICT systems and assets across the ACT Government ICT network. Until this is successfully implemented and producing the expected results, there will not be a collective and comprehensive understanding of ICT systems across ACT Government and therefore accountabilities for data assets. 
The use of unauthorised cloud‐based ICT services and systems presents a risk to ACT Government agencies’ data security. Typically, these cloud‐based services are identified and downloaded by ACT Government agencies’ employees. Many of these services relate to image and document conversion software. The use of these services presents a risk of exposing sensitive data to cloud‐based service providers with unknown data security protections, as well as licencing and legislative compliance risks. To help deal with these issues, Shared Services has implemented a new specialised software package that seeks to identify and analyse the use of cloud‐based services across ACT Government agencies. Through this initiative, reports have been prepared and presented to directorates by Shared Services in January 2020, which show that there is use of cloud‐based software and systems by users of the ACT Government ICT network. 
System security risk management plans are a mandatory requirement of the ICT Security Policy (August 2019) and are an effective control for demonstrating and documenting the data security risks and controls for ACT Government agencies’ ICT systems. There is widespread non‐compliance across the ACT Public Service with the requirement to have system security risk management plans and poor demonstration of the effective and efficient management of data security using these plans. The ACT Audit Office’s 2012 Whole‐of‐Government Information and Communication Technology Security Management and Services report recommended a mandatory requirement that directorates and agencies develop system security plans, and threat and risk assessments for all new ICT systems and legacy ICT systems using a risk analysis. In December 2019, 89% of critical ICT systems did not have a current, approved system security risk management plan. 
The assessment of a system’s security risk management plan can be conducted by the Shared Services ICT Security team or by an external provider at the directorate’s cost. As at December 2019 there was a significant backlog of requests for reviews of system security risk management plans with the Shared Services ICT Security team. 
It takes on average over three months to allocate a security resource to undertake an assessment of a critical ICT system and four months to allocate a security resource to undertake an assessment of a non‐critical ICT system. After this point, Shared Services and system owners work together to review these plans. On average it takes almost eight months to review and approve critical ICT system security risk management plans and over five months to review and approve less complex non‐critical ICT system security risk management plans. These delays compromise the effective and efficient management of data security risks by ACT Government agencies. As part of efforts to address the issues with the timeliness and currency of system security risk management plans, Shared Services has developed a quarterly security report to directorates to highlight the status of these plans. Automated alerts are also being investigated to remind agency system owners when plans are due for review. 
The management of system security risk management plans at a system‐by‐system  level means that the management of data security is siloed across ACT Government agencies and systems and common risks are not managed in a similar way across systems. Capturing common risks and treatments from these plans across government agencies and systems is necessary to provide ACT Public Service leadership with a clear understanding of whole‐of‐government data security risk management, and to prioritise which risks and systems should receive highest attention with limited resources. 
The use of accredited cloud service providers for software implementation and maintenance reduces some data security risks, but gives rise to other risks. The use of these services requires sound contract management arrangements that allow for assurance to be obtained from vendors on the management of these risks. For two of the agencies’ systems considered as part of the audit, there were inadequate processes in place to identify and manage the data security risks; one system owner had access to certifications and reviews undertaken by the cloud service vendor to demonstrate their ongoing management of data security for the system, but did not avail themselves of this information, and the system owner for another system had not adequately monitored the vendor’s security practices. 
Shared Services has well established processes and systems for managing user identities and access to ICT systems. Two directorate systems examined in this audit also had adequate processes for managing this, but one system had not demonstrated appropriate management of security for its privileged or regular users. This system had users who have moved to other parts of the agency or the ACT Public Service and no longer required access. The fourth system examined was in the process of reviewing its user role group structure, which was highly complex and difficult to monitor. 
The Community Services Directorate has established clear procedures relating to the types of information that could be shared and with whom. Staff within the directorate also demonstrated a good understanding of what data was considered sensitive personal information and the legislative basis for classifying it as such. 
Users in other audited agencies did not demonstrate an awareness of the risks associated with sensitive personal information, and of sharing this data via email or USB drives and were also unaware of the acceptable file sharing mechanisms that are available to them to securely share data with third parties. This lack of understanding and awareness across ACT Government agency users presents a risk to the security of data. 
The ACT Protective Security Policy Framework (December 2019) and the ICT Security Policy (August 2019) require directorates to have policies and procedures in place to inform, train and counsel employees on their data security responsibilities. In the four entities examined during the audit, data security user awareness was hampered by a lack of knowledge and training to support understanding of data security and the handling of data security breaches. None of the four entities considered as part of the audit had developed a comprehensive data security awareness training package for its staff. However, some had developed discrete training packages that targeted elements of data security, such as the Community Services Directorate and the Justice and Community Safety Directorate working together to develop e‐learning training for cyber security awareness, and ACT Corrective Services which provides security awareness training for new corrections staff. Neither Shared Services, the Territory Records Office, the Security and Emergency Management Branch nor the Office of the Chief Digital Officer provides reusable training packages to agencies with respect to data security or breach management. The delivery of data security training and awareness activities, targeted to meet the needs of all users including privileged users and executives, would support agencies to meet their training obligations under the ICT Security Policy (August 2019). Such training could be tailored to address agency‐specific threats, as well as reference any agency‐specific policies and procedures. 
INFOSEC 2 of the ACT Protective Security Policy Framework (December 2019) requires directorates and agencies to classify, mark, transfer, handle and store information relative to its value, importance and sensitivity. As part of managing the inventory of ICT systems under the ICT Security Policy (August 2019), directorates must advise Shared Services of the information classification of their ICT systems. 
A review of the information classification of ACT Government systems shows that for 65% of ACT Government systems Shared Services has not been notified of the system’s information classification. This hampers the ability of Shared Services to prioritise security protection activities and insufficient protection strategies may be applied to these systems. 
The need to manage and support legacy systems has led to the ACT Government incurring significant extra cost and increased data security risks from the delayed full implementation of Windows 10. Approximately 29% of existing ACT Government agency desktops have not been upgraded to Windows 10, due to the number of legacy systems that will not work in the new operating system. Maintaining extended support for Windows 7 is expected to cost the ACT Government $450,000 per annum until this operating system is decommissioned. Until this point, the ACT Government will not fully realise the improved data security benefits of the more modern Windows 10 operating system. Some improvements to the management of legacy systems have been made in recent times, including packaging legacy applications to work with Windows 10, using a secure environment to run unsupported applications, and implementing a library of application programming interfaces which could introduce a secure intermediary to operate between less secure legacy systems and the internet.
Patching applications and patching operating systems to address vulnerabilities are two of the ‘Essential Eight’ strategies to mitigate data security breaches. Shared Services has developed effective processes for implementing patches to operating systems and applications. Three of the four systems examined as part of the audit were having patches implemented either by the vendor directly or by Shared Services. The fourth system was a legacy system that was no longer supported and due to be replaced and it was not having patches applied. In order to mitigate the risks to the system it was operating in a supported desktop and server environment with reduced functionality. Being able to operate in such a controlled environment is not always the case for legacy systems and, given the large number of legacy applications in the ACT Government ICT network, this is one of the most significant areas of data security risk. 
Directorates have not implemented effective audit logging policies that consider the data security risks faced by their ICT systems. For the four systems reviewed as part of the audit, agencies had implemented audit logging to the extent possible within each system, but had not determined how these logs would be used and had not determined whether other events or triggers were needed to periodically check logs. Shared Services has implemented effective audit logging practices via a security information and event monitoring system which receives logs from across the network, as well as for cloud‐based applications. It has an established and regular process for monitoring logs and events for the network and cloud application and has also reviewed and defined the events that are high risk to necessitate alerts or triggers for further investigation. 
Following a significant data breach of the ACT Government’s online directory in November 2018 the Security and Emergency Management Senior Officials Group reviewed roles and responsibilities for cyber security across the ACT Government network. To improve ACT Government responsiveness in the event of a significant data security breach, the Security and Emergency Management Senior Officials Group agreed to a series of actions in March 2019. The Security and Emergency Management Senior Officials Group intends that these actions will be completed by July 2020. 
In the event of damage to an ICT system or the loss of data, accurate system design documentation will assist in promptly rebuilding system functionality. In December 2019 the Digital Services Governance Committee was advised that 68 critical directorate ICT systems did not have system design documentation and the status and accuracy of system design documentation for the other 147 systems was unknown. Two of the four systems examined as part of the audit had outdated system design documentation. 
An effective data restoration plan (also commonly referred to as system design documentation, or schematics), when paired with an appropriate patching strategy, backup schedule and restoration from backup testing, is an important safeguard in providing assurance that data recovery from the loss of system availability is possible. A review of recovery plans across ACT Government agencies shows: 
  • five per cent of systems have a tested recovery plan in place; 
  • 35 per cent of systems have a recovery plan in place, which has not been tested; 
  • six per cent of systems do not have a recovery plan in place; and 
  • for 54 per cent of systems it is not known whether there is a recovery plan in place. 
None of the four systems reviewed as part of the audit had current recovery plans that had been tested through agency business continuity or lifecycle management activities.
Consequent recommendations are:
1 WHOLE‐OF‐GOVERNMENT DATA SECURITY RISK ASSESSMENT 
Shared Services (Chief Minister, Treasury and Economic Development Directorate) and the Security and Emergency Management Branch (Justice and Community Safety Directorate) should develop a whole‐of‐government data security risk assessment. The whole‐of‐government data security risk assessment should be reviewed and updated at scheduled intervals. 
2 ICT SECURITY POLICIES 
Shared Services (Chief Minister, Treasury and Economic Development Directorate) should: 
  a) revise and update the ICT Security Policy (August 2019) to accurately refer to supporting documents referred to in the policy. Where supporting documents and policies are out of date, they should be reviewed; and 
  b) develop policy guidance, in support of the ICT Security Policy, for ACT Government agencies on their responsibilities with respect to managing and monitoring ICT service vendors. 
3 CYBERSEC CONTROLS AND REPORTING 
The Security and Emergency Management Branch (Justice and Community Safety Directorate), Shared Services and the Office of the Chief Digital Officer (Chief Minister, Treasury and Economic Development Directorate), through the auspices of the Security and Emergency Management Senior Officials Group, should: 
  a) review and update the CYBERSEC requirements of the ACT Protective Security Policy Framework to reflect the most important system security measures from the ICT Security Policy (August 2019). These measures should be targeted at the areas of agency responsibility and able to be reported in dashboard form; and 
  b) require agencies to report on the implementation of these measures in their ICT systems as part of the GOVSEC 4 reporting process of the ACT Protective Security Policy Framework, in order to provide reasonable assurance that data security risks are being effectively managed. 
4 DATA SECURITY STRATEGY 
The Office of the Chief Digital Officer and Shared Services (Chief Minister, Treasury and Economic Development Directorate) and the Security and Emergency Management Branch (Justice and Community Safety Directorate), in partnership with ACT Government agencies, should document and agree a whole of government data security strategy and plan. This document should identify: 
  a) the role and responsibilities of governance bodies and agencies responsible for managing and improving data security across ACT Government; 
  b) any related whole‐of‐government plans for addressing specific data security issues, such as the planned Cyber Security Incident Emergency Sub‐plan to the ACT Emergency Plan; 
  c) activities and resources to improve data security for ACT Government; and 
  d) the Chief Digital Officer as the responsible senior executive for implementing the strategy to improve data security across ACT Government. 
5 SYSTEM SECURITY RISK MANAGEMENT PLAN ASSESSMENTS 
Shared Services (Chief Minister, Treasury and Economic Development Directorate) should: a) in conjunction with Recommendation 4, ensure agencies take account of the full cost of managing security across a system’s lifecycle as part of ICT projects, including undertaking security assessments; and b) address the backlog of security risk management plan assessments so that agencies can access security assessments and advice to help them manage data security risks in a timely manner. 
6 SYSTEM SECURITY RISK MANAGEMENT PLANS 
The Security and Emergency Management Branch (Justice and Community Safety Directorate) and Shared Services (Chief Minister, Treasury and Economic Development Directorate) should: a) in conjunction with Recommendation 3, require ACT Government agencies to report on the currency of their system security risk management plans using a common authoritative list of critical systems; and b) in conjunction with Recommendation 1, develop a process to capture common risks and treatments from ACT Government agencies’ system security risk management plans to inform the whole of government data security risk assessment. 
7 DATA SECURITY TRAINING 
Shared Services (Chief Minister, Treasury and Economic Development Directorate), with input from the Security and Emergency Management Branch (Justice and Community Safety Directorate) and the Office of the Chief Digital Officer (Chief Minister, Treasury and Economic Development Directorate), should coordinate the development of data security training that: a) considers the specific training needs for all users, privileged users and executives; and b) addresses the risk of using unsanctioned methods of sharing sensitive personal data. The data security training package should be capable of being delivered and customised by ACT Government agencies as necessary. 
8 DATA BREACH RESPONSE PLANS 
The Security and Emergency Management Branch (Justice and Community Safety Directorate), the Office of the Chief Digital Officer and Shared Services (Chief Minister, Treasury and Economic Development Directorate) should complete all agreed actions from the March 2019 Security and Emergency Management Senior Officials Group meeting to improve the data breach response processes.

Experts in the dock

'The New Psychology of Expert Witness Procedure' by Jason M Chin, Mehera San Roque and Rory McFadden in (2020) 42(1) Sydney Law Review 69 asks
Can procedural reforms effectively regulate expert witnesses? Expert procedures, like codes of conduct and court-appointed experts, remain controversial among academics and courts. Much of this discussion, however, has been divorced from the science of the reforms. In this article, the authors draw from emerging work in behavioural ethics and metascience that studies procedures analogous to those that are being used in courts. This work suggests that procedures can be effective, as they have been in science, if directed at key vulnerabilities in the research and reporting process. The authors’ analysis of the metascience and behavioural ethics literature also suggests several nuances in how expert evidence procedure ought to be designed and employed. For instance, codes of conduct require specific and direct wording that experts cannot interpret as ethically permissive. Further, drawing on a recent case study, courts have an important role to play in establishing a culture that takes codes as serious ethical responsibilities, and not simply as pro forma requirements. 
 The authors argue
In response to the threat of partisan expert witnesses, legal systems have developed a variety of procedural mechanisms (for example, expert codes of conduct, concurrent evidence, and court-appointed experts) to help manage experts and maintain public trust in the courts.[1] These procedures have inspired considerable academic and professional debate, and uneven adoption by courts.  However, this discussion has been almost entirely uninformed by empirical research.  In contrast with this experience in law, several sciences are enthusiastically enacting procedural reforms, which are being robustly tested and which rely on a large body of psychological research. This new area of metascientific and psychological research provides a novel perspective on procedural reform, suggesting such reform can meaningfully contribute to the regulation of expert witnesses. It also suggests how procedures ought to be designed and implemented. In this article, we explore that connection and, in doing so, the possibilities and limits of expert witness procedure. 
In law, procedural reform aimed at expert partisanship has been controversial, garnering professional and academic support, but also sceptical and critical commentary. In particular, the critics have pointed out that the focus on individual expert partisanship promotes a narrow understanding of current problems with expert evidence, and also that expert procedures were designed without the benefit of empirical testing and may have perverse effects. Moreover, in forensic science specifically, partisanship may be a less pressing concern than the fact that many practices have not been demonstrated to actually work.
We seek to develop this discussion by highlighting an emerging corner of metascientific research (that is, the scientific study of science itself) that examines analogous procedural reform in science. These new procedures — grounded in the psychological study of ethical behaviour — have responded to a growing concern from many fields that many published studies cannot be reproduced by independent researchers. Such reforms include procedural modifications to the way scientists typically see their findings reviewed by others and published. Importantly, these reforms have received empirical testing demonstrating they often work and have been endorsed by respected scientific bodies, which may increase their ethical and psychological force. As we will discuss below, these insights from metascience help provide a roadmap for procedural reform in courts. Expert codes of conduct may especially benefit from recent research in metascience.
Our emphasis on codes of conduct — a procedural reform that spans civil and criminal trials in New South Wales (‘NSW’) — makes our analysis necessarily broad. That said, we recognise that criminal and civil litigation engage different policy considerations and practicalities (for example, the recent emphasis on efficiency in civil litigation). As to the latter, criminally accused parties frequently cannot afford their own expert witnesses and must rely on the expert proffered by the Crown. So, in the criminal context, robust expert procedure may be especially important. Indeed, we will focus on criminal cases in our legal analysis. Any application of our suggestions should mind the significant policy gap between civil and criminal cases.
In Part II, we briefly set the scene with some of the most significant procedural reforms that have been introduced to manage the presentation, form and content of expert evidence and expert reports. Part III introduces new research in metascience and behavioural ethics (that is, the psychological study of the situational factors that influence ethical behaviour) that founds a procedural reform movement in science. As we discuss, these reforms are being eagerly adopted in many scientific fields. Part IV then begins the discussion about how revelations from metascience and behavioural ethics could be leveraged to improve expert evidence procedure, putting them on firmer (meta)scientific footing. In Part V, we conclude with some limitations that can be expected of even the most scientifically grounded expert procedural reforms.

Robotics and Automated Vehicle Testing

'Robo-Apocalypse cancelled? Reframing the automation and future of work debate' by Leslie Willcocks in (2020) Journal of Information Technology argues
Robotics and the automation of knowledge work, often referred to as AI (artificial intelligence), are presented in the media as likely to have massive impacts, for better or worse, on jobs, skills, organizations and society. The article deconstructs the dominant hype-and-fear narrative. Claims on net job loss emerge as exaggerated, but there will be considerable skills disruption and change in the major global economies over the next 12 years. The term AI has been hijacked, in order to suggest much more going on technologically than can be the case. The article reviews critically the research evidence so far, including the author’s own, pointing to eight major qualifiers to the dominant discourse of major net job loss from a seamless, overwhelming AI wave sweeping fast through the major economies. The article questions many assumptions: that automation creates few jobs short or long term; that whole jobs can be automated; that the technology is perfectible; that organizations can seamlessly and quickly deploy AI; that humans are machines that can be replicated; and that it is politically, socially and economically feasible to apply these technologies. A major omission in all studies is factoring in dramatic increases in the amount of work to be done. Adding in ageing populations, productivity gaps and skills shortages predicted across many G20 countries, the danger might be too little, rather than too much labour. The article concludes that, if there is going to be a Robo-Apocalypse, this will be from a collective failure to adjust to skills change over the next 12 years. But the debate needs to be widened to the impact of eight other technologies that AI insufficiently represents in the popular imagination and that, in combination, could cause a techno-apocalypse.
The National Transport Commission’s 2020 Review of ‘Guidelines for trials of automated vehicles in Australia’: Discussion paper reviews the National Transport Commission (NTC) and Austroads’ Guidelines for trials of automated vehicles in Australia. The guidelines were released in 2017 to support nationally consistent conditions for automated vehicle trials in Australia. The NTC has undertaken research and targeted consultation to present potential updates to the guidelines that aim to benefit trialling organisations and road transport agencies. Updates could include: further detail about requirements; alignment with the future commercial deployment framework; clarifying the application of the guidelines to other technologies; and improving administrative processes.
 The paper states
The National Transport Commission and Austroads’ Guidelines for trials of automated vehicles in Australia were released in May 2017 to support nationally consistent conditions for automated vehicle trials in Australia. The guidelines were intended to:
  • provide certainty and clarity to industry regarding expectations when trialling in Australia 
  • help agencies manage trials in their own jurisdictions as well as across state borders 
  • establish minimum standards of safety 
  • help assure the public that roads are being used safely 
  • help raise awareness and acceptance of automated vehicles in the community.
Transport and infrastructure ministers directed that the guidelines should be reviewed every two years. We began this review of the guidelines in 2019 and it is the first to take place since they were published. The purpose of this discussion paper is to assess how well the guidelines are working in practice and to seek broader stakeholder views on any required changes. 
Context 
Since the guidelines were published in May 2017 there have been a number of developments in trialling and the development of regulatory frameworks for automated vehicles:
  • Trials have now taken place in every Australian state and territory, and trialling organisations and road transport agencies can share their experience of the application, approval and operation of trials. 
  • There has been further development of the regulatory framework for the commercial deployment of automated vehicles, which will eventually succeed the trials framework. 
  • International guidance has further evolved.
The objectives of the review are to identify:
  • whether the guidelines have assisted governments and trialling organisations 
  • challenges faced by governments and trialling organisations using the guidelines or in applying for, approving, operating and evaluating trials 
  • additional requirements governments have placed on trialling organisations 
  • whether the guidelines should be updated to ensure a nationally consistent and safe approach to automated vehicle trials in Australia.
Consultation topics 
In late 2019 the NTC undertook targeted consultation and a review of international guidance to inform this discussion paper. Through this consultation we have learned that trialling organisations and road transport agencies have found the guidelines useful, particularly as a starting point to guide trialling organisations as they prepare their trial applications. We have also learned that the guidelines could provide further detail to assist trialling organisations and to provide some consistency in applications for road transport agencies. As well, we have learned that there are a number of differences in trial requirements and application processes across states and territories, which has led to differing experiences in gaining approvals for trials. 
Consultation topics in this discussion paper fall under four broad categories:
  • content and level of detail in the current guidelines (chapter 3) 
  • application of the guidelines (chapter 4) 
  • administrative processes and harmonisation (chapter 5) 
  • other automated vehicle trial issues outside the scope of the guidelines (chapter 6).
There could be a number of updates to the guidelines that will benefit both trialling organisations and road transport agencies. These include:
  • further detail about safety, traffic management and data and information requirements; 
  • further alignment with future safety requirements for commercial deployment; 
  • clarifying the application of the guidelines to other technologies, operating domains and types of trials; and 
  • improving the efficiency of administrative processes at the point of application. 
We are seeking views from stakeholders on the potential updates discussed in this paper and on any other useful changes. We want to ensure the guidelines support safe and innovative trials in Australia. This will help Australia gain the safety and productivity benefits of this technology. 
List of questions 
1 Should the guidelines be updated to improve the management of trials (section 3 of the guidelines) and, if so, why? Consider in particular: 
2 Should the guidelines be updated to improve the safety management of trials (section 4 of the guidelines) and, if so, why? Consider in particular: 
3 What issues have been encountered when obtaining or providing insurance? 
4 Are the current insurance requirements sufficient (section 5 of the guidelines)? If not, how should they change? 
5 Should the guidelines be updated to improve the provision of relevant data and information (section 6 of the guidelines)? 
6 Is there any additional information the guidelines should include for trialling organisations? 
7 Should the guidelines apply to any other emerging technologies (discussed in chapter 4 or other technologies) and operating domains? 
8 Are there any additional criteria or additional matters relevant to the trials of automated heavy vehicles that should be included in the guidelines? 
9 Are there currently any regulatory or other barriers to running larger trials? If so, how should these barriers be addressed? (Consider the guidelines, state and territory exemption and permit schemes, and Commonwealth importation processes.) 
10 Should the guidelines continue to allow commercial passenger services in automated vehicle trials? If so, should the guidelines reference additional criteria that trialling organisations should be subject to, and what should these criteria be? 
11 What challenges have you faced with administrative processes when applying for or approving trials of automated vehicles, and how could these be addressed? 
12 Are there any other barriers to cross-border trials? Is there a need to change current arrangements for cross border trials? 
13 Should there be a more standardised government evaluation framework for automated vehicle trials? If so, what are the trial issues that should be evaluated?
14 Should the results of evaluations be shared between states and territories? If so, how should commercially sensitive information be treated? 
15 What works well in the automated vehicle importation process, and what are the challenges? 
16 Is there anything further that should be done to facilitate a transition from trial to commercial deployment? 
17 Are there any matters that the NTC should consider in its review of the guidelines?

Property

'Own Data? Ethical Reflections on Data Ownership' by Patrik Hummel, Matthias Braun and Peter Dabrock in (2020) Philosophy and Technology comments
 In discourses on digitization and the data economy, it is often claimed that data subjects shall be owners of their data. In this paper, we provide a problem diagnosis for such calls for data ownership: a large variety of demands are discussed under this heading. It thus becomes challenging to specify what—if anything—unites them. We identify four conceptual dimensions of calls for data ownership and argue that these help to systematize and to compare different positions. In view of this pluralism of data ownership claims, we introduce, spell out and defend a constructive interpretative proposal: claims for data ownership are charitably understood as attempts to call for the redistribution of material resources and the socio-cultural recognition of data subjects. We argue that as one consequence of this reading, it misses the point to reject claims for data ownership on the grounds that property in data does not exist. Instead, data ownership brings to attention a claim to renegotiate such aspects of the status quo.
The authors argue
 Data seem to be produced on unprecedented scales. Sensors, wearables, and devices continuously translate physical movements and states of affairs into data points. When browsing the internet or using social media, analytic tools process each and every click. Shopping interests and behaviours feed into tailor-made adverts and products. Networked cars and autonomous driving rest on large-scale gathering and processing of vehicle and traffic data. Precision medicine aims to search for patterns and correlations in huge sets of patient data, and promises to personalize prevention, diagnostics, and treatments to the specific characteristics and circumstances of individual patients. Industry 4.0 datafies and automates steps in manufacturing and production. The Internet of Things extends digitization, datafication, and networked objects even further. The recurring observation is that data processing will become increasingly pervasive and powerful. Already now, we witness transformations in how we perceive, frame, think, value, communicate, negotiate, work, coordinate, consume, keep information confidential, and make it transparent. 
One—if not the—pressing question in the context of digitization is whether foundational rights of individual subjects are respected, and what it takes to safeguard them against interferences. One frequently discussed suggestion is that these questions arise against the backdrop of contested relations of ownership, i.e. the relation between an owner and her property. There is a set of expectations associated with data ownership. “It gives hope to those wishing to unlock the potential of the data economy and to those trying to re-empower individuals that have lost control over their data” (Thouvenin et al. 2017, 113). In this spirit, calls for data ownership demand that data can be property. Yet, commentators caution that “[s]implified versions of ownership […] may create compelling soundbites but provide little direction in practice” (The British Academy and The Royal Society 2017, 32). While data have tangible aspects, such as their relation to technical-material infrastructures, they also seem to differ from ordinary resources and tangible property (Prainsack 2019b, 5). In a digitized and datafied lifeworld, claims to data are indispensable towards claiming fundamental rights and freedoms. These preliminary observations prompt us to clarify what data ownership exactly means, how it is justified, what it tries to achieve, and whether it succeeds in promoting its aims. 
The present paper explores the content of claims for data ownership. It has two goals: first, it provides an in-depth analysis of different notions of data ownership and uncovers inherent conceptual tensions and puzzles. As we will argue, a variety of considerations are put forward under the heading of data ownership. The notion is ambiguous and even paradoxical: it is used to articulate, and taken to support, claims that stand in tension and are mutually incompatible. 
Second, we argue that all of these dimensions of data ownership matter for informational self-determination, understood as the ability of data subjects to shape how datafication and data-driven analytics affect their lives, to safeguard a personal sphere from others, and to weave informational ties with their environment. Specifically, and drawing on a debate between Axel Honneth and Nancy Fraser, we demonstrate how the meanings of data ownership raise both issues of material ownership (pertaining to the sphere of distribution) and issues of socio-cultural ownership (pertaining to the sphere of recognition). Our proposal is that important entanglements between both spheres get overlooked if we merely focus on one of them. For informational self-determination, both are relevant. Thus, we need to take seriously the full range of dimensions of data ownership in order to understand how data subjects exercise informational self-determination, and how such exercises can be facilitated and promoted. We discuss these challenges under the heading of data sovereignty (German Ethics Council 2017, 2018; Hummel et al. 2018; Hummel et al. 2019, 26–27). 
In order to pursue these goals, we begin by briefly considering data ownership from a legal perspective (2.). As we will show, there is a debate about the compatibility between data ownership and current legal frameworks. Moreover, a number of rationales that typically discourage the institutionalization of data ownership are beginning to disintegrate in contemporary big data environments, which we take to suggest that the case for data ownership is worth debating. We then go on to provide interpretative contributions on what it would mean to establish or to maintain data ownership (3.). As it turns out, the substantive demands and goals vary significantly across discussants. We distinguish four dimensions of data ownership. Each of them is debated by reference to a pair of conceptual poles: the institutionalization of property versus cognate notions of quasi-property (3.1), the marketability versus the inalienability of data (3.2), the protection of data subjects versus their participation and inclusion into societal endeavours (3.3), and individual versus collective claims and interests in data and their processing (3.4). We propose that this characterization explains why different proposals on data ownership articulate diverging or even mutually incompatible demands, and helps us to get a grip on what is at stake when the notion is invoked. Drawing on Honneth and Fraser (4.), we go on to argue that all of these dimensions are vital for informational self-determination. Statements on data ownership touch upon and go back and forth between two different spheres: the redistribution of material resources and the socio-cultural recognition of data subjects. In view of our findings, the notion of data ownership can be understood as an expressive resource for articulating and negotiating claims concerning both spheres. 
In the following, we use the terms ‘ownership’ and ‘property rights’ as follows: “Property rights […] are the rights of ownership. In every case, to have a property right in a thing is to have a bundle of rights that defines a form of ownership” (Becker 1980, 189–190). ‘Property’ refers to the thing(s) to which property rights apply.