The NSW Ombudsman report, The new machinery of government: using machine technology in administrative decision-making, comments:
Our role at the NSW Ombudsman is to oversee government agencies and officials – helping to ensure they are conducting themselves lawfully, making decisions reasonably, and treating all individuals equitably and fairly (chapter 2).
When agencies and officials fail to do this, they are said to have engaged in maladministration or, more formally, section 26 conduct (referring to section 26 of the Ombudsman Act 1974 (NSW), which sets out the various categories of wrong conduct). Clearly, the use by government agencies of machine technology – which might be referred to as artificial intelligence or automated decision-making (see chapter 3) – is not inherently a form of maladministration.
There are many situations in which government agencies could use appropriately-designed machine technologies to assist in the exercise of their functions, which would be compatible with lawful and appropriate conduct. Indeed, in some instances machine technology may improve aspects of good administrative conduct – such as accuracy and consistency in decision-making, as well as mitigating the risk of individual human bias. However, if machine technology is designed and used in a way that does not accord with administrative law and associated principles of good administrative practice, then its use could constitute or involve maladministration. It could also result in legal challenges, including a risk that administrative decisions or actions may later be held by a court to have been unlawful or invalid.
1.1 Machine technology is on the rise, and offers many potential benefits
The use and sophistication of machine technology are increasing worldwide, and it has the potential to bring many benefits to government and the public (chapter 4).
These include:
- Efficiency and cost savings for government.
- Reduced red tape.
- Increased accuracy.
- Improved consistency.
- Increased productivity and re-focusing of staff to ‘higher value’ activities.
- Better customer service and experience.
- Insights and learning.
Of course, benefits cannot be assumed to follow automatically, and it is important to be realistic about what benefits (and risks) a particular technology will deliver in a particular context. Untested assumptions or utopian beliefs about technology should not drive automation strategies.
1.2 Why we have written this report
We were prompted to write this report after becoming aware of one agency (Revenue NSW) using machine technology for the performance of a discretionary statutory function (the garnisheeing of unpaid fine debts from individuals’ bank accounts), in a way that was having a significant impact on individuals, many of whom were already in situations of financial vulnerability.
Following a series of complaints to our office, Revenue NSW worked responsively with us over time to ensure that its garnishee system operated more fairly, by taking account of vulnerability and situations of hardship. However, we still had questions as to whether Revenue NSW’s system of garnishee automation was legally consistent with its statutory functions. We sought legal advice from Senior Counsel, which confirmed our doubts. The full Revenue NSW case study, including the legal advice, is set out in annexure A.
Currently, we do not know how many other NSW Government agencies are using, or developing, machine technology to assist them in the exercise of their statutory functions.
However, our experience with Revenue NSW, and a scan of the Government’s published policies on the use of ‘AI’ and other digital technologies, suggest that fundamental aspects of public law relevant to machine technology adoption may not be receiving adequate attention.
1.3 Administrative law and practice must be given central attention
Some of the broader concerns about machine technology use by the private sector, in terms of privacy, human rights, ethics and so on, also apply (in some cases with greater force) to the public sector.
However, the powers, decisions and actions of government agencies and officials are constitutionally different from those of the general private sector.
This means that the public sector’s use of machine technology, particularly for the purposes of statutory decision-making, must also be assessed from an administrative law perspective (chapter 5). We believe that this assessment must be central to the use of this technology.
1.4 Administrative law requirements for good decision-making
For simplicity, we can broadly group the requirements for good decision-making in the following ways (chapter 6):
Proper authorisation – this means that there is legal power to make the relevant decision, that the person making the decision has the legal authority to do so, and that the decision is within the scope of decision-making power (including, in particular, within the bounds of any discretion conferred by the power) (chapter 7). The requirement for proper authorisation means that statutory functions are not and cannot be directly given or delegated to a machine. It does not necessarily mean that the authorised person cannot be assisted by machine technology. There is, however, no uniform answer as to what forms of machine technology can be used, and to what extent, in the performance of a particular statutory decision-making function. This must be carefully considered on a case-by-case basis by looking at the particular statute, its purpose, and the context in which it applies. However, if the function is discretionary, machine technology must not be used in a way that would result in that discretion being fettered or effectively abandoned. In effect, this means that discretionary decision-making functions cannot be fully automated.
Appropriate procedures – this means that the decision has followed a fair process, that it has met other legal and ethical obligations, and that reasons are given for the decision (particularly where it significantly affects the rights or interests of individuals) (chapter 8).
Generally, a fair process requires decisions to be made without bias on the part of the decision maker (‘no-bias rule’) and following a fair hearing of the person affected (‘hearing rule’). Machine technology can introduce the possibility of a different form of bias known as ‘algorithmic bias’. Algorithmic bias arises when a machine produces results that are systemically prejudiced or unfair to certain groups of people. It is unclear whether the presence of algorithmic bias would necessarily constitute a breach of the no-bias rule (as that rule is traditionally concerned with actual or apprehended bias on the part of the particular decision maker). Even if it does not, however, algorithmic bias may still lead to unlawful decisions (because they are based on irrelevant considerations or contravene anti-discrimination laws) or other maladministration (because they involve or result in conduct that is unjust or improperly discriminatory).
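To make the idea of algorithmic bias concrete, one simple way such bias can be surfaced is by comparing outcome rates across groups of affected people. The sketch below is purely illustrative and is not a method prescribed by this report; the decision records and group labels are hypothetical.

```python
from collections import defaultdict

def approval_rates_by_group(decisions):
    """Compute the approval rate for each group in a set of decision records.

    Each record is a (group, approved) pair. The groups and records here
    are hypothetical illustrations, not real data.
    """
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        if approved:
            approvals[group] += 1
    return {g: approvals[g] / totals[g] for g in totals}

# Hypothetical records exhibiting the kind of disparity a bias audit looks for.
records = ([("A", True)] * 80 + [("A", False)] * 20 +
           [("B", True)] * 50 + [("B", False)] * 50)

rates = approval_rates_by_group(records)
print(rates)  # group A is approved far more often than group B
```

A disparity of this kind does not by itself establish unlawfulness, but it is the sort of systemic pattern that testing and audit regimes (discussed below) are designed to detect.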
Where machine technology is used in the exercise of a function under a particular statute it also needs to comply with other statutes and common law requirements. Privacy, freedom of information and anti-discrimination laws, in particular, will almost always be relevant. Having appropriate procedures also means providing where required, or being able to provide where requested, reasons to those who are affected by a decision. In our view, this means also informing those affected if a machine has made (or contributed to the making of) a decision. Where reasons are required, they must be accurate, meaningful, and understandable, which can raise particular challenges when machine technology is used.
Appropriate assessment – this means that the decision answers the right question, that the decision is based on a proper analysis of relevant material, and that the decision is based on the merits and is reasonable in all the circumstances (chapter 9). Using machine technology in the exercise of statutory functions means translating legislation and other guidance material (such as policy) into the form of machine-readable code. A key risk is the potential for errors in this translation process, and the consequent potential for errors and unlawful decisions being made at scale. When designing and implementing machine technology, it is also essential to ensure that its use does not result in any obligatory considerations being overlooked or extraneous considerations coming into play. While the use of machine technology may enhance the consistency of outcomes, agencies with discretionary functions must be conscious of the duty to treat individual cases on their own merits.
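The translation risk can be illustrated with a deliberately simple, entirely hypothetical example. Suppose a statute waives a fee where the applicant’s income is below a threshold or the applicant is under 18; miscoding that ‘or’ as an ‘and’ silently narrows eligibility, and the error is then repeated in every decision the system makes.

```python
INCOME_THRESHOLD = 30_000  # hypothetical statutory threshold

def waiver_as_enacted(income, age):
    # Hypothetical rule: waive if income below threshold OR applicant under 18.
    return income < INCOME_THRESHOLD or age < 18

def waiver_as_miscoded(income, age):
    # Translation error: 'or' becomes 'and', narrowing eligibility.
    return income < INCOME_THRESHOLD and age < 18

# Every low-income adult and every minor above the threshold is wrongly
# refused, at whatever scale the system operates.
applicants = [(25_000, 40), (25_000, 16), (50_000, 16), (50_000, 40)]
wrongly_refused = [a for a in applicants
                   if waiver_as_enacted(*a) and not waiver_as_miscoded(*a)]
print(wrongly_refused)  # [(25000, 40), (50000, 16)]
```

A single-character coding error of this kind is exactly why verification of the translated rules against the statute, before and after deployment, matters so much.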
Adequate documentation – agencies are required to properly document and keep records of decision-making (chapter 10). In the context of machine technology, this means keeping sufficient records to enable comprehensive review and audit of decisions. Documentation relating to different ‘versions’ of the technology, and details of any updates or changes to the system, may be particularly important.
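In technical terms, adequate documentation amounts to recording, for every machine-assisted decision, at least the inputs, the output, and the exact version of the system that produced it. The following is a minimal sketch only; the field names, record structure and version identifier are assumptions for illustration, not a prescribed standard.

```python
import json
from datetime import datetime, timezone

SYSTEM_VERSION = "2.3.1"  # hypothetical release identifier of the system

def record_decision(inputs, output, log):
    """Append an auditable record of one machine-assisted decision."""
    log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "system_version": SYSTEM_VERSION,  # ties the decision to one version
        "inputs": inputs,
        "output": output,
    })

audit_log = []
record_decision({"debt": 1200, "hardship_flag": True},
                "refer to human officer", audit_log)
print(json.dumps(audit_log[0], indent=2))
```

Capturing the system version alongside each decision is what later allows a reviewer to ask which rules were in force when a particular decision was made.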
1.5 Good practice for designing and implementing machine technology
In light of the above, there are some key proactive steps that agencies should take when considering the design and adoption of machine technology that will help them to ensure they comply with principles of administrative law and good decision-making practice. In particular, when setting out to design machine technology for use in the exercise of statutory functions, agencies should:
1. establish a multi-disciplinary design team that involves lawyers, policymakers, and operational experts, as well as technicians, with roles and responsibilities that are clearly defined (chapter 11)
2. assess the appropriate degree of human involvement in the decision-making processes, having regard to the nature of the particular function and the statute in question (chapter 12)
3. ensure appropriate transparency, including by deciding what can and should be disclosed about the use of machine technology to those whose interests may be affected (chapter 13)
4. test before operationalising, and establish ongoing monitoring, audit and review processes (chapter 14)
5. consider whether legislative amendment is necessary or prudent (chapter 15).
1.6 The role of Parliament in authorising machine technology
If legislation is introduced to enable the use of machine technology, then this provides an opportunity for public and Parliamentary debate on the properties that should be required of that technology.
Whether or not these are ultimately prescribed as mandatory requirements in the legislation itself, the kinds of questions that might be asked of government agencies that are seeking legislative authorisation of machine technology could include:
Is it visible? What information does the public, and especially those directly affected, need to be told regarding the involvement of the machine, how it works, its assessed accuracy, testing schedule etc? Are the design specifications and source code publicly available – for example as ‘open access information’ under the Government Information (Public Access) Act 2009? Is an impact assessment required to be prepared and published?
Is it avoidable? Can an individual ‘opt out’ of the machine-led process and choose to have their case decided through a manual (human) process?
Is it subject to testing? What testing regime must be undertaken prior to operation, and at scheduled times thereafter? What are the purposes of testing (eg compliance with specifications, accuracy, identification of algorithmic bias)? Who is to undertake that testing? What standards are to apply (eg randomised control trials)? Are the results to be made public?
Is it explainable? What rights do those affected by the machine outputs have to be given reasons for those outcomes? Are reasons to be provided routinely or on request? In what form must those reasons be given and what information must they contain?
Is it accurate? To what extent must the predictions or inferences of the machine be demonstrated to be accurate? For example, is ‘better than chance’ sufficient, or is the tolerance for inaccuracy lower? How and when will accuracy be evaluated?
Is it subject to audit? What audit records must the machine maintain? What audits are to be conducted (internally and externally), by whom and for what purpose?
Is it replicable? Must the decision of the machine be replicable in the sense that, if exactly the same inputs were re-entered, the machine will consistently produce the same output, or can the machine improve or change over time? If the latter, must the machine be able to identify why the output now is different from what it was previously?
Is it internally reviewable? Are the outputs of the machine subject to internal review by a human decision maker? What is the nature of that review (eg full merits review)? Who has standing to seek such a review? Who has the ability to conduct that review and are they sufficiently senior and qualified to do so?
Is it externally reviewable? Are the outputs of the machine subject to external review or complaint to a human decision maker? What is the nature of that review (eg merits review or review for error only)? Who has standing to seek such a review? If reviewable for error, what records are available to the review body to enable it to thoroughly inspect records and detect error?
Is it compensable? Are those who suffer detriment by an erroneous action of the machine entitled to compensation, and how is that determined?
Is it privacy protective and data secure? What privacy and data security measures and standards are required to be adhered to? Is a privacy impact assessment required to be undertaken and published? Are there particular rules limiting the collection, use and retention of personal information?
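The replicability question above has a direct technical analogue: re-run the system on identical inputs and confirm that it produces identical outputs. A sketch of such a check follows; the scoring function is a deterministic stand-in invented for illustration, not any real deployed system.

```python
def hypothetical_scorer(inputs):
    # Stand-in for a deployed decision system; deterministic by construction.
    return sum(inputs) % 100

def is_replicable(fn, test_inputs, runs=3):
    """Return True if fn gives the same output for the same inputs on every run."""
    for inputs in test_inputs:
        first = fn(inputs)
        if any(fn(inputs) != first for _ in range(runs - 1)):
            return False
    return True

print(is_replicable(hypothetical_scorer, [(1, 2, 3), (40, 60, 2)]))  # True
```

A system that learns or changes over time would fail such a check, which is precisely why the questions above ask whether the machine must be able to explain why its output now differs from what it was previously.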
1.7 The way forward – starting with increased visibility
We are hopeful that this report will contribute to public and especially Parliamentary debate about the adoption of machine technology by government, and its proper limits and regulation. In the final chapter of this report we identify avenues for future consideration, including whether some forms or applications of machine technology might raise such significantly new issues and risks that consideration should be given to new forms of regulation – including mandatory requirements around transparency, pre-operation validation testing and routine auditing, and external review and oversight (chapter 16).
One risk, for example, may be that machine technology will be capable of producing new forms of extremely large-scale systemic injustices, to which the existing framework and institutions of administrative law are ill-equipped to respond. However, a significant impediment to meaningful debate about the future governance of machine technology use by government is an almost complete lack of transparency about that use.
As mentioned above, we do not know how NSW Government agencies may currently be using machine technology to assist them in the exercise of statutory decision-making functions – and so we do not know how those systems have been designed, what they are being used for, and what (if any) assurance has been obtained that they are operating lawfully and in accordance with principles of good administrative practice. This is a significant problem. Some technology may be lawfully and appropriately designed and used; other technology may not be. While we do not consider that visibility is, of itself, a sufficient remedy to address potential concerns that might arise with the use of machine technology, it is an essential starting point.
Following this report, therefore, we will seek to work with relevant bodies, including Digital NSW (part of the Department of Customer Service) and the Office of Local Government, to comprehensively map current and proposed types and uses of machine technology (chapter 2). We will also look inward to consider what more we can do to support agencies and citizens, as well as our own staff, to understand the use of machine technology – and to ensure that administrative law and the enduring values of good public administration, including legality, transparency and fairness, are given central attention.