16 October 2024

Digital Courts and AI

The Victorian Law Reform Commission’s Artificial Intelligence in Victoria’s Courts and Tribunals: Consultation Paper reflects the following Terms of Reference:

Opportunities and risks of artificial intelligence in Victoria’s courts and tribunals

Artificial intelligence (AI) tools are rapidly evolving, with their application increasing across society. There is potential for the use of AI in Victoria’s courts and tribunals to improve user experiences and generate efficiencies. The use of AI tools carries both risks and opportunities for fairness, accountability, transparency and privacy, as well as improvements to accessibility.

The Victorian Law Reform Commission (the Commission) is asked to make recommendations on legislative reform opportunities and principles to guide the safe use of AI in Victoria’s courts and tribunals. In developing its recommendations, the Commission should consider:

• opportunities to build on existing legislation, regulations and common law in supporting the use of AI within Victoria’s courts and tribunals;
• the benefits and risks of using AI in Victoria’s courts and tribunals, including risks relating to accountability, privacy, transparency, and the accuracy and security of court records;
• the need to maintain public trust in courts and tribunals, and ensure integrity and fairness in the court system;
• the rapid development of AI technologies and how this may influence the extent to which such technologies should be adopted and regulated; and
• applications of AI and how it is regulated in comparable jurisdictions and contexts (including work being done to develop a framework for regulating AI at the federal level in Australia) and potential learnings for Victoria.

The Commission is asked to provide principles or guidelines that can be used in the future to assess the suitability of new AI applications in Victoria’s courts and tribunals. 

The Consultation Paper features the following 'Question List':

 Chapter 2: What is artificial intelligence? 

1. Should courts and tribunals adopt a definition of AI? If so, what definition?

2. Are there specific AI technologies that should be considered within or out of the scope of this review?

Chapter 3: Benefits and risks of AI 

3. What are the most significant benefits and risks for the use of AI by:
a. Victorian courts and tribunals?
b. legal professionals and prosecutorial bodies?
c. the public, including court users, self-represented litigants and witnesses?

4. Are there additional risks and benefits that have not been raised in this issues paper? What are they and why are they important? 

Chapter 4: AI in courts and tribunals 

5. How is AI being used by:
a. Victorian courts and tribunals?
b. legal professionals in the way they interact with Victorian courts and tribunals?
c. the public, including court users, self-represented litigants and witnesses?

6. Are there uses of AI that should be considered high-risk, including in:
a. court and tribunal administration and pre-hearing processes?
b. civil claims?
c. criminal matters?
How can courts and tribunals manage those risks?

7. Should some AI uses be prohibited at this stage? 

Chapter 5: Regulating AI: the big picture 

8. Are there lessons from international approaches that we should consider in developing a regulatory response for Victorian courts and tribunals? 

9. What would the best regulatory response to AI use in Victorian courts and tribunals look like? Consider:
a. which regulatory tools would be most effective, including rules, regulations, principles, guidelines and risk management frameworks, in the context of rapidly changing technology;
b. whether regulatory responses should be technologically neutral, or whether some aspects of AI require specific regulation.

10. How should court and tribunal guidelines align with AI regulation by the Australian Government? 

Chapter 6: Principles for responsible and fair use of AI in courts and tribunals 

11. Are the principles listed in this chapter appropriate to guide the use of AI in Victorian courts and tribunals? What other principles might be considered? 

12. Are principles sufficient, or are guidelines or other regulatory responses also required? 

13. What regulatory tools, including guidelines, could be used to implement these high-level principles in Victoria’s courts and tribunals? 

14. How can the use of AI by courts and tribunals be regulated without interfering with courts’ independence, and what risks should be considered? 

15. Is it appropriate to have varying levels of transparency and disclosure depending on the use of AI by courts and tribunals? (For example, use by administrative staff compared with judicial officers.) 

16. Who should be able to contest an AI decision, and when? Is the capacity to contest necessary for decisions made by court administration staff, or only for judicial decisions? Consider how courts and tribunals can ensure sufficient information is available to enable decisions to be contested. 

Chapter 7: AI in courts and tribunals: current laws and regulation 

17. Building on Table 7, are there other statutes or regulations relevant to the safe use of AI in Victorian courts and tribunals? 

18. Are there legislative or regulatory gaps or barriers where reform is needed for the safe use of AI in courts and tribunals? 

19. What, if any, changes to legislation, rules or processes are necessary to enable courts and tribunals to:
a. safely use AI?
b. consider evidence in relation to AI?
c. implement human rights principles? (Should there be a human rights impact assessment of any AI use in courts and tribunals?)
d. align AI use with privacy responsibilities?

20. How can changes be achieved while maintaining appropriate flexibility? 

21. Is there a need to strengthen professional obligations to manage risks relating to AI? If so, what changes might be required to the Legal Profession Uniform Law, Civil Procedure Act or regulations?

Chapter 8: Developing guidelines for the use of AI in Victoria’s courts and tribunals 

Guidelines for court and tribunal users

22. Should guidelines be developed for Victorian court and tribunal users relating to the use of AI? 

23. Should guidelines require disclosure of AI use? If so, to whom should it apply:
a. legal professionals?
b. expert witnesses?
c. the public (including self-represented litigants and witnesses)?

24. What are the benefits and risks of disclosure? If mandatory, what form should disclosure take? 

25. What is the role for courts in regulating use of AI by legal professionals? What is the role of professional bodies such as the Victorian Legal Services Board and Commissioner, the Law Institute of Victoria and the Bar Association? 

26. Are there other guidelines or practice notes relevant to court users and AI use that should be considered by the Commission? 

Guidelines for courts and tribunals 

27. Should guidelines be developed for the use of AI by Victorian courts and tribunals including for administrative staff, the judiciary and tribunal members? If so, what should they include and who should issue them? 

28. Should there be dedicated guidelines for judicial officeholders? 

29. Are there tools from other jurisdictions you think should be incorporated into guidelines to support Victorian courts and tribunals in their use of AI? If so, what are they? 

30. Should courts and tribunals undertake consultation with the public or affected groups before using AI and/or disclose to court users when and how they use AI? What other mechanisms could courts and tribunals use to promote the accountable and transparent use of AI? 

31. Should there be different guidelines or additional considerations for the use of AI in relation to criminal and civil law matters? 

Assessment framework for courts and tribunals

32. Should an assessment framework be developed to guide the assessment of the suitability of AI technology in Victorian courts and tribunals? 

33. Does the NSW AI Assurance Framework provide a useful model for Victoria’s courts and tribunals? Why or why not? What other models or guidelines should be considered? 

34. How can risk categories (low, medium and high) be distinguished appropriately? What should be considered high risk? 

35. What potential harms and benefits should an AI assessment framework for Victoria’s courts and tribunals consider? 

Chapter 9: Support for effective use of principles and guidelines about AI 

36. Are there appropriate governance structures in courts and tribunals to support safe use of AI? 

37. What governance tools could be used to support the effective use of AI in courts and tribunals, such as:
a. an AI register for AI systems used in the justice system?
b. accreditation of AI systems?

38. Who should be responsible for developing and maintaining these systems? 

39. How can education support the safe use of AI in courts and tribunals? 

40. Are there opportunities to improve the current continuing professional development system for legal professionals about AI?