There have been several recent news reports of lawyers filing court documents that reference non-existent cases and citations. In one such instance, an Australian lawyer was found to have used ChatGPT to draft legal submissions that referenced cases that did not exist[1].
The Supreme Court of New South Wales (Australia) has released Practice Directions (“Practice Directions”) for lawyers on the use of Generative Artificial Intelligence (“Gen AI”).
While these AI guidelines for lawyers may not be applicable outside of New South Wales in Australia, they offer valuable insights for legal professionals worldwide on how Gen AI use is being regulated.
What is Generative Artificial Intelligence?
The Practice Directions define Gen AI as a “form of artificial intelligence that is capable of creating new content, including text, images or sounds, based on patterns and data acquired from a body of training material”.[2]
They specifically exclude spell-check software, transcription software, search engines and software used to search a database of judgements[3].
By this definition, Gen AI would include large language models such as ChatGPT and Claude, generative tools such as Sora, and legal-specific AI software such as AI Lawyer, Westlaw Precision and Lexis+AI.
Risks of using Gen AI in law
The Practice Directions highlight a few risks of using Gen AI in law which include:
- Hallucinations, i.e. the generation of apparently plausible, authoritative and coherent responses that are in fact inaccurate or fictitious, such as false citations and cases.
- Data in a Gen AI training dataset may have been obtained in breach of copyright.
- A lack of safeguards to protect the confidentiality of information entered into Gen AI tools.
Restrictions on the use of Gen AI
The Practice Directions lay down some restrictions on specific use cases of Gen AI in the legal sector which include:
- Using Gen AI to generate expert reports, as these reports are meant to demonstrate the expert’s own reasoning process and opinion.
- Embellishing, diluting or otherwise rephrasing written witness evidence
- Generating the content of an affidavit or character reference where a person’s opinion is to be expressed.
The Practice Directions also require affidavits, witness statements and character references to include a disclosure to the effect that Gen AI was not used to generate their content or the content of any annexure or exhibit. However, they also provide for an application for leave to use Gen AI in preparing such an annexure or exhibit, with the application identifying:
- the proposed use of Gen AI;
- the Gen AI program that will be used (including the relevant version);
- whether it is a closed-source or open-source program and/or contains privacy and/or confidentiality settings; and
- the benefit to be derived from the proposed use of Gen AI in the preparation of the annexure or exhibit.
Restrictions on Judges
The Practice Directions also include Guidelines for Judges, which impose the following restrictions:
- No use of Gen AI in proofing draft judgements
- No submission of draft judgements to Gen AI platforms
- No use of Gen AI to analyse evidence for the purpose of delivering reasons for judgement
Conclusion
These AI guidelines, and cases like the one above, reinforce the risks of using Gen AI to generate legal pleadings. Lawyers who do use these platforms for research should cross-check the accuracy of their output to avoid the risk of hallucinations.
AI may perhaps be more useful for tasks like summarising source documents that are uploaded by a user as opposed to doing legal research and drafting from scratch.
Even if you’re not practising in New South Wales, these guidelines show where the legal profession is heading with AI regulation. The emphasis on verification, disclosure and human oversight reflects principles that transcend jurisdiction.
Law firms should consider developing their own AI policies now, incorporating lessons from these guidelines. The message is clear: AI can be a powerful tool, but only when used with appropriate safeguards and human verification.
[1] Paul Karp, ‘Australian Lawyer Caught Using ChatGPT Filed Court Documents Referencing Non-Existent Cases’ The Guardian (1 February 2025) https://www.theguardian.com/australia-news/2025/feb/01/australian-lawyer-caught-using-chatgpt-filed-court-documents-referencing-non-existent-cases accessed 15 June 2025.
[2] Paragraph 2 of the Practice Directions
[3] Paragraph 6 of the Practice Directions
