Introduction
The legal profession in New South Wales is facing a defining shift. As of 3 February 2025, the NSW Supreme Court’s Practice Note SC Gen 23 formally restricts the use of generative AI (Gen AI) in legal proceedings, warning of the risks to legal integrity and confidentiality posed by public AI tools like ChatGPT.
Simultaneously, the NSW Bar Association has issued firm guidance on barristers’ obligations under the Barristers’ Conduct Rules, reinforcing that core legal work must remain the product of human judgment—not machine-generated content.
The message is clear: AI may be used—but not at the cost of confidentiality, accuracy, or the integrity of the legal process.
The Core Problem: Public AI and the Breakdown of Legal Integrity
Public AI tools pose serious risks when used in legal practice. According to Practice Note SC Gen 23 and the Bar’s guidelines, these include:
Hallucinations – AI tools can fabricate cases, citations, and legal principles. In one NSW matter, a lawyer faced disciplinary action after submitting fabricated case references generated by ChatGPT.
Confidentiality breaches – Any information entered into a public AI may be stored, used to train models, or exposed to third parties. This risks violating client confidentiality, professional privilege, and court orders.
Loss of professional judgment – Overreliance on AI can erode the core legal duty of exercising independent forensic judgment. The NSW Bar warns that legal work must always reflect the lawyer’s own knowledge, skill, and accountability.
Public AI tools, while accessible, lack the safeguards needed for sensitive legal environments. And once client data is entered into a public LLM, it may be impossible to recall and may be reused beyond your control.
What Is Prohibited by the NSW Supreme Court?
Practice Note SC Gen 23 sets strict limitations on how Gen AI can be used:
AI must not generate affidavits, witness statements, or character references
AI must not be used to alter, embellish, or rephrase a witness’s evidence
AI must not be used to draft the content of expert reports without the court's prior leave
Confidential, subpoenaed, or suppressed material must not be uploaded into any AI program unless the platform guarantees controlled data handling, no external training, and secure use within a single proceeding
Even when AI is used for permitted tasks (e.g., summarising documents, generating chronologies), practitioners must manually verify all citations and legal references. Reliance on AI to validate its own content is not acceptable.
A Turning Point: Private AI Is No Longer Optional
With these restrictions in place, the path forward for law firms is clear: migrate to private AI platforms built for legal practice.
Private AI offers:
Data quarantine – Sensitive information stays within your firm’s secure infrastructure or approved Australian-hosted environments.
Customisation – AI can be trained on your own firm’s documents, processes, and preferred authorities—without introducing risk.
Control and compliance – You maintain full oversight of how AI is used, what it produces, and how it integrates with client work.
Actionable Takeaways for Legal Professionals
Audit your current AI tools: Are they public or private? Can you control how data is stored and used?
Stop using public AI for sensitive or matter-related tasks: It’s too risky under both ethical rules and the new court Practice Note.
Implement a private AI system that meets the court’s confidentiality, verification, and disclosure requirements.
Train your legal teams to understand when and how AI can be used—ethically and compliantly.
Document your use of AI internally for accountability and audit readiness.
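For firms looking for a lightweight starting point on the documentation step, an append-only internal register can work. The sketch below is purely illustrative: the field names and the AIUseRecord structure are assumptions for this example, not a format prescribed by the Practice Note or the Bar Association, and any real register should be designed with your firm's compliance team.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical sketch of one entry in an internal AI-use register.
# All field names are illustrative assumptions, not a prescribed standard.
@dataclass
class AIUseRecord:
    matter_id: str            # internal matter reference
    tool: str                 # approved private AI platform used
    task: str                 # permitted task, e.g. "chronology draft"
    data_classification: str  # e.g. "client-confidential"
    citations_verified: bool  # output manually checked against sources
    practitioner: str         # person accountable for the output
    date_used: str            # ISO-format date

def to_register_line(record: AIUseRecord) -> str:
    """Serialise a record as one JSON line for an append-only register file."""
    return json.dumps(asdict(record), sort_keys=True)

entry = AIUseRecord(
    matter_id="2025-0142",
    tool="firm-private-llm",
    task="chronology draft",
    data_classification="client-confidential",
    citations_verified=True,
    practitioner="J. Smith",
    date_used="2025-02-03",
)
line = to_register_line(entry)
```

Writing one JSON line per use keeps the register machine-searchable for audits while remaining trivial to maintain alongside existing matter-management records.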
As the NSW Supreme Court puts it:
“Legal practitioners and unrepresented parties should also be aware that data entered into Gen AI programs may be used to train the large language model, potentially making confidential information available to others.”
— NSW Supreme Court Practice Note SC Gen 23, para 8
Conclusion
The NSW legal system has set the tone for responsible AI adoption. The protection of client confidentiality and the integrity of legal proceedings cannot be compromised. The era of experimenting with public AI tools is over—now is the time for law firms to secure their operations with private, compliant AI platforms.
Firms that act early will be best placed to modernise workflows, safeguard client data, and uphold the standards that define the legal profession.
Recommended Solution: Omnisenti Smart Ask