The Use of AI in Family Law Documents: Can You Get Away With It?

February 19, 2026

Using AI in family law documents without proper oversight is risky, and Australian courts have made it clear they will not overlook it. Artificial Intelligence (AI) has become an increasingly popular tool in many aspects of daily life, and in the legal field it has been used to simplify legal research and writing. Whilst this can ease the background preparation for a case, AI's accuracy and reliability as a legal resource remain questionable.


The 2025 family law case of Mertz & Mertz (No 3) [2025] FedCFamC1A 222 highlights this issue and the significant consequences of relying upon inaccurate, AI-generated information. If you are navigating a family law matter, speaking with experienced family lawyers in Sydney is the safest and most reliable first step.


What Happened in Mertz & Mertz: A Landmark AI Legal Case


This case stands as the most significant Australian judicial statement to date on the use of AI in family law proceedings.


i. The AI Error That Reached the Federal Circuit and Family Court


In this case, the Appellant's former solicitor filed a Summary of Argument and a List of Authorities. A month later, she filed an amended version of these documents, with some authorities removed from the footnotes and from the list. The amendments were made to remove misleading, non-existent and inaccurate legal authorities.


Whilst the changes were not marked on the documents themselves, the Appellant wrote a letter accompanying the amended documents acknowledging and apologising for these significant errors.


The Appellant's former solicitor then filed written submissions explaining that AI had been used to prepare the documents, but not to write their contents. She stated that it was her paralegal who had used AI. Despite this, the solicitor accepted responsibility for her paralegal's use of AI.


ii. The Financial and Professional Fallout


The financial consequences were immediate. The former solicitor was ordered to pay the Respondent a further $10,000 in costs thrown away in correcting the AI-generated errors.


The matter was referred to the Legal Services Commissioner for further investigation. The reputational and regulatory consequences of this single AI error continue to unfold.


What Australian Courts Say About AI in Legal Practice


Courts across Australia are no longer treating AI misuse as an isolated incident. A clear judicial position is forming, and practitioners need to understand it.


i. The FCFCOA's Position on AI


An important takeaway from this case is the Federal Circuit and Family Court of Australia's (FCFCOA) position on the use of AI in family law drafting. Judges Aldridge, Carew and Behrens quoted the Full Court in Helmold & Mariya (No 2) [2025] FedCFamC1A 163, where AI was described as having the capacity to confuse, to create unnecessary complexity, to result in wasted time, and to mislead the Court and other parties.


Using AI risks legal practitioners breaching their duties not to mislead the Court, to deliver competent and diligent legal services, and not to diminish public confidence in the administration of justice (Dayal [2024] FedCFamC2F 1166). The Court also noted that using AI to draft documents risks breaching s 114Q of the Family Law Act and the rules relating to material produced under a subpoena.


ii. The NSW Supreme Court Practice Note (February 2025)


On 3 February 2025, the NSW Supreme Court's Practice Note on generative AI came into effect. Whilst it is jurisdiction-specific, it reflects the direction all Australian courts are heading and should be treated as a benchmark for best practice across all jurisdictions.


The Practice Note imposes several key obligations on legal practitioners. Practitioners must be aware of the limits, risks and shortcomings of the generative AI program used, including the potential for inaccuracies (AI hallucinations, i.e. fabricated outputs) and biases. Data entered into generative AI programs may be used to train large language models, potentially making confidential information available to others with consequences for legal professional privilege.


Certain categories of documents must not be entered into a generative AI program unless the practitioner is satisfied the information will remain within the controlled environment of the technology provider. Generative AI must not be used to generate affidavits, witness statements, character references or expert reports without prior leave of the court.


Crucially, practitioners must manually verify every legal citation, academic authority, case and legislative reference. That verification cannot itself be completed using a generative AI program.


iii. A Pattern Emerging Across Jurisdictions


The NSW Practice Note is not an isolated response. In Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95, a lawyer submitted court documents containing AI-generated non-existent case citations and fabricated quotes from a Tribunal decision that did not exist. After admitting the error, the lawyer was referred to the Office of the NSW Legal Services Commissioner. Judge Skaros noted that the use of generative AI in legal proceedings is a live and evolving issue, and that it was in the public interest for the regulator to be made aware of such conduct as it arises.


The FCFCOA has also indicated it is considering its own guidelines and practice directions regarding AI use. Formal regulation is coming. The question for practitioners is whether they will be ahead of it or caught by it.


The Real Risks of Using AI in Family Law Documents


The risks are not hypothetical. They are playing out in courtrooms right now, and the consequences span costs, reputation, and professional registration.


i. AI Hallucinations: When AI Invents the Law


AI hallucinations occur when a generative AI program produces confident-sounding but entirely fabricated information, including case names, citations, quotes and legislative references that do not exist. In a legal context, this is not a minor technical glitch. It is a submission to the court that misleads the bench and opposing counsel, and undermines the administration of justice.


The courts in both Mertz & Mertz and Valu confirmed that submitted documents contained non-existent authorities. In both instances, the practitioners faced regulatory referral and financial penalties.


ii. Confidentiality and Legal Professional Privilege


There is a risk that entering draft documents into an AI program could lead to breaches of rules in respect of material produced under subpoena and give rise to a waiver of legal professional privilege. Once confidential information is entered into many commercially available AI tools, the practitioner loses control over how that data is stored, used, or shared.


In family law matters, which routinely involve sensitive financial disclosures, parenting histories, and personal correspondence, this risk is particularly acute.


iii. Responsibility Falls on You, Not the AI


The use of AI does not absolve the author and the responsible person from their professional or ethical obligations to the Court. This was made explicit in Mertz & Mertz. The solicitor accepted full responsibility for her paralegal's use of AI, even whilst denying she had used it herself.


There is no delegation defence. Whether it was you, your paralegal, or an automated tool that produced the error, the responsible legal practitioner carries the liability.


What AI Can and Cannot Be Used For in Family Law


Not all AI use is prohibited. Understanding the boundary between permitted and prohibited use is now a core professional competency for legal practitioners using AI in Australia.

Permitted AI use:
- Generating chronologies and timelines
- Summarising lengthy documents or transcripts
- Creating indexes, briefs, and witness lists
- Preparing Crown Case Statements
- Reviewing and condensing large document sets

Prohibited AI use:
- Drafting affidavits or witness statements
- Preparing expert reports (without court leave)
- Submitting citations without manual verification
- Inputting subpoena material into AI programs

These guidelines are drawn from the NSW Supreme Court Practice Note and the judicial statements made in Mertz & Mertz and Helmold & Mariya. Practitioners in all jurisdictions are strongly encouraged to apply them regardless of whether a specific practice direction has been issued in their court.


What Practitioners Must Do Right Now


The courts have set out clear expectations. Practitioners who engage with AI in legal practice must treat these not as aspirational standards, but as minimum professional obligations.


i. Verify Every Citation Manually


If AI is used to identify authorities, the author and those accepting responsibility for the document must verify that those authorities are relevant and accurate. That verification must be done by a qualified human, using primary legal sources. Using AI to verify AI output does not meet the standard.


ii. Train and Supervise All Staff


If AI is used more widely, for example to prepare the text of submissions, create footnotes or prepare a chronology, the same responsibility for accuracy and relevance arises. Training, supervision, and clear firm-wide policies on AI use must be in place before a matter reaches the court.


iii. Know What Cannot Be Entered Into AI


Material produced under subpoena, documents that could attract legal professional privilege, and any content intended to reflect a witness's evidence or opinion must not be entered into a generative AI program unless the practitioner is satisfied it will remain in a fully controlled environment.


Conclusion


AI is not banned from legal practice in Australia, but it comes with serious obligations that cannot be ignored. The cases of Mertz & Mertz and Valu, together with the NSW Supreme Court Practice Note, draw a clear line: AI is a starting point only, never a finish line. Every citation must be verified. Every staff member must be trained. Every document must reflect the practitioner's own professional judgement.


The courts are not waiting for practitioners to catch up. Formal guidelines across all jurisdictions are on the way, and referrals to regulators are already happening. If you are involved in a family law matter and need advice from experienced, accredited specialists, contact Norton Law Group today for a free 30-minute consultation to help you understand your position and your options.


Frequently Asked Questions


Can a lawyer use AI to prepare court documents in Australia?


Yes, but with significant limitations. AI may be used for tasks such as generating chronologies, summarising documents, and creating indexes. It must not be used to draft affidavits, witness statements, or expert reports without court leave. All citations must be manually verified. The lawyer remains fully responsible for all content, regardless of how it was generated.


What are the consequences of submitting AI-generated errors to the court?


The consequences can include substantial costs orders against the practitioner personally, referral to the relevant state legal regulator, and potential disciplinary action. In Mertz & Mertz, the solicitor was ordered to pay $10,000 in additional costs and was referred to the Legal Services Commissioner. Professional standing and client trust are also at significant risk.


Does the NSW Supreme Court Practice Note apply in all Australian jurisdictions?


No. The Practice Note applies specifically to the NSW Supreme Court. However, the principles within it reflect the direction in which all Australian courts are moving. Practitioners across all jurisdictions are strongly advised to adopt these standards now, in advance of formal rules being introduced in their own courts.


Is the lawyer responsible if their paralegal used AI?


Yes. As Mertz & Mertz confirmed, the supervising solicitor accepts full responsibility for all AI use within their matter, including use by paralegals and support staff. Denying personal use of AI is not a defence if the error reached the court under the practitioner's name.
