21 October 2025

Five key risks of artificial intelligence in family law

Authored by: Justine Woods and Craig Turvey
Artificial intelligence is beginning to appear in Australian courtrooms. While it may create efficiencies, recent cases show the real dangers of lawyers or parties relying on it too heavily.

Introduction

AI has the potential to streamline legal processes and reduce costs. In family law, however, accuracy, privacy and judgment are too important to leave to machines. Recent Australian cases highlight why lawyers and clients must remain cautious.

Hallucinations and inaccurate case law

The most obvious danger is AI’s tendency to ‘hallucinate’. This occurs when an AI tool invents cases or misstates facts about real cases.

Australian courts have already encountered this. In 2025, a Melbourne lawyer used AI to generate case citations in a family law matter. Many of the cases did not exist, the hearing had to be adjourned, and the lawyer was referred to the Victorian legal complaints body for investigation. In another matter, a junior solicitor in a Federal Court native title case relied on AI-produced citations that were fabricated or inaccurate, and the Court ordered indemnity costs against the law firm. Similarly, a Victorian King’s Counsel apologised to the Supreme Court after filing submissions in a murder case that included fake AI-generated quotations.

These incidents show the real risks that come from relying on AI without checking every citation against the primary source.

Lack of context and human judgment

AI processes information quickly but does not understand human context. Parenting disputes, allegations of family violence and decisions about children’s best interests require sensitivity and judgment that AI cannot provide.

An AI tool might draft parenting arrangements that look even-handed on paper, such as alternating weeks with each parent. Yet it may fail to account for a history of subtle family violence that makes such an arrangement unsafe. No machine can replace the professional judgment needed to weigh these considerations properly.

Australian regulators have already warned about this risk. While no family law judgment has yet turned on AI’s lack of context, disciplinary decisions reflect the danger of outsourcing judgment to machines.

Privacy and data security risks

Using AI usually requires uploading personal or financial information into external systems. In family law, that information may include sensitive financial records, medical reports or details about children.

Australian regulators have reminded practitioners that uploading confidential client material into uncontrolled AI platforms risks breaching both professional duties and privacy law. In a field where the protection of personal information is paramount, this risk is particularly acute.

Bias in training data

AI systems learn from the data they are trained on. If that data reflects bias, the AI will reproduce it.

For example, an AI tool trained mostly on overseas case law and used to prepare substantive court documents might underestimate the weight Australian courts place on shielding children from family violence, and instead emphasise equal time arrangements. That kind of bias risks reinforcing outcomes inconsistent with the way Australian family law operates.

Family law disputes frequently involve issues of gender, culture and economic imbalance. If AI tools reflect hidden bias, they may produce outcomes that disadvantage already vulnerable groups.

Overconfidence and misuse

AI delivers its outputs confidently, even when they are wrong. This creates a risk that users will trust those outputs without sufficient scrutiny.

In August 2025, a Western Australian lawyer submitted documents citing four cases that did not exist. He later admitted he had placed too much confidence in AI tools without verifying the sources. The Court referred him to the state regulator and ordered him to pay costs.

In family law, this type of overconfidence could lead to inaccurate affidavits, flawed advice and submissions, and potential negligence claims against lawyers. Parties who rely on AI directly for legal guidance may also be misled.

Concluding thoughts

AI may help in the future by streamlining routine processes and improving efficiency. But in family law, where decisions affect children and families at their most vulnerable, the consequences of overreliance are already visible in Australian cases.

Hallucinations, lack of judgment, privacy concerns, bias and overconfidence all highlight the need for caution. Family law requires a human lawyer’s expertise, care and judgment – qualities that no machine can yet replicate.

If you would like advice about how family law might affect your circumstances, please contact one of our experienced family lawyers at Cooper Grace Ward.


This publication is for information only and is not legal advice. You should obtain advice that is specific to your circumstances and not rely on this publication as legal advice. If there are any issues you would like us to advise you on arising from this publication, please let us know.


Key contacts

Justine Woods
Partner
Craig Turvey
Special Counsel
