With over 50% of businesses across Australia utilising artificial intelligence (AI) technologies, there is no doubt that the business landscape is continually evolving as companies increasingly adopt AI tools across their operations to achieve time efficiencies and cost savings.

With AI technologies such as OpenAI’s ChatGPT, Anthropic’s Claude and Google’s Gemini, many organisations, from professional services through to construction, start-ups and more, are using these tools to generate contracts. These contracts are varied and include general business contracts, vendor/supplier agreements, service agreements, terms and conditions, employment and subcontractor agreements, as well as complex commercial contracts.

However, whilst AI-generated contracts may appear comprehensive, professional and legally sound, it is imperative that business owners and operators have a solid understanding of AI’s limitations, especially with regard to contracts. These AI-generated documents are very likely to lack an in-depth understanding of the transaction, and they carry significant risks for the business.

AI tools generate contracts by analysing patterns in millions of documents, producing language that mimics real agreements but lacks genuine legal understanding.

This creates a false sense of security. While the document appears comprehensive and professional, it often fails to address specific business needs or comply with Australian law, leaving your business vulnerable to unnecessary risks.

A qualified lawyer can help you identify and rectify any gaps, weaknesses or non-compliant conditions in an AI-generated contract, helping to protect your business.

What are the risks of AI-generated contracts?

There are a multitude of risks inherent in generating contracts with AI tools. Outlined below are some of the most common risks a business may face when creating and enforcing AI-generated contracts:

1. Missing or Misleading Clauses

Quite often, AI-generated contracts omit crucial terms and conditions such as intellectual property ownership, confidentiality provisions, dispute resolution processes, limitation of liability clauses and human resource provisions (such as termination, leave entitlements and equity arrangements), as well as other regulatory obligations. The unfortunate result is that any missing or misleading clauses have the potential to render an agreement unreliable or void, as contracts need to be accurate, clear and compliant to be enforceable.

2. A Generic Approach

AI generates content via pattern and language prediction. It has no context or material understanding of a particular business or its goals, challenges, industry or any factor that may pertain to the particular contract being generated. Essentially, AI-generated contracts do not incorporate any level of personalisation to ensure that the document is aligned to the needs of a particular business, its people, or to effectively protect its underlying interests.

3. Outdated Information

AI models don’t always reflect current Australian legislation or case law and may misrepresent legal obligations, cite old legislation or overlook recent regulatory changes or case law. This poses a substantial risk that the AI-generated contract may include outdated information which renders it non-compliant and potentially unenforceable.

4. Jurisdictional Mistakes

AI-generated contracts may mistakenly reference the wrong jurisdiction when citing regulations, from applying foreign regulations instead of Australian laws to incorporating the wrong state-specific requirements, putting a business or individual in a vulnerable position in the event of a dispute.

5. Regulation Requirements

Australian business contracts are subject to multiple areas of legal regulation, including the Australian Consumer Law (ACL), the Privacy Act 1988 (Cth) and the Fair Work Act 2009 (Cth). Regardless of whether you prompt AI to ensure the contract adheres to these laws, there is no guaranteed method to ensure AI will apply the correct and relevant information, resulting in exposure to risk.

6. Incorrect Inputs, Incorrect Outputs

An AI-generated contract is only as compliant and complete as the prompt that creates it. If the instructions entered are ambiguous or incomplete, the output will reflect this. This creates a significant risk of miscommunication, misleading information and omitted clauses, culminating in an ineffective and non-compliant contract.

7. Legal Drift

‘Legal drift’ occurs where minor AI edits re-allocate risk without notice. For example, an AI-generated contract may include phrases such as “subject to the buyer’s or supplier’s discretion” where this was not intended, or it may reference old or non-existent standards. Legal drift can prove insidious as it often goes unnoticed until a dispute arises, at which point the business may discover the contract has shifted liability, created compliance failures, diminished protections or is unenforceable.

8. Confidentiality Concerns

Using AI tools to create contracts carries inherent confidentiality risks, particularly under Australia’s Privacy Act 1988 (Cth) and the Australian Consumer Law, as it can lead to unintended data exposure or breaches.

AI platforms, especially public or cloud-based ones like ChatGPT, often retain user inputs for training or improvement, potentially incorporating your confidential data (e.g. trade secrets, client details or contract terms) into models accessible to others. Not only can this violate confidentiality obligations, it could also expose your data to breaches or unintended sharing.

9. Hallucinations

A concerning risk of AI-generated contracts is the phenomenon of “hallucinations”. This occurs when AI invents legal authorities, case citations or regulations that don’t exist. Research indicates that general-purpose AI tools (such as those mentioned above) display a high rate of hallucinations on legal queries, making AI-generated contracts unreliable: even the most well-structured contract may include fabricated legal references which undermine its validity and expose a business to legal challenges.

In fact, these risks are reflected in the growing number of Australian courts and tribunals encountering cases where AI-drafted agreements have resulted in disputes, compliance failures or unenforceable terms.

The benefits of AI-generated contracts are further eroded when one considers the overall time and cost involved in what appears to be a ‘cost and time saving exercise’. Once the research, drafting, prompting and legal revisions required are accounted for, creating a contract with AI tools is often more time- and cost-intensive than engaging a qualified lawyer to draft it.

Whilst AI can be a helpful tool in the contract process, for example for initial outlines, first drafts, checklists, or creating a brief for your lawyer, it is crucial to have a qualified lawyer review any AI-generated outputs to ensure you haven’t exposed yourself or your business to potential financial penalties, weakened legal positions or reputational damage.

Furthermore, when it comes to complex legal documents, engaging a qualified lawyer is the optimal way to protect yourself and your business.

By contrast, AI holds no accountability or liability: the platforms disclaim responsibility, software providers offer no insurance, and businesses cannot pursue negligence claims. This lack of accountability is a primary reason why courts and tribunals, including the Federal Court of Australia and NCAT, advise against relying on AI for contract drafting without legal review.

Whilst there is no doubt that AI will continue to evolve, current AI tools lack the insight, accuracy, contextual awareness, personalisation and legal expertise required to create a contract that protects you and your business.

Therefore, investing in professional legal advice upfront is far less costly than resolving disputes stemming from an AI-generated contract that delivers little more than a false sense of security. Your business is worth more than that.

If you have any questions or would like some guidance regarding your contracts, feel free to reach out to our team at Antcliffe:Scott. Our team is more than happy to help you.

AI-Contract Checklist

Below are eight helpful tips for using AI to generate contracts for your business:

1. Establish the parameters of AI use

  • Use AI for: first drafts of simple clauses, plain‑language summaries, comparing versions, brainstorming options.
  • Do not use AI for: final versions of key contracts (major customers, investors, IP, employment), unfamiliar areas of law, or high‑risk/regulated deals.

2. Always have a human owner

  • Assign a named person (usually the business owner, GM or in‑house legal) to be responsible for each important contract.
  • Make it clear that AI is just a tool; the human owner must read and understand every clause before it’s sent or signed.

3. Protect confidentiality and privacy

  • Do not upload customer lists, prices, payroll data, trade secrets, designs or deal strategy into public AI tools.
  • Opt for business or enterprise versions of AI tools with proper security and privacy terms, or strip out names, addresses and identifiers when using AI tools.
  • Check that using AI does not breach any NDAs or privacy promises you have already given to others.

4. Anchor AI to good templates

  • Start from templates drafted or checked by a lawyer; let AI suggest edits, not design the whole contract from scratch.
  • Keep a small library of “approved” clauses (e.g. payment, IP ownership, liability, termination) and compare AI suggestions against those.
  • If AI adds something you don’t recognise, either remove it or seek legal advice before keeping it.

5. Check for Australian law and fairness

  • Confirm the contract refers to Australian law and the correct state or territory (e.g. “New South Wales”) where appropriate.
  • Watch for unfair terms (one‑sided indemnities, extreme penalties, hidden auto‑renewals) that might breach Australian Consumer Law or employment rules.
  • If you can’t explain a clause in plain language, get it reviewed by a lawyer.

6. Keep a simple approval process

  • For important contracts, require two sets of eyes: drafter plus another manager or advisor.
  • Use a short checklist before signing: parties correct, scope clear, price and payment correct, term and termination clear, IP and confidentiality understood, dispute process specified.
  • Save the final version, plus a note that it was reviewed by a human, in a central folder.

7. Train your team (light‑touch)

  • Give staff a one‑page guide on when they can use AI, what they must never paste into it, and who must sign off on contracts.
  • Explain that AI can “sound” confident but still be wrong or out of date, so they must never assume it is legally correct.
  • Encourage them to ask questions if they do not understand a clause, rather than trusting the tool.

8. Seek legal advice and review

  • Have your AI-generated contract reviewed by a lawyer to ensure it protects you and your business, and is clear, compliant and enforceable.
  • Engage a lawyer for complex and commercial contracts. This will significantly reduce the risk of costly legal action and reputational damage.