Every law firm, accounting practice and financial services firm in Australia is now using AI. Most of them are doing it without a policy, without a register, and without thinking about what happens when a partner pastes a client file into ChatGPT on their home laptop. This white paper gives you the playbook to fix that in 90 days.
Executive summary
Between now and the end of 2026, three regulatory shifts will change what is expected of every Australian professional services firm. The Privacy Act's new transparency rules for automated decision-making take effect on 10 December 2026. APRA CPS 230 contract compliance falls due by 1 July 2026, hardwiring operational risk expectations into contracts with AI vendors. And the Cyber Security Act 2024 is already live, with mandatory ransomware payment reporting for any entity over $3 million in annual turnover.
On top of those legislative changes, the professional bodies have moved. APES 110 confidentiality obligations and APES 320 Quality Management System requirements now apply directly to how your firm uses AI. The Victorian Legal Services Board has already revoked a practising certificate over AI misuse. A US federal court ruled in February 2026 that AI-processed documents may lose legal professional privilege, and Australian courts are watching closely.
The good news: none of this is hard to get right. The firms that will thrive through 2026 are not the ones with the most sophisticated AI tools. They are the ones with clear boundaries, a written policy, and a small number of well-implemented controls.
The three things that matter most
1. Stop shadow AI. Your staff are already using ChatGPT, Claude, Gemini and Copilot. The question is whether they are doing it inside your controls or outside them.
2. Put a written AI Use Policy in place. APES 320 requires it for accountants. The AICD expects it for directors. Your PI insurer will ask for it after the next claim.
3. Build one tenant, one policy, one audit trail. A correctly configured Microsoft 365 environment with Copilot does more for compliance than ten separate AI tools.
Part 1. Why this matters now
AI adoption in Australian professional services has gone from novelty to normal in eighteen months. The 2026 Thomson Reuters AI in Professional Services Report found that 40 percent of Australian professionals are now using generative AI at work, with more than 80 percent of users engaging with it weekly. The CPA Australia Business Technology Report 2025 put AI use in Australian businesses at 89 percent, up from 69 percent the year before.
The regulatory response has been slower, but it has arrived. The Australian approach has not copied the EU AI Act. Instead, existing frameworks have been updated to cover AI use. That matters because it means there is no single “AI law” to read. The obligations are scattered across the Privacy Act, APES 110, APES 320, CPS 230, the Cyber Security Act 2024, and regulator guidance from the OAIC, APRA, ASIC, AUSTRAC and ASD.
For professional services firms, this creates a specific risk. Your duty of confidentiality is absolute. Your professional standards do not make allowances for tools that leak data. Your clients expect that if they hand you privileged information, it stays privileged. AI challenges all three of those assumptions in ways that are easy to miss until something goes wrong.
What changed in the last 12 months
- The Cyber Security Act 2024 received Royal Assent in November 2024, introducing mandatory ransomware payment reporting for entities over $3 million in turnover.
- The Privacy Act Tranche 1 amendments came into effect, with the ADM transparency rule scheduled for 10 December 2026.
- APRA published updated guidance on AI risk management under CPS 230, with a hard deadline of 1 July 2026 for vendor contract compliance.
- The OAIC issued formal guidance on privacy and commercially available AI products.
- CPA Australia and Chartered Accountants ANZ confirmed that AI use falls squarely within the APES 110 Code of Ethics.
- A Victorian lawyer lost their principal practising certificate after submitting AI-generated material with fabricated case citations.
- The US federal court decision in United States v Heppner (February 2026) found that AI-processed material may lose privilege. Australian courts are already citing it.
Part 2. The 2026 regulatory stack
Seven frameworks. One firm. No single law to read. This section maps the obligations that apply to your firm. Not all of them apply to every practice. The table below is a quick reference; each framework is then explained in plain terms.
| Framework | Who it applies to | What it requires | Key date |
|---|---|---|---|
| Privacy Act 1988 | Any firm handling personal information | Privacy policy updates, automated decision-making transparency, OAIC breach notification | 10 Dec 2026 |
| APES 110 Code of Ethics | All CPA, CA and IPA members | Confidentiality, professional competence, transparency with clients on AI use | In force |
| APES 320 QMS | All public practice accountants | Documented Quality Management System, including a written AI Use Policy | In force |
| APRA CPS 230 | AFSL and ACL holders | Vendor risk management, operational resilience, AI supply chain contracts | 1 Jul 2026 |
| Cyber Security Act 2024 | Entities over $3M turnover | Mandatory ransomware payment reporting within 72 hours | In force |
| Essential Eight | Recommended baseline for all firms | Eight mitigation strategies. ASD recommends Maturity Level 2 or above | Voluntary |
| Voluntary AI Safety Standard | All AI deployers | Ten guardrails covering testing, transparency, accountability | Voluntary |
Privacy Act reforms and the ADM rule
The Privacy Act reforms that became law in late 2024 introduced the first major update to Australian privacy law in a decade. The most significant change for professional services firms takes effect on 10 December 2026: the Automated Decision-Making transparency rule.
In practice, this means your privacy policy must disclose whether you use AI systems to make, or substantially contribute to, decisions that affect individuals. For a law firm that uses AI to triage matters, an accounting firm that uses AI to flag audit anomalies, or a financial planning firm that uses AI to generate advice summaries, this is a live obligation.
The OAIC has made clear that the obligation sits with the entity using the AI system, not the vendor. You cannot contract it away.
APES 110 and the confidentiality problem
APES 110 Section 114 requires members to respect the confidentiality of information acquired as a result of professional and business relationships. CPA Australia has confirmed, through its Centre of Excellence in Ethics and Professional Standards, that uploading client data into a public-facing AI tool is not acceptable. The reasoning is simple: if the tool does not guarantee confidentiality, using it is a breach.
What counts as public-facing is the important question. The free version of ChatGPT is public-facing. The free version of Claude is public-facing. The consumer version of Gemini is public-facing. A correctly licensed Microsoft 365 Copilot deployment inside your tenant is not, because the data stays inside your Microsoft environment and is not used for model training.
APES 320 and the Quality Management System
APES 320 requires public practice accountants to maintain a documented Quality Management System. Since the 2024 update, this explicitly includes technology risks and the policies governing them. An AI Use Policy is now a required component of a compliant QMS for any accounting firm using generative AI.
The minimum the policy must cover: what tools are approved, what data can and cannot be put into them, how outputs are verified, who is accountable, and how staff are trained.
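As an illustration only, the same five elements can be captured as a machine-readable register entry that doubles as the firm's approved tools list. The schema and field names below are our own, not a format prescribed by APES 320:

```python
from dataclasses import dataclass, field

@dataclass
class ApprovedTool:
    """One entry in a firm's AI tool register (illustrative schema only)."""
    name: str                         # the approved tool
    permitted_data: list[str]         # data classes staff may enter
    prohibited_data: list[str]        # data classes that must never be entered
    output_verification: str          # how outputs are checked before use
    accountable_owner: str            # named role accountable for the tool
    training_completed: bool = False  # have staff been trained on its rules?

# Hypothetical entry for a governed Copilot deployment.
copilot = ApprovedTool(
    name="Microsoft 365 Copilot",
    permitted_data=["internal documents", "client matter files"],
    prohibited_data=["credentials", "data under a court suppression order"],
    output_verification="Human review of every client-facing output",
    accountable_owner="Practice Manager",
)
```

A register in this shape makes the quarterly review in Part 6 mechanical: filter for tools with `training_completed` still false, or with no named owner.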
APRA CPS 230 and AI vendor risk
CPS 230 applies to APRA-regulated entities, and it has become the de facto benchmark for firms holding an Australian Financial Services Licence or Australian Credit Licence. It came fully into force on 1 July 2025, with a transitional provision that allows pre-existing contracts to continue until renewal or 1 July 2026, whichever comes first.
The standard treats AI vendors the same as any other material service provider. Your contracts must address resilience, sub-contracting, data location, incident notification, and termination rights. Most AI vendor terms of service do not meet CPS 230 out of the box. If you hold an AFSL or ACL, this is the item most likely to catch your next audit.
The Cyber Security Act 2024
The Cyber Security Act 2024 is Australia’s first standalone cyber security legislation. Three provisions matter for professional services firms:
- Mandatory ransomware payment reporting for entities over $3 million in turnover. The 72-hour reporting window applies from the moment a payment is made, not from the moment of the incident.
- Limited use protections for information voluntarily shared with the National Cyber Security Coordinator during an incident. This is a material improvement on the previous position.
- The Cyber Incident Review Board, which will publish reviews of significant incidents. Firms named in reviews should expect client and insurer questions.
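The detail firms most often get wrong is the clock: the 72 hours run from the payment, not from the incident. A minimal sketch of the deadline arithmetic, using illustrative dates:

```python
from datetime import datetime, timedelta, timezone

# 72-hour window under the Cyber Security Act 2024 ransomware reporting rules.
REPORTING_WINDOW = timedelta(hours=72)

def report_deadline(payment_made_at: datetime) -> datetime:
    """Deadline for the mandatory ransomware payment report.

    The window runs from when the payment is made (or the entity becomes
    aware it was made), not from the start of the incident.
    """
    return payment_made_at + REPORTING_WINDOW

# Illustrative: incident begins 1 March, but payment is not made until 5 March.
incident_start = datetime(2026, 3, 1, 9, 0, tzinfo=timezone.utc)
payment_made = datetime(2026, 3, 5, 14, 0, tzinfo=timezone.utc)

deadline = report_deadline(payment_made)  # 8 March, not 4 March
```

In this example the report is due 8 March; a firm counting 72 hours from the incident would wrongly believe it had already missed the window on 4 March.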
Part 3. The four real-world risks we see in firms
The risks below are not hypothetical. Each one has played out in an Australian firm in the last twelve months.
Risk 1. Shadow AI
A tax partner in a mid-sized Sydney accounting firm drafted a client memo using the free version of ChatGPT. The memo contained client names, financial figures, and a reference to a specific ATO audit. None of it was malicious. The partner was efficient and the client got a good answer. But under the then-current terms of service, the content of the prompt was available to train OpenAI's models, and the firm had no way to recall it.
This is shadow AI. It is the single biggest risk in Australian professional services today, and it is happening in every firm we have audited. The fix is not to ban AI, which does not work. The fix is to give staff a sanctioned alternative with equivalent or better capability, inside a governed environment.
Risk 2. Privilege waiver
In February 2026, the US federal court in United States v Heppner ruled that documents processed through AI tools may lose attorney-client privilege. The reasoning: privilege depends on confidentiality, and sending material through an AI tool that stores or learns from inputs breaks confidentiality.
Australian courts have not yet ruled on the point, but they are watching. Hamilton Locke and several other Australian firms have already published board-level warnings about the risk. For law firms and the in-house counsel they advise, this is a live exposure.
Risk 3. Hallucinated authority
The Victorian Legal Services Board stripped a practitioner of their principal practising certificate in 2025 for submitting AI-generated material to a court that contained citations to cases that did not exist. The pattern has now been seen in three Australian jurisdictions and multiple accounting engagements.
Hallucinated authority is not an AI failure. It is a supervision failure. The professional standard is clear: the person sending the advice is responsible for every word of it, regardless of how it was produced.
Risk 4. Supply chain compromise
The MOVEit breach in 2023 and the subsequent wave of third-party compromises through 2024 and 2025 made one thing clear: your firm is only as secure as your least secure vendor. For professional services firms, the vendors of highest concern are not the obvious ones. They are the document management system, the practice management system, the AI transcription tool for client meetings, and the e-signature platform.
CPS 230 responds to exactly this risk by requiring firms to map critical operations to the vendors that support them, and to have plans in place for vendor failure. For firms not regulated by APRA, it is still the right framework to use.
Part 4. The playbook: ten controls every firm should have in place
These are the ten controls we implement for professional services clients. Together they cover the Privacy Act, APES 110, APES 320, CPS 230, the Cyber Security Act 2024, and Essential Eight ML2. Implementation time for a typical 20 to 100 staff firm is 8 to 12 weeks.
Part 5. Self-assessment scorecard
Score your firm honestly against the ten controls. Give yourself 1 for “fully implemented”, 0.5 for “partially”, and 0 for “not in place”.
- 8 to 10: You are ahead of the curve. Focus on continuous improvement and supplier assurance.
- 5 to 7: You have the core in place. Fill the gaps before your next PI renewal or audit.
- 2 to 4: You are exposed. A single shadow AI incident could trigger a notifiable data breach. Start with Controls 1, 2 and 3 this quarter.
- 0 to 1: You are where most Australian firms were twelve months ago. The good news: a focused 12-week program gets you to 7+.
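The scoring above is simple enough to sanity-check in a few lines. A sketch, assuming ten controls each scored 1, 0.5 or 0, with the half-point totals between bands rounded down into the lower band's advice (the band labels are our shorthand for the descriptions above):

```python
def scorecard_band(scores: list[float]) -> str:
    """Map ten control scores (1, 0.5 or 0 each) to a scorecard band."""
    assert len(scores) == 10 and all(s in (0, 0.5, 1) for s in scores)
    total = sum(scores)
    if total >= 8:
        return "Ahead of the curve"
    if total >= 5:
        return "Core in place"
    if total >= 2:
        return "Exposed"
    return "Starting point"

# Four controls fully implemented, three partial, three missing: total 5.5.
band = scorecard_band([1, 1, 1, 1, 0.5, 0.5, 0.5, 0, 0, 0])
print(band)  # prints "Core in place"
```

A firm at 5.5 sits in the middle band: the core is there, but the three missing controls should be closed before the next PI renewal or audit.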
Part 6. What good looks like in 90 days
The firms that move fastest on this are not the biggest. They are the ones with a clear owner and a three-month plan. Here is the plan we run with clients.
Weeks 1–4
Discover and decide
- Run a shadow AI audit across staff and Microsoft 365 logs.
- Decide on the approved tools list with leadership.
- Draft the AI Use Policy from the Appendix A template.
- Identify the AI owner (practice manager, CFO or IT lead).
Weeks 5–8
Configure and roll out
- Deploy Microsoft 365 Copilot with Australian residency, DLP and Conditional Access.
- Send the vendor questionnaire to every AI tool in use.
- Update privacy policy and engagement letter template.
- Build the AI inventory in SharePoint.
Weeks 9–12
Train and operate
- Run staff training and collect attestations.
- Run a tabletop exercise against an AI incident scenario.
- Deliver the first quarterly board or partnership report.
- Schedule the next review for 90 days out.
If you take one thing from this paper
Do not let perfect be the enemy of good. A firm with a written policy, an approved tools list, and Copilot configured correctly is in the top 20 percent of Australian professional services firms on AI governance. You do not need to be first. You need to be defensible.
Frequently asked questions
Can Australian law firms use ChatGPT with client data?
No. The free and consumer versions of ChatGPT are public-facing AI tools. Using them with client data risks waiving legal professional privilege, breaches the confidentiality duty (including APES 110 where accountants are engaged on the matter), and creates a notifiable data breach risk under the Privacy Act. The safe alternative is Microsoft 365 Copilot deployed inside the firm's own tenant with Australian data residency.
Does an Australian accounting firm need an AI Use Policy?
Yes. Under APES 320 Quality Management Systems, a documented AI Use Policy is now a required component of a compliant QMS for any accounting firm using generative AI. The policy must cover approved tools, prohibited data, human review requirements, staff training, and consequences of breach.
What is the Privacy Act ADM rule that takes effect in December 2026?
From 10 December 2026, the Privacy Act requires entities to disclose in their privacy policy whether they use AI systems to make, or substantially contribute to, decisions that affect individuals. The obligation sits with the entity using the AI system, not the vendor. Professional services firms using AI for matter triage, audit anomaly detection, or advice generation are within scope.
What does APRA CPS 230 require for AI vendors?
CPS 230 treats AI vendors as material service providers. Contracts must address resilience, sub-contracting, data location, incident notification, and termination rights. Existing contracts must comply by their next renewal or 1 July 2026, whichever comes first. Most AI vendor standard terms do not meet CPS 230 out of the box and require a negotiated schedule.
Does AI-processed material lose legal professional privilege?
The question was tested in the US federal case United States v Heppner in February 2026, where the court found that material processed through AI may lose the confidentiality required to sustain privilege. Australian courts have not yet ruled, but are actively watching. Using AI tools inside a controlled environment where data is not used for external model training is the current best-practice protection.
When does the mandatory ransomware payment reporting obligation apply?
Under the Cyber Security Act 2024, entities with annual turnover over $3 million must report any ransomware payment to the Department of Home Affairs and the Australian Signals Directorate within 72 hours of the payment being made. The rules took effect on 30 May 2025. Reporting is through the ReportCyber portal.
How long does it take to get a professional services firm AI-compliant?
A typical 20 to 100 staff firm can implement all ten core controls in 8 to 12 weeks. The program runs in three phases: discover and decide, configure and roll out, then train and operate. The program includes a shadow AI audit, an approved tools list, Microsoft 365 Copilot deployment, staff training, and a quarterly board report.
Is Microsoft 365 Copilot safe for a law firm to use?
Yes, when deployed correctly. A Microsoft 365 Copilot deployment inside the firm's tenant with Australian data residency, Purview data loss prevention, and Conditional Access requiring multi-factor authentication is consistent with the APES 110 confidentiality standard and the OAIC guidance on commercially available AI products. Data stays inside the Microsoft 365 environment and is not used to train external models.
Download the 2026 playbook (PDF)
30 pages. Ten controls. Three sample appendices including a ready-to-adapt AI Use Policy, engagement letter disclosure wording, and a 12-question vendor due diligence questionnaire.
Book a 90-minute AI governance assessment
Complimentary. Written gap analysis against the ten controls. No obligation to continue.
About All IT Services
All IT Services is an Australian managed IT and cybersecurity provider headquartered on Sydney’s Northern Beaches, with offices in Brisbane, Melbourne and Central West NSW. We have been protecting Australian businesses since 2005.
We work with professional services firms across law, accounting, financial planning and advisory. Our clients include regulated entities under APRA, ASIC and the Legal Services Commission. We hold Microsoft Solutions Partner designations across the security, modern work and Azure workloads, and we maintain our own Essential Eight posture at ML2 or above.
This white paper is general information only. It is not legal, accounting or financial advice. Firms should seek advice from qualified professionals on their specific circumstances before relying on any of the content in this paper.
