
LM-G02 · GRC · Practitioner tier

Privacy Act 1988 (Cth) and the Australian Privacy Principles

APPs, NDB, the new statutory tort, and ADM transparency

15 min read · 30-question assessment · 3 scoring tiers (Foundation / Practitioner / Leader)

TheAICommand Learning Library

Module LM-G02


A practitioner module for Australian financial services. Privacy Act amendments (2024), the new statutory tort of serious invasion of privacy (2025), Automated Decision Making transparency (2026), and operating the framework with AI.

READING TIME: 23 minutes (4,995 body words at 220 wpm) · AUDIENCE TIER: Practitioner · ASSESSMENT: 30 MCQs in 25 to 30 minutes

Module Metadata

Field | Detail
Module ID | TAIC-LM-G02
Module title | Privacy Act 1988 (Cth) and the Australian Privacy Principles
Audience tier | Practitioner (suitable for Foundation entrants who complete Part 4 carefully and Leaders who use Sections 7 and 9)
Estimated reading time | 23 minutes module body at 220 wpm (4,995 body words). 25 to 30 minutes assessment.
Prerequisites | Working understanding of Australian financial services, basic data handling literacy, awareness of APRA prudential standards. Recommended companion modules: LM-G01 Corporations Act, LM-G07 CPS 234, LM-G05 CPS 230.
Cross-references | LM-G01 (Corporations Act), LM-G05 (CPS 230), LM-G07 (CPS 234), LM-G03 (AML/CTF), LM-G04 (ASIC Act)
Currency | Reflects regulatory position as at April 2026
Author voice | Australian English, neutral and direct, evidence-based

Learning outcomes

By the end of this module a learner will be able to:

  1. Explain (Bloom: Understand) the structure of the Privacy Act 1988 (Cth), the 13 Australian Privacy Principles, and the role of the OAIC.
  2. Apply (Bloom: Apply) the 2024 amendment package, including the statutory tort and the Automated Decision Making transparency obligations, to common Australian FS use cases.
  3. Analyse (Bloom: Analyse) a Notifiable Data Breach scenario and produce a defensible eligible data breach assessment within the 30-day statutory window.
  4. Evaluate (Bloom: Evaluate) AI tooling against APP 1, APP 6, APP 11 and the new ADM transparency obligations and select an appropriate deployment pattern.
  5. Create (Bloom: Create) a Privacy Impact Assessment outline and an ADM disclosure statement using a governed AI workflow with mandatory human-in-the-loop checkpoints.
  6. Justify (Bloom: Evaluate) the interaction between the Privacy Act, APRA CPS 234, and CPS 230 in a regulated financial services environment.

1. Executive Summary

The Privacy Act 1988 (Cth) is the principal Commonwealth statute governing personal information handling in Australia. It binds APP entities, including all Australian financial services providers because of the financial services carve-out from the small business exemption. The 13 Australian Privacy Principles (APPs) set lifecycle obligations from collection through to disposal. The Office of the Australian Information Commissioner (OAIC) is the regulator and now operates with materially expanded enforcement powers under the Privacy and Other Legislation Amendment Act 2024.

This matters for Australian FS for three reasons. First, customer trust depends on demonstrably lawful and transparent personal information handling, particularly across credit, claims, and identity. Second, the regulatory perimeter has tightened: a statutory tort of serious invasion of privacy is live from 10 June 2025, the Automated Decision Making transparency obligations commence 11 December 2026, and OAIC now has a tiered civil penalty regime plus an infringement notice power. Third, AI adoption multiplies the surface area of every APP obligation: models, cloud inference, and decision systems all sit inside the privacy perimeter.

After this module you will be able to:

  • Stand up a defensible privacy posture across the customer information lifecycle without invoking unnecessary legal escalation.
  • Run an APP-mapped impact assessment for a new product, vendor, or AI capability and identify the privacy controls that must be in place before go-live.
  • Operate a Notifiable Data Breach assessment within the statutory window and draft regulator and individual notifications.
  • Build a Claude or ChatGPT project space that supports privacy-by-design drafting, mapping, and review with mandatory de-identification, governance logging, and human review.
  • Translate the new ADM obligations into board-ready language and into the disclosure mechanisms that customers will see.

2. Regulatory and Strategic Context

Issuer and statutory authority

The Privacy Act 1988 (Cth) is administered by the OAIC, with the Privacy Commissioner leading privacy-specific functions including assessments, determinations, enforceable undertakings, infringement notices, and Federal Court proceedings. The Act sits inside a broader information policy stack including the Consumer Data Right under the Competition and Consumer Act 2010 (Cth), CPS 234, and the Identity Verification Services Act 2023 (Cth).

Scope of application in financial services

Most FS entities are captured regardless of turnover because section 6D of the Act excludes entities that provide a health service or hold credit reporting information from the small business exemption. Banks, insurers, superannuation trustees, AFSL and ACL holders, and brokers are all caught. Part IIIA and the Credit Reporting Code add specific obligations for credit providers and credit reporting bodies. APRA-regulated entities sit in an overlap zone where APP 11 is reinforced by CPS 234 and service provider arrangements by CPS 230.

Key dates and transitional periods

The Privacy and Other Legislation Amendment Act 2024 received Royal Assent on 10 December 2024. Immediate commencement provisions included the OAIC infringement notice power, the new tiered civil penalty regime, doxxing offences, and the APP 8 white-list mechanism for prescribed countries. The statutory tort of serious invasion of privacy commenced 10 June 2025 (Schedule 2). The Automated Decision Making transparency obligations commence 11 December 2026 (Schedule 1, amending APP 1) and require entities using computer programs to make or substantially make decisions that significantly affect individuals to update their privacy policies with prescribed disclosures. Treat 2025 to 2027 as an active reform window.

Tiered civil penalty regime (post-2024)

Three tiers now apply. Tier 1 (serious or repeated interference with privacy under s 13G): maximum civil penalty for a body corporate is the greatest of $50m, 30 percent of adjusted turnover for the relevant period, or 3 times the benefit obtained. Tier 2 (interference with privacy that is not serious or repeated): substantially lower civil penalty units. Tier 3 (specific administrative breaches): lowest tier. OAIC may also issue infringement notices for specific contraventions without court proceedings.
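The Tier 1 "greatest of" calculation can be sketched in a few lines. A minimal illustration of the statutory ceilings described above, not legal advice; how "adjusted turnover for the relevant period" is measured is a question for Counsel:

```python
def tier1_max_penalty(adjusted_turnover: float, benefit_obtained: float) -> float:
    """Tier 1 ceiling for a body corporate under the post-2024 s 13G regime:
    the greatest of $50m, 30 percent of adjusted turnover for the relevant
    period, or 3 times the benefit obtained."""
    return max(50_000_000, adjusted_turnover * 30 / 100, 3 * benefit_obtained)

# $1bn adjusted turnover and a $5m benefit: the 30 percent limb binds.
print(tier1_max_penalty(1_000_000_000, 5_000_000))  # 300000000.0
```

For smaller entities the flat $50m limb typically governs; for the largest, the turnover limb dominates (see Visual 5 below).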

Interplay with adjacent frameworks

  • APRA CPS 234 Information Security (LM-G07): operational implementation of APP 11. CPS 234 sets information security capability and incident notification timing (72 hours) that runs faster than the NDB clock.
  • APRA CPS 230 Operational Risk Management (LM-G05): personal information processing in critical operations, material service provider obligations, business continuity, tolerance levels.
  • AML/CTF Act 2006 (Cth) (LM-G03): KYC collection sits inside APP 3, tipping off provisions (s 123) constrain SMR discussion. AML data must still meet APP 11.
  • Corporations Act 2001 (Cth) (LM-G01): directors' duties oversight of privacy risk, Part 9.4AAA whistleblower (discloser) protections.
  • Consumer Data Right under the Competition and Consumer Act 2010 (Cth): separate Privacy Safeguards (PS1 to PS13) that mirror but are distinct from the APPs.

Visual 1: Regulatory authority map (Australian privacy stack)

Layered diagram with statutory authority flowing down and complaints flowing up.

Layer | Element | Designer notes
Top | Australian Parliament | Statutory authority source.
L2 | Privacy Act 1988 (Cth); Privacy and Other Legislation Amendment Act 2024 | Two parallel blocks, connector 'amends and extends'.
L3 | APPs 1 to 13; Part IIIA Credit Reporting; NDB scheme | Three blocks, SKY accent on NDB.
L4 | Subordinate instruments | Privacy (Credit Reporting) Code 2014, registered industry codes.
L5 | Regulator | OAIC. Enforcement levers as call-outs.
L6 | Adjacent regulators | APRA, ASIC, AUSTRAC, ACCC.
L7 | Courts and tribunal | Federal Court, ART.
L8 | Individual | Complaints entry point and statutory tort.

3. Core Concepts and Defined Terms

Defined terms

Term | Definition (simplified) | Source
Personal information | Information or an opinion about an identified individual, or about an individual who is reasonably identifiable, whether the information is true or not and whether recorded in a material form or not. | s 6(1) Privacy Act 1988 (Cth)
Sensitive information | A subset of personal information including health, racial or ethnic origin, political opinions, membership of a political association, religious beliefs, philosophical beliefs, membership of a professional or trade association, membership of a trade union, sexual orientation or practices, criminal record, genetic, and biometric information. | s 6(1) Privacy Act 1988 (Cth)
APP entity | An agency or organisation bound by the APPs. Includes most Australian FS entities under the financial services carve-out. | s 6(1) Privacy Act 1988 (Cth)
Eligible data breach | Unauthorised access to, disclosure of, or loss of personal information that is likely to result in serious harm to one or more individuals where remediation has not removed that risk. | s 26WE Privacy Act 1988 (Cth)
Likely to result in serious harm | Assessed objectively from the perspective of a reasonable person with regard to the kinds of information, sensitivity, security state, persons who could obtain it, and nature of the harm. | s 26WG Privacy Act 1988 (Cth)
Notifiable Data Breach (NDB) scheme | Mandatory regime requiring eligible data breach notifications to OAIC and affected individuals as soon as practicable. | Part IIIC Privacy Act 1988 (Cth)
Automated Decision Making (ADM) | Use of computer programs (including AI systems) to make a decision, or do a thing that is substantially and directly related to making a decision, that could reasonably be expected to significantly affect the rights or interests of an individual. | Sch 1 Privacy and Other Legislation Amendment Act 2024 (commences 11 December 2026)
Statutory tort of serious invasion of privacy | An actionable tort with five elements: (1) intrusion upon seclusion or misuse of private information, (2) the plaintiff had a reasonable expectation of privacy in all the circumstances, (3) the invasion was serious, (4) the invasion was intentional or reckless, and (5) the public interest in the plaintiff's privacy outweighs any countervailing public interest. | Sch 2 Privacy and Other Legislation Amendment Act 2024 (commenced 10 June 2025)
De-identified information | Personal information from which the identity of, and reasonable identifiability of, an individual is removed using accepted de-identification techniques. | OAIC De-identification Decision-Making Framework
APP Privacy Policy | A clearly expressed and up-to-date policy required by APP 1.3, describing the management of personal information, including categories collected, purposes, disclosure recipients, and access and correction mechanisms. New APP 1 sub-paragraphs added by Schedule 1 of the 2024 Act, commencing 11 December 2026, expand the privacy policy content to include disclosure of ADM use, the kinds of personal information used, and the kinds of decisions made. | APP 1.3 with new APP 1 sub-paragraphs, Sch 1 Privacy and Other Legislation Amendment Act 2024

The 13 Australian Privacy Principles in plain English

The APPs are organised around the personal information lifecycle. Practitioners should treat them as an ordered control framework rather than a checklist:

APPs 1 and 2 govern the entry point. APP 1 requires open and transparent management through a current privacy policy and named accountable officers. APP 2 grants the option to deal anonymously or pseudonymously where lawful and practicable; anonymity is rarely achievable in FS because of identity and AML obligations, but pseudonymity is more often available than entities assume.

APPs 3 to 5 govern collection. APP 3 limits collection of personal information to that reasonably necessary for, or directly related to, the entity's functions or activities, and restricts sensitive information collection to consent-based or specifically authorised circumstances. APP 4 deals with unsolicited personal information. APP 5 sets the notification obligations at the point of collection.

APP 6 governs use and disclosure. The default rule is primary purpose only. Secondary use is permitted where the individual would reasonably expect it and the secondary purpose is related (or directly related, for sensitive information). Exceptions cover consent, authorisation by law, enforcement, and serious threat. APP 7 limits direct marketing. APP 8 governs cross-border disclosure and requires reasonable steps to ensure overseas recipients meet APP-equivalent standards. The 2024 Act introduced a white-list mechanism for prescribed countries.
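The APP 6 default logic above can be sketched as a decision function. A simplified illustration only: the other exceptions (legal authorisation, enforcement, serious threat) are deliberately omitted, and any real assessment remains a Privacy Officer and Counsel question:

```python
def app6_secondary_use_permitted(consent: bool, reasonably_expected: bool,
                                 related: bool, directly_related: bool,
                                 sensitive: bool) -> bool:
    """Sketch of the APP 6 default rule: primary purpose only, unless consent
    or the APP 6.2(a) 'reasonably expected and related' pathway applies."""
    if consent:
        return True
    # Sensitive information needs a directly related secondary purpose;
    # other personal information needs a related one.
    required_relation = directly_related if sensitive else related
    return reasonably_expected and required_relation

# Related but not directly related secondary use of sensitive information fails:
print(app6_secondary_use_permitted(consent=False, reasonably_expected=True,
                                   related=True, directly_related=False,
                                   sensitive=True))  # False
```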

APP 9 restricts the adoption, use, and disclosure of government related identifiers such as tax file numbers. APPs 10 and 11 govern data quality and security. APP 11 is the operational backbone for FS: take reasonable steps to protect information from misuse, interference, loss, unauthorised access, modification, and disclosure, and destroy or de-identify when no longer needed. CPS 234 operationalises APP 11 for APRA-regulated entities and is more prescriptive.

APPs 12 and 13 govern individual rights. APP 12 grants access (with limited exceptions). APP 13 grants correction rights and requires reasonable steps to notify other recipients where correction has occurred.

From 11 December 2026, new APP 1 sub-paragraphs (Schedule 1, 2024 Act) require entities using computer programs to make decisions that significantly affect individuals to disclose that use, the kinds of personal information used, and the kinds of decisions made. This is transparency, not consent. Plan an APP 1 uplift through calendar 2026.

4. Practical Application in Australian FS

Four worked examples spanning ADI, insurance, superannuation, and AFSL settings. All identifiers are placeholders.

(a) ADI: Customer service AI assistant suggesting case actions

Trigger. [ADI Placeholder] deploys an AI assistant in retail banking that recommends next-best actions on hardship cases, drawing on transaction history and prior case notes.

Obligations. APP 1 (new APP 1 ADM disclosure from 11 December 2026), APP 3 (collection necessity), APP 6 (use limitation), APP 11 (security), CPS 234.

Artefact. PIA using the OAIC PIA structure, with ADM disclosure language ready for the privacy policy uplift. CPS 230 vendor risk assessment with residency and retention controls.

Audit trail. PIA signed off by Chief Privacy Officer and CISO, recorded in the privacy register. Model card maintained. ADM disclosure ledger. CPS 234 residual risk acceptance.

(b) General insurer: Claims triage with biometric voice analytics

Trigger. [Insurer Placeholder] pilots voice-print analytics to flag identity fraud on motor claims. Vendor is offshore and proposes training on aggregated samples.

Obligations. APP 3 (sensitive information consent for biometrics), APP 8 (cross-border), APP 11 (template security), new APP 1 ADM disclosure from 11 December 2026, and statutory tort exposure if used beyond disclosed purpose.

Artefact. Express, specific, current, voluntary and informed consent design. APP 8 reasonable steps assessment. ADM disclosure language. PIA rejecting the broader 'training on aggregated samples' use as outside primary purpose.

Audit trail. Consent log, vendor due diligence file, APP-equivalent contract clauses, biometric template destruction schedule, complaint pathway prepared.

(c) Superannuation trustee: Member-facing chatbot summarising statements

Trigger. [Trustee Placeholder] deploys a member chatbot that summarises last-statement balance and insurance entitlements on natural language queries.

Obligations. APP 1, APP 3 (collection minimisation), APP 6 (aligned to APP 5 notification), APP 11 (authentication), trustee best interests duty under the Superannuation Industry (Supervision) Act 1993 (Cth), CPS 234.

Artefact. Authentication tied to portal identity. Output filter blocking disclosure of sensitive information outside the authenticated member. Transcript retention policy. Out-of-scope rejection pathway.

Audit trail. Transcript retention log, authentication failure register, hallucination rate evaluation, quarterly Trustee Risk Committee reporting.

(d) AFSL holder: Adviser productivity copilot drafting Statements of Advice

Trigger. [Licensee Placeholder] rolls out a copilot that drafts SOAs from adviser fact-find data and notes.

Obligations. APP 6 (use limitation), APP 8 (if offshore), APP 11 (tenancy and security), best interests duty under the Corporations Act 2001 (Cth), new APP 1 ADM disclosure from 11 December 2026.

Artefact. System prompt prohibiting client direct identifiers. Fact-find redaction routine producing placeholder inputs. SOA reviewer checklist. Privacy policy update describing the copilot's role.

Audit trail. Redaction logs, version-controlled prompts, SOA reviewer sign-off log, model risk register entry against the Risk Appetite Statement.

Visual 2: NDB notification timeline (process diagram)

Step | Timing | Action | Owner
1 | T+0 (discovery) | Trigger Privacy Incident Response. Begin s 26WH assessment. | Privacy Officer / CISO
2 | T+1 to T+7 days | Contain, investigate, gather evidence. Engage forensics and counsel. | CISO + Privacy Officer
3 | T+7 to T+15 days | Apply s 26WG 'likely to result in serious harm' test. Document. | Privacy Officer + Risk Committee
4 | T+25 days max | Reach decision: eligible / not eligible / further investigation. | Privacy Officer
5 | Within 30 days | If eligible, notify OAIC and affected individuals. | Privacy Officer + Comms
6 | CPS 234: 72 hours | If APRA-regulated and material, notify APRA. Runs parallel and faster than NDB. | CISO
7 | Post-event | Post-incident review, control uplift, KRI calibration, board reporting. | Risk Committee
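The two clocks in the timeline can be computed mechanically from a single discovery time. A sketch, assuming the entity treats discovery as T+0 for both regimes:

```python
from datetime import datetime, timedelta

def incident_clocks(discovery: datetime, apra_regulated: bool = True) -> dict:
    """Deadlines from discovery: the Part IIIC s 26WH assessment window
    (30 days) and, for APRA-regulated entities, the CPS 234 material
    incident notification window (72 hours)."""
    clocks = {"ndb_assessment_deadline": discovery + timedelta(days=30)}
    if apra_regulated:
        clocks["cps234_apra_deadline"] = discovery + timedelta(hours=72)
    return clocks

clocks = incident_clocks(datetime(2026, 3, 2, 9, 0))
print(clocks["cps234_apra_deadline"])     # 2026-03-05 09:00:00
print(clocks["ndb_assessment_deadline"])  # 2026-04-01 09:00:00
```

As the table notes, the CPS 234 clock runs faster; for APRA-regulated entities the 72-hour deadline governs the early response.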

Visual 3: Comparative obligation table (Privacy Act vs CPS 234 vs CDR Privacy Safeguards)

Obligation | Privacy Act / APPs | APRA CPS 234 | CDR Privacy Safeguards
Scope of regulated information | Personal information including sensitive information | Information assets including but not limited to personal information | CDR data (consumer data and product data)
Governance accountability | APP 1.2 reasonable steps and accountable officer | Roles and responsibilities of board and senior management explicit | Privacy Safeguard 1, accredited data recipient officer
Risk classification | Implied through 'reasonable steps' | Explicit information security capability assessment | Strict accreditation tiers
Third-party arrangements | APP 8 and reasonable steps | Specific service provider obligations (current and prospective) | Trusted Adviser and outsourced service provider regime
Incident notification timing | 30 days assessment then notify (NDB) | Within 72 hours of becoming aware of material incident | Notify Data Recipient and ACCC under accreditation conditions
Penalties | Civil penalties up to the greater of $50m, 30% of adjusted turnover, or 3x benefit (serious or repeated) | Direction by APRA, licence conditions, prudential capital impact | Civil penalties under CCA, accreditation revocation
Data minimisation | APP 3 | Implicit through information classification | Privacy Safeguard 3
Cross-border | APP 8 and accountability | Service provider obligations and cyber resilience expectations | Restricted data flows and consent requirements
AI / ADM transparency | APP 1 ADM disclosure from 11 December 2026 | Information security control of model assets | Not yet codified for CDR-specific AI use

Visual 4: Privacy Act operating model RACI

Activity | Board | Exec / CRO | Privacy Officer | CISO | Line of Business | Privacy Counsel
Privacy Strategy | A | R | R | C | C | C
APP 1 Privacy Policy maintenance | I | A | R | C | C | R
NDB scheme assessment | I | A | R | R | C | C
PIAs (new product / AI / vendor) | I | A | R | R | R | C
ADM disclosure register (from Dec 2026) | I | A | R | C | R | C
APP 11 controls (security) | I | A | C | R | R | C
APP 8 cross-border decisions | I | A | R | C | R | R
Customer access / correction (APP 12, 13) | I | A | R | C | R | C

R = Responsible, A = Accountable, C = Consulted, I = Informed.

Visual 5: Illustrative penalty exposure for serious or repeated interference with privacy

Stacked bar chart by turnover band showing the three Tier 1 statutory ceilings. Illustrative.

Entity adjusted turnover band (AUD, illustrative) | Statutory ceiling 1: $50m | Statutory ceiling 2: 30% adjusted turnover | Statutory ceiling 3: 3x benefit obtained
$100m turnover | $50m | $30m | Variable; rarely the binding figure where the benefit is small
$1bn turnover | $50m | $300m | Variable
$10bn turnover | $50m | $3bn | Variable

Note. Statutory maxima, not expected outcomes.

Visual 6: The 5 things to remember

Five anchors of Australian Privacy in 2026

  1. Personal information includes anything that makes a person reasonably identifiable, not only direct identifiers. Treat 'reasonably identifiable' as a fact question, not a tick box.
  2. Use limitation under APP 6 is the most under-managed control in financial services. Secondary use of data for AI training requires either consent or a tight 'reasonably expected and related' justification.
  3. The NDB clock is 30 days for assessment. CPS 234 runs faster (72 hours). Treat the more onerous timing as the operating standard for APRA-regulated entities.
  4. The statutory tort of serious invasion of privacy is live from 10 June 2025. Civil exposure now sits alongside regulator enforcement.
  5. The ADM transparency obligations commence 11 December 2026. APP 1 privacy policies must disclose ADM use that significantly affects individuals.

5. Operating the Privacy Framework With AI

AI multiplies the surface area of every APP obligation, but when governed properly it also delivers strong leverage for privacy operations.

Use cases at scale

Eight high-leverage use cases:

  1. Drafting and refreshing APP 1 privacy policies, including the new ADM disclosures.
  2. Mapping personal information flows across systems, vendors, and regions for PIAs and Records of Processing.
  3. Drafting Privacy Impact Assessments using OAIC PIA guidance as a knowledge source.
  4. First-pass NDB eligibility analysis from a de-identified incident summary, labelled DRAFT FOR PRIVACY OFFICER REVIEW.
  5. Drafting NDB customer notification language with placeholder fields for the regulator-required elements.
  6. ADM disclosure statements for the privacy policy and customer-facing FAQs.
  7. Board and Risk Committee updates on privacy KRIs, breach trends, and regulator developments.
  8. Regulator correspondence triage with classification, statutory mapping, and draft acknowledgement.

Project space setup

Set up two parallel workspaces, one in Claude Projects and one in ChatGPT Projects or a Custom GPT, sharing one knowledge base. The duplication is a continuity control.

Knowledge sources (de-identified, no live customer data):

  • Privacy Act 1988 (Cth) consolidated text (latest)
  • OAIC APP Guidelines (current edition)
  • OAIC NDB Resource Hub materials
  • OAIC PIA Guide
  • OAIC De-identification Decision-Making Framework
  • APRA CPS 234 (and the related Practice Guide CPG 234)
  • APRA CPS 230 (and the related Practice Guide CPG 230)
  • Your entity's privacy policy, privacy framework, breach response procedure, and privacy-by-design playbook (de-identified or marked CONFIDENTIAL with access controlled)

File structure:

  • /01-statutes-and-rules/
  • /02-regulator-guidance/
  • /03-internal-templates/
  • /04-prompt-library/
  • /05-output-archive/ (with retention metadata)

System prompt scaffold (paste at project level, not per chat):

System prompt: Privacy Workspace v1.0

You are a senior Australian privacy practitioner working inside an APRA-regulated entity. You operate to the Privacy Act 1988 (Cth), the Australian Privacy Principles, OAIC guidance, APRA CPS 234, and APRA CPS 230. You write in Australian English. You do not use em dashes. You never invent facts. You never include real personal information in any output. When unsure you say so. You ask up to two clarifying questions if the brief is ambiguous. You produce structured outputs with clear sections. You always flag where a Privacy Officer or Privacy Counsel must review before the output is used. You include APP and section references in every substantive answer. You never produce a final regulator notification without explicit human review. You never produce content that would create a defensible record without explicit human review.

Naming. [PIA-YYYYMM-Project-vN], [NDB-YYYYMMDD-Incident-vN], [ADM-YYYYMM-System-vN]. Append Privacy Officer initials and status flag (DRAFT, REVIEW, FINAL).
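The naming scheme lends itself to an automated check. A sketch, assuming the appended initials and status flag are hyphen-joined (that joining convention is an assumption, not stated above):

```python
import re

# Example conforming name: PIA-202603-HardshipAssistant-v2-AB-DRAFT
NAME_RE = re.compile(
    r"^(PIA-\d{6}|NDB-\d{8}|ADM-\d{6})"  # artefact type plus date stamp
    r"-[A-Za-z0-9]+"                     # project, incident, or system label
    r"-v\d+"                             # version
    r"-[A-Z]{2,3}"                       # Privacy Officer initials (assumed 2-3 letters)
    r"-(DRAFT|REVIEW|FINAL)$"            # status flag
)

def valid_artifact_name(name: str) -> bool:
    return NAME_RE.fullmatch(name) is not None

print(valid_artifact_name("PIA-202603-HardshipAssistant-v2-AB-DRAFT"))  # True
print(valid_artifact_name("PIA-2026-HardshipAssistant-v2"))             # False
```

A pre-commit or upload hook rejecting non-conforming names keeps the /05-output-archive/ folder auditable.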

Prompt library

Each prompt follows the Role / Context / Task / Constraints / Output Format / Quality Bar pattern.

Prompt 1: Privacy Impact Assessment outline
ROLE: Senior Australian privacy practitioner.
CONTEXT: A new product or capability described in the supplied brief, in an APRA-regulated FS entity bound by the Privacy Act 1988 (Cth).
TASK: Produce a Privacy Impact Assessment outline aligned to OAIC PIA Guide structure.
CONSTRAINTS: Australian English. No em dashes. No real personal information. Cite APP numbers. Identify CPS 234 and CPS 230 interactions. Flag ADM transparency under APP 1 (commencing 11 December 2026) where relevant.
OUTPUT FORMAT: Section headers (Project Description, Information Flows, Privacy Risk Assessment by APP, Mitigation, Residual Risk, Sign-off, Review schedule). Tables where useful.
QUALITY BAR: 10/10 means every APP that could plausibly apply is named, with a one-line rationale and a control proposal.

Prompt 2: ADM disclosure statement (APP 1, from 11 December 2026)
ROLE: Privacy practitioner drafting customer-facing language.
CONTEXT: An ADM use case described in the supplied brief.
TASK: Draft the ADM disclosure language for the privacy policy and a FAQ entry that aligns to it.
CONSTRAINTS: Plain English readable at Year 8. No em dashes. Australian spelling. Avoid technical jargon. Do not promise outcomes. Avoid admissions.
OUTPUT FORMAT: (1) Privacy policy paragraph, (2) FAQ Q&A, (3) Reviewer notes flagging any phrasing that requires Privacy Counsel review.
QUALITY BAR: 10/10 means a customer can identify whether the system applies to their situation, what kinds of decisions it supports, and how to seek human review.

Prompt 3: Obligation map (APP-by-APP control mapping)
ROLE: Privacy practitioner mapping a process to APPs.
CONTEXT: A described data process (collection, use, disclosure, retention) supplied by the requestor.
TASK: Map the process to each APP, identify whether the APP is engaged, the obligation, the current control, and the residual risk.
CONSTRAINTS: Australian English. No em dashes. Cite section numbers and APP clauses. Use 'engaged / not engaged / partially engaged' classification only.
OUTPUT FORMAT: 13-row table (one row per APP) with columns: APP, Engaged?, Obligation summary, Current control, Residual risk, Recommendation.
QUALITY BAR: 10/10 means no APP is missed, language is consistent, and recommendations are specific.

Prompt 4: Control narrative (APP 11 security)
ROLE: Information security control owner drafting an evidence narrative for assurance.
CONTEXT: An APP 11 control area described in the supplied brief, against CPS 234 control taxonomy.
TASK: Draft a control narrative suitable for internal audit, external assurance, or regulator response.
CONSTRAINTS: Australian English. No em dashes. No invented evidence. Use only the evidence supplied or labelled placeholders for evidence to be sourced. Tie to APP 11.1, APP 11.2, and CPS 234 paragraphs where relevant.
OUTPUT FORMAT: Control objective, Description, Evidence supplied, Evidence to be sourced (placeholders), Residual risk, Owner, Reviewer.
QUALITY BAR: 10/10 means an internal auditor can sign off on the narrative without rewriting.

Prompt 5: Eligible data breach assessment (NDB)
ROLE: Privacy Officer assessing an incident under Part IIIC of the Privacy Act 1988 (Cth).
CONTEXT: A de-identified incident summary supplied by the requestor.
TASK: Apply s 26WE and s 26WG to the facts and produce an eligibility assessment.
CONSTRAINTS: Australian English. No em dashes. Make explicit assumptions where facts are missing. Flag any decision that requires human Privacy Officer sign-off (which is all final NDB decisions). Never mark a final decision as 'eligible' or 'not eligible' without prefixing 'DRAFT, FOR PRIVACY OFFICER REVIEW'.
OUTPUT FORMAT: Facts as supplied, Assumptions, Information involved, Persons who could obtain it, Likelihood of access, Nature and severity of harm, Remediation steps already taken, Conclusion (DRAFT), Recommended next steps.
QUALITY BAR: 10/10 means the Privacy Officer can review and sign off in 10 minutes or less.

Prompt 6: Regulator response triage and acknowledgement
ROLE: Privacy Officer triaging incoming OAIC correspondence.
CONTEXT: Incoming letter or email from OAIC supplied as text.
TASK: Classify (preliminary inquiry, complaint, investigation, formal notice). Identify mandatory response timeframes. Draft an acknowledgement letter.
CONSTRAINTS: Australian English. No em dashes. Do not concede liability. Do not state factual matters that have not been verified. Flag any matter that should be escalated to Privacy Counsel before response.
OUTPUT FORMAT: Classification, Statutory basis, Required response timing, Severity rating, Suggested internal escalation list, Draft acknowledgement letter.
QUALITY BAR: 10/10 means no factual claim is made beyond what was in the regulator's letter, and the response timing is correct to the day.

Governance, audit, privacy, and risk appetite controls

De-identification. Inputs to any model not within an enterprise tenancy bound to APP-equivalent handling must be de-identified to OAIC standard. Remove direct identifiers, reasonably identifying combinations, and unnecessary sensitive information. When in doubt, redact and use placeholders.
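A first-pass redaction routine can enforce the placeholder habit before text reaches a model. A minimal sketch only: the patterns (TFN-style number, email, AU mobile) are illustrative assumptions, and real de-identification must still be validated against the OAIC framework and a motivated intruder test:

```python
import re

# Illustrative patterns, not an exhaustive identifier set.
PATTERNS = {
    "[TFN]": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{3}\b"),
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[PHONE]": re.compile(r"\b(?:\+61|0)[23478]\d{8}\b"),
}

def redact(text: str) -> str:
    """Replace direct identifiers with labelled placeholders, first pass only."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

print(redact("Member 123 456 789 emailed jo@example.com from 0412345678"))
# Member [TFN] emailed [EMAIL] from [PHONE]
```

When in doubt the rule above still applies: redact manually and use placeholders rather than trusting pattern matching.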

Prohibited inputs. Customer PII, claimant data, KYC and identity data, biometrics, market sensitive data, sanctions data, unreviewed regulator drafts, raw incident data prior to privilege assessment, and Part 9.4AAA whistleblower information.

Human-in-the-loop. Every NDB output, regulator response, privacy policy update, PIA, and ADM disclosure must be reviewed by a named accountable officer before use. Model output is draft. The accountable human is the decision.

Retention and logging. Maintain prompt and output logs for at least two years, access controlled to Privacy Officer, internal audit, and model risk lead. Treat the log as APP 11 protected if it contains personal information.

Model selection. Prefer enterprise tenancies (Claude Enterprise, ChatGPT Enterprise) over consumer products. For regulated information, prefer sovereign cloud or contractually equivalent protections. For sensitive information at scale or critical operations, on-premise or private cloud.

CPS 230 critical operations. If the AI tool is part of a critical operation (for example, AI-assisted credit decisioning), set tolerance levels, identify alternative processes, and ensure workflow survival under a model outage.

APP alignment. APP 1 (governance), APP 5 (notification), APP 6 (use limitation), APP 8 (cross-border), APP 11 (security), APP 12 (access). Treat the workspace itself as a system that processes personal information.

Quality assurance loop

Run every output through this five-step QA rubric before it leaves the workspace:

  1. Accuracy: Is every cited section, APP, or guideline correctly attributed?
  2. Currency: Does the output reflect the regulatory position as at the date of generation, including the 2024 amendments and the ADM commencement date?
  3. Privacy hygiene: Is the input fully de-identified? Are any placeholders properly labelled?
  4. Decision integrity: Are statements of fact distinguished from inferences? Is uncertainty flagged?
  5. Sign-off readiness: Is the output ready for review by the named accountable officer, with all assumptions explicit?
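The five-step rubric above amounts to an all-or-nothing gate: an output leaves the workspace only if every item passes. A minimal sketch of that gate follows; the `QA_RUBRIC` names and `qa_gate` helper are invented for illustration.

```python
# The five rubric items, in the order they are applied.
QA_RUBRIC = [
    "accuracy",           # every cited section, APP, or guideline correctly attributed
    "currency",           # reflects the regulatory position at the date of generation
    "privacy_hygiene",    # input fully de-identified, placeholders labelled
    "decision_integrity", # facts distinguished from inferences, uncertainty flagged
    "sign_off_readiness", # assumptions explicit for the named accountable officer
]

def qa_gate(checks: dict[str, bool]) -> bool:
    """Pass only if every rubric item is explicitly marked True."""
    return all(checks.get(item, False) for item in QA_RUBRIC)
```

A missing or failed item blocks release, mirroring the rule that the rubric runs on every output, not a sample.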

Red team prompt to stress-test your own draft:

Take the role of a privacy advocate, a sceptical regulator, and an opposing counsel in turn. For each role, identify the three weakest points in the supplied draft. State the test the role would apply, the evidence required to defeat the test, and any factual or legal assumption that would be challenged. Conclude with a single 'go / no-go' recommendation written for the Chief Privacy Officer.

Scaling pattern

Maintain the prompt library in version control with change logs. Quarterly model evaluation cadence with documented test cases. KRIs for prompt completion volumes, NDB time-to-decision, ADM disclosure currency, and PIA backlog. Treat material prompt library changes as CPS 230 changes.

6. Common Pitfalls and Watch-outs

Treating de-identification as binary. Assess de-identification against the OAIC De-identification Decision-Making Framework, including the motivated intruder test. Re-test whenever the data, audience, or technology changes.

Using consumer AI for regulated tasks. Move regulated workflows to enterprise tenancies with contractual APP-equivalent terms. Block consumer endpoints at the network layer for staff handling regulated data.

Confusing the NDB 30-day assessment clock with the CPS 234 72-hour notification clock. Map both clocks at the start of every incident response. They run independently; the shorter deadline drives the initial response. Document each separately.

Assuming legitimate interest is an APP 6 exception. There is no general legitimate interest carve-out under the APPs. Use APP 6.2(a) reasonable expectation, consent, or specific legal authorisation. Do not import GDPR concepts uncritically.

Treating ADM transparency as a 2026 problem. Privacy policies need uplift through 2026. Inventory ADM use now, classify by significance, and plan disclosure language. Late preparation invites enforcement attention.

Overlooking APP 8 'reasonable steps' for offshore vendors. Reasonable steps must be proportional to sensitivity. For sensitive information offshore, expect contractual APP-equivalent terms, audit rights, and data residency commitments.

Confusing the statutory tort with regulator action. The tort is private civil action. Regulator enforcement is separate. Both can run in parallel for the same conduct.

Failing to log AI-generated outputs. Maintain prompt and output logs for two years. Treat the log as APP 11 protected. Apply access controls.

7. Decision Frameworks and Tools

Decision tree: Is this an eligible data breach?

  1. Is the entity an APP entity? If no, the NDB scheme does not apply. If yes, continue.
  2. Has personal information been the subject of unauthorised access, unauthorised disclosure, or loss? If no, the NDB scheme is not engaged. If yes, continue.
  3. Would a reasonable person conclude that the access, disclosure, or loss is likely to result in serious harm to one or more individuals? If no, the breach is not eligible; document the assessment. If yes, continue.
  4. Have remediation steps already removed the likely risk of serious harm? If yes, the breach is not eligible; document the assessment. If no, it is an eligible data breach.
  5. If the position is unclear, run the s 26WH assessment within 30 days and document each step.
  6. If APRA-regulated, run the parallel CPS 234 materiality assessment with the 72-hour notification clock.
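The branching in the decision tree above can be sketched as a triage function. This is a teaching illustration of the statutory sequence, not legal logic: each boolean stands in for a documented judgment, and the `BreachFacts` and `ndb_triage` names are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class BreachFacts:
    """Facts established during initial incident triage (illustrative fields only)."""
    is_app_entity: bool              # is the organisation an APP entity?
    personal_info_compromised: bool  # unauthorised access, disclosure, or loss?
    serious_harm_likely: bool        # reasonable-person test for likely serious harm
    harm_remediated: bool            # have remediation steps removed the risk?

def ndb_triage(facts: BreachFacts) -> str:
    """Walk the NDB decision sequence and return a triage outcome."""
    if not facts.is_app_entity:
        return "NDB scheme does not apply"
    if not facts.personal_info_compromised:
        return "Not a data breach for NDB purposes"
    if not facts.serious_harm_likely:
        return "Not eligible - document the assessment"
    if facts.harm_remediated:
        return "Not eligible - remediation removed the serious harm risk"
    return "Eligible data breach - notify the OAIC and affected individuals"
```

Where any input is genuinely uncertain, the code does not help: that is the case for the s 26WH assessment within the 30-day window, with each step documented.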

Maturity ladder: Privacy operating model

  1. Level 1 - Reactive: Incident-driven privacy work. Privacy policy outdated. No PIAs. No model inventory.
  2. Level 2 - Documented: Privacy policy current. PIAs run on major projects. NDB procedure tested annually. AI governance in place but not enforced.
  3. Level 3 - Embedded: Privacy-by-design tollgates in change management. PIAs on every product, vendor, and AI deployment. Privacy KRIs reported quarterly. Mandatory privacy training. AI workspace governed.
  4. Level 4 - Optimised: APP 1 disclosures live for ADM. NDB drills quarterly. Privacy posture reviewed against peers and global standards. AI workspace producing measurable productivity uplift with documented controls.
  5. Level 5 - Anticipatory: Contributes to industry standard-setting. Anticipates regulatory reform. Designs for the next privacy reform tranche before it commences.

Self-check questionnaire (rate 1 to 5)

  1. Our privacy policy reflects current APPs and our actual data handling, including ADM use.
  2. We can produce a current personal information register on demand within five working days.
  3. We have run an NDB drill in the last 12 months.
  4. We have an AI workspace governance standard with mandatory de-identification and human review.
  5. We have inventoried ADM use cases that significantly affect individuals.
  6. We have rehearsed the parallel NDB and CPS 234 notification timing.
  7. Our APP 8 cross-border arrangements are documented, current, and proportionate to data sensitivity.

Score 30 to 35: Embedded or above. Score 20 to 29: Documented. Score below 20: Reactive. Use the gap to set a 12-month uplift plan.
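The scoring bands above are a straightforward mapping from a seven-item total (minimum 7, maximum 35) to a maturity band. A minimal sketch, with the `maturity_tier` name invented for this example:

```python
def maturity_tier(scores: list[int]) -> str:
    """Map the seven self-check ratings (1 to 5 each) to a maturity band."""
    if len(scores) != 7 or not all(1 <= s <= 5 for s in scores):
        raise ValueError("expected seven ratings between 1 and 5")
    total = sum(scores)
    if total >= 30:
        return "Embedded or above"
    if total >= 20:
        return "Documented"
    return "Reactive"
```

The bands deliberately leave headroom: a team scoring 4 on every item (total 28) still lands in Documented, reserving the top band for consistently strong answers.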

8. Further Reading and Authoritative Sources

Primary statutes and rules:

  • Privacy Act 1988 (Cth)
  • Privacy and Other Legislation Amendment Act 2024 (Cth)
  • Privacy Regulation 2013 (Cth)
  • Privacy (Credit Reporting) Code 2014 (registered code)

OAIC guidance:

  • OAIC Australian Privacy Principles Guidelines (current edition)
  • OAIC Notifiable Data Breaches Resource Hub
  • OAIC Guide to Undertaking Privacy Impact Assessments
  • OAIC De-identification Decision-Making Framework
  • OAIC Privacy Self-Assessment Tool for APP Entities

APRA guidance:

  • Prudential Standard CPS 234 Information Security
  • Prudential Practice Guide CPG 234 Information Security
  • Prudential Standard CPS 230 Operational Risk Management
  • Prudential Practice Guide CPG 230 Operational Risk Management

Adjacent and international:

  • Competition and Consumer Act 2010 (Cth) Pt IVD (Consumer Data Right)
  • National Institute of Standards and Technology AI Risk Management Framework (NIST AI RMF)
  • ISO/IEC 27701 Privacy Information Management System

Professional bodies and resources:

  • International Association of Privacy Professionals (IAPP) Australia and New Zealand chapter
  • Governance Institute of Australia, Privacy and Ethics resources
  • Risk Management Institution of Australasia, Privacy and AI risk publications

9. Closing Sign-off

This module provides general information and education for Australian financial services practitioners. It is not legal, compliance, or professional advice. Apply the framework to your entity's specific circumstances, take advice where the position is unclear, and document your decisions.

TheAICommand. Intelligence, At Your Command.

Test your knowledge

LM-G02 assessment - 30 questions

25-30 minutes. One question per screen. Your progress is saved locally for 30 days, so you can pick up where you left off. Submit anytime to see your score, tier, and per-question rationale.

General information and education only. Not legal, compliance, financial, or professional advice. Verify any time-sensitive obligation against the primary source.