By Rosalyn Page
Accounting workflows and artificial intelligence (AI) are becoming increasingly intertwined across Australia and the Asia-Pacific, but this shift requires more than robust cybersecurity protections and strong governance.
Workforce upskilling and junior talent development must not be overlooked in the rush to contain headcount, minimise cost and maximise return on investment.
“Accounting and finance are in a unique position: the profession is simultaneously being disrupted by AI while serving as the trusted gatekeeper of accurate financial information,” says Tony Vizza, founder and managing partner at Novera, which specialises in AI, cybersecurity and risk management.
AI security gaps to be aware of
According to CPA Australia’s Business Technology Report 2025, 89 per cent of organisations across the Asia–Pacific region are using artificial intelligence in some capacity. Yet for most, adoption remains nascent, with only 16 per cent reporting that AI is deeply embedded across their operations.
Where AI is in use, organisations report that improved accuracy, greater workflow efficiency and a better employee experience are the key tangible benefits.
Vizza warns that when businesses scale AI faster than their controls, significant security gaps can emerge.
Shadow AI, where tools are adopted by employees without IT approval, bypasses proper risk assessment and oversight, and increases the potential for sensitive, proprietary information to be entered into these systems.
A lack of clear ownership of AI tools is all too common, according to Vizza. “Very often, the use of AI tools falls into an organisational grey area where there is a lack of both accountability and responsibility.”
Nick Kervin, national digital advisory leader at BDO Australia, believes many organisations are still grappling with basic data governance. “They’re not properly classifying and labelling data, validating access rights and confirming permissions before granting large language models access to vast volumes of unstructured information.”
Adding to the security gaps, identity and access management is not keeping pace with the uptake in AI, with some businesses granting agents credentials and permissions without proper review or ongoing checks.
“Organisations are still applying human-centric identity frameworks to non-human AI agents,” Vizza says.
Traditional IT controls must evolve to keep up
AI explainability gaps also pose risks for finance and accounting functions that rely on accurate, reliable and auditable information.
“If organisations deploy AI without controls and something goes wrong, such as a material misstatement, a data breach or a biased output, they’re not just dealing with a technology failure. They’re dealing with a trust failure. And in this profession, trust is everything,” Vizza says.
Traditional IT quality controls are not keeping pace with AI, Kervin notes. Older testing techniques relied on test cases and scripts to confirm an IT system was ready for production. “AI breaks this traditional control and there is a lack of guidance on what the alternative is.”
As AI becomes embedded in core systems, it must be treated as critical infrastructure rather than an experimental overlay, Kervin continues. This requires stronger controls around data ingestion, access management, third-party dependencies, model monitoring and output validation.
“Security teams will need to expand their threat models to explicitly account for AI-specific risks, and governance frameworks must evolve accordingly,” he says.
Organisations must shift from cybersecurity prevention to resilience
While many organisations are investing in cybersecurity tools, breaches and incidents continue to make daily news.
CPA Australia’s report shows that most respondents rate their cyber maturity as high, yet over 20 per cent of businesses surveyed report losing time and/or money due to a cyber incident in 2025. This disconnect highlights ongoing cybersecurity risks for businesses.
Kervin says many organisations take a “lumpy” approach to cyber investment. “A significant uplift program might temporarily improve maturity, but once investment slows, threat actors adapt.”
To meet the challenges, organisations need to shift from prevention to resilience. But this is about more than just withstanding a cyber crisis.
“It must also encompass a business’s capacity to adapt to changing market conditions, regulatory developments, supply chain disruptions and technological evolution such as AI,” Kervin says.
Resilience, explains Vizza, includes incident response plans, business continuity testing, supply chain risk assessments and governance frameworks.
“Operational resilience accepts that incidents will happen and asks: ‘Can you keep operating when incidents inevitably happen?’,” he says.
The Australian Prudential Regulation Authority’s CPS 230 operational risk management framework, which applies to APRA-regulated entities such as banks, offers guidance that accounting and finance teams can use when approaching resilience.
“While [the framework] applies directly to banks, insurers and superannuation funds,” Vizza says, “its principles are sound for any business.”
Why the shrinking junior talent pipeline is a fundamental problem
CPA Australia’s report also recorded a 17 per cent decline in hiring for entry-level or junior accounting and finance roles, which may reflect a broader pattern of AI and workforce transformation across professional services.
However, in these roles, graduates and juniors are not just learning the numbers — they are developing critical skills in professional judgement. “That tacit knowledge does not come from a training course. It comes from doing the work,” Vizza says.
Without this training pipeline, the next generation of partners and managers may not develop the foundational skills and understanding that teach them to recognise when AI is wrong.
“In professions built on professional skepticism and independent judgement, that is a fundamental problem,” he says.
It may also concentrate risk, leaving fewer employees capable of challenging AI outputs, identifying errors or recognising when a system is producing biased or incorrect results.
“You’re essentially creating a dependency on AI without maintaining the human capability to oversee it,” Vizza notes. He recommends firms take several important steps to minimise the risk [see breakout box].
“If you don’t invest in growing the next generation’s judgement and expertise, all the AI in the world will not save you when something goes wrong — and nobody in the room will understand what they’re looking at,” he says.
Kervin agrees that generative AI is disrupting the traditional graduate pathway into the workforce. He warns that short-term savings could create capability gaps in future, including in relationship building between professionals and clients. “The face-to-face client communication and relationship-building skills are important building blocks for professional services graduates.”
Three tips for minimising AI risk
Tony Vizza, founder and managing partner at Novera, which specialises in AI, cybersecurity and risk management, suggests that accounting and finance teams take several important steps to minimise AI oversight risk:
- Redefine junior roles to handle quality assurance, exception review and oversight alongside AI.
- Introduce structured training programs to build foundational knowledge, even when the manual work has been automated.
- Build a deliberate succession plan that recognises the governance and capability risks of a thinning talent pipeline.

