Australian Cyber Security Magazine

ISACA poll finds AI adoption outpaces governance and ROI


A global ISACA survey of more than 3,400 digital trust professionals has found most organisations believe staff are already using artificial intelligence, but many report weak governance, limited visibility of return on investment, and uncertainty about how to respond to AI-related incidents.

ISACA’s 2026 AI Pulse Poll found 90 percent of respondents believe employees are using AI in their organisation, while only 22 percent said AI return on investment (ROI) has met or exceeded expectations.

Incident response readiness was also mixed. More than half of respondents (56 percent) said they were unsure how long it would take to halt an AI system due to a security incident, while 39 percent did not know whether their organisation had a documented process for shutting down or overriding AI systems.

In Oceania, half of respondents said boards and executive leadership are ultimately accountable if AI systems cause harm or serious error in their organisation.

On governance, the research found 38 percent of organisations have a formal, comprehensive AI policy, up from 28 percent in 2025. Another 30 percent reported having a limited policy, while 25 percent said they have no active AI policy.

Perceptions of AI’s ROI remained uncertain: 23 percent of respondents said it is too early to tell, 22 percent did not know the ROI, and 20 percent reported limited ROI so far.

Jamie Norton, vice chair of the ISACA board, said the findings show AI is no longer confined to IT teams and has become a leadership issue.

“What we’re seeing now is a shift from experimentation to accountability,” said Mr Norton. “Organisations are moving quickly to embed AI into operations, but many are still developing the policies, governance structures and skills needed to ensure those systems deliver long-term value safely and responsibly.

“The research also shows AI-related risk is now an immediate organisational priority for organisations across Oceania, while many are still seeing only limited ROI from AI initiatives. It highlights the growing pressure on leaders to balance innovation with governance, oversight and measurable business outcomes.”

The poll found respondents most commonly use AI for increasing productivity (62 percent), creating written content (62 percent), automating repetitive tasks (50 percent), and analysing large amounts of data (49 percent).

Skills and training featured as a growing issue. ISACA reported 78 percent of respondents said AI skills are very or extremely important to their profession, up from 72 percent last year. It also found 33 percent said their organisations train all employees on AI, up from 22 percent in 2025.

Despite expectations that AI could reduce workloads, nearly seven in 10 respondents said job responsibilities have increased or not changed in the last year, even as 36 percent said their organisation will increase AI-related jobs in the next 12 months, up from 31 percent in 2025.

On risk, 45 percent of respondents said AI risks are an immediate priority, while 38 percent said they are confident in their board’s understanding of and action against AI risks. The most-cited AI risks were misinformation and disinformation (82 percent), privacy violations (74 percent), social engineering (60 percent), loss of intellectual property (58 percent), and job displacement (42 percent).

ISACA also reported an increase in respondents’ confidence in detecting AI-powered misinformation: 41 percent said they were confident in their own ability, up from 30 percent in 2025. However, only 36 percent said they were confident in their organisation’s ability to detect AI-powered misinformation.

Beyond operational risks, the poll found 77 percent of respondents said they consider the environmental impact of using AI in their organisation, while only 11 percent strongly agreed that organisations are paying sufficient attention to ethical standards in AI implementation.
