Technology leaders face a 2026 marked by extreme complexity, where Artificial Intelligence (AI) stands as the engine of innovation but also as the main vector of cyber threats. A new ISACA report, presented at a roundtable during the ISACA Europe Conference 2025, reveals widespread concern among European IT and cybersecurity professionals: 51% say AI-driven threats and deepfakes will cost them sleep next year.
The report highlights that although AI and machine learning are the top technology priority for 61% of respondents, a critical lack of preparedness combined with a shortage of qualified talent is creating an "explosive mixture."
In this context, Chris Dimitriadis, Director of Global Strategy at ISACA, argues that the key to facing technological advances and new threats lies in strengthening the workforce: "We are facing an increasingly urgent need to create a stronger cybersecurity workforce in all regions of the world, definitely here in the UK and within the European Union."
Dimitriadis stressed that this is the "underlying common denominator" in defending digital ecosystems. The talent landscape is worrying, however: while 63% of organizations plan to hire staff for digital trust roles (such as audit, risk, and cybersecurity), 51% anticipate serious difficulties finding qualified candidates.
AI: opportunity and ungoverned risk
Artificial Intelligence and machine learning have been identified as the top technology priorities for 2026, cited by 61% of respondents. Karen Heslop, Vice President of Content Development at ISACA, explained that although AI has been a topic of conversation for years, the real change is that companies are moving from the conceptual phase to implementation: "The change goes from 'it's an idea, I have to think about it, what do I do in my company' to 'I'm actually doing something with it in my company.' There are serious risk and governance implications for that."
The ISACA report reveals widespread concern among European IT and cybersecurity professionals: 51% say AI-driven threats and deepfakes will cost them sleep next year
This acceleration, however, has put AI-powered threats at the top of concerns. In Heslop's view, it is notable that AI-powered social engineering now ranks as the most significant cyber threat, cited by 59% of respondents: "The most significant cyberattacks for 2026 are social engineering driven by AI, with 59%. This is honestly the first time we're seeing AI-powered social engineering at the top. Bad actors are now using this technology."
Concern intensifies given the preparedness gap: only 14% of organizations feel "very prepared" to manage the risks associated with generative AI. Heslop also pointed to the phenomenon of "shadow AI" within corporations, where staff use generative AI tools without the company's vetting, creating significant risks.
The “Explosive Mixture” Warning
Combining the increasing sophistication of AI attacks with the talent shortage, Chris Dimitriadis warned: "If we put that together, we must realize that this is an explosive mixture." Dimitriadis stated that we are at a "crossroads" and that "we need to make a decision now in order to build the workforce before it is too late, before AI attacks become common." His call to action was clear: "Otherwise, very, very soon we will witness the next digital pandemic."
Although upskilling in data security is considered essential, only 30% rate it as "very important." Dimitriadis commented that the cybersecurity community must work so that that 30% "becomes 100%."
Regulation and legacy systems: The brakes on progress
Regulatory complexity and global compliance risks keep 32% of professionals awake. In the words of Karen Heslop, the EU is "the one that leads the way in technology compliance." Preparation, however, is low: fewer than one in five professionals feel "fully ready" for NIS2 (18%), DORA (18%), or the EU AI Act (11%).
Despite this, nearly four in five respondents (79%) agree that cyber regulation will foster digital trust. Dimitriadis recalled that compliance should not be the only driving force; organizations must also integrate cybersecurity into their business strategy.
Another factor restricting progress is aging infrastructure. Heslop explained that modernizing legacy systems is an important priority because they act as an "inhibitor." She added that there is a growing trend of companies saying, "I have to upgrade my legacy system before I can even adopt AI."
Faced with these challenges, ISACA recommends that companies establish robust AI governance, accelerate talent development, modernize legacy systems, strengthen cyber resilience, and prepare for regulatory complexity. Chris Dimitriadis concluded that, in this context, 2026 will be a year of "accountability."
