Madrid was the final stop on the NetApp Insight Xtra tour that the multinational has been taking across Europe to present its latest announcements and strategy. In the Spanish capital, the main presentation was delivered by the head of NetApp, César Cernuda, who shared with customers and partners the company’s roadmap around data management, artificial intelligence and the role of the cloud in a hybrid world where trust and security have become the new currency.

“We live in the world of data” was one of the executive’s opening messages, recalling that the idea that “information is power” is not new, but the volume and complexity of the data generated today by organizations and society as a whole is. As César Cernuda explained during his speech, “the true differentiating factor is no longer accumulating information, but knowing how to access it, convert it into useful knowledge and use it to make decisions that allow the business to advance.”

César Cernuda reviewed the history of NetApp, a company focused on the world of data since its founding more than three decades ago. He acknowledged that the brand has traditionally been identified as a storage provider, but defended the evolution that has turned the company into a benchmark in unified storage and in the disaggregation of computing from storage, a milestone he described as one of the great innovations of its early years. This evolution has resulted in a single platform offering unified storage that other companies have tried to emulate, but which NetApp claims as a differentiated proposition, supported by a single operating system and a common core that maximizes the sustainability of customer investments.

For César Cernuda, the cloud is the turning point

The head of NetApp identified the jump to the cloud as another of the great turning points in the corporate strategy. He recalled the company’s early vision with its Data Fabric concept, designed to make data accessible and interoperable wherever it resides, whether on-premises, in the public cloud or in private environments. Faced with the rise of the hyperscalers, Cernuda explained the dilemma management confronted: compete head-on or collaborate. The decision was clear: ally with the large cloud providers so that customers did not have to choose between models, but could instead design hybrid architectures in which NetApp acts as the connecting link, guaranteeing interoperability, security and coherent data management.

Cernuda insisted that NetApp cannot stop at mere storage, because organizations demand, in addition to capacity, security, protection and advanced information management. The stated goal is to be the most secure company in data storage, but also to provide the tools to manage the entire data lifecycle in an environment dominated by artificial intelligence. Rather than talking about AI in the abstract, the executive chose to focus on how to prepare for AI projects to succeed: the key, he said, is to build an intelligent data infrastructure supported by a platform such as the NetApp Data Platform, designed to integrate and orchestrate data of very diverse kinds. As he explained, all of this rests on a single operating system that allows environments to be interconnected, and that has helped consolidate NetApp as a leader in flash technology, globally as well as in Europe and the Spanish market, and in cloud data solutions outside the realm of the hyperscalers.

To illustrate the value of this proposition, César Cernuda shared several customer examples from very different sectors, all united by the critical nature of their data. Among them, the case of the European Space Agency stood out: NetApp has worked with the agency on mapping one billion new stars in the Milky Way, a challenge that exemplifies the scale and complexity of contemporary scientific data. To this he added the company’s presence in sectors such as finance, public administration, defense and intelligence services, where performance, availability and data security are absolutely critical.

On this basis, César Cernuda placed trust at the center of the relationship with the customer. He related how, in his conversations with CEOs and CIOs, the word that comes up most often is “trust,” referring to the decision to entrust their data to the company’s technologies. The president of NetApp stressed that this trust obliges NetApp to maintain a constant pace of innovation and to minimize incidents affecting information systems. He acknowledged that problems can arise, but defended the importance of being proactive rather than reactive, relying on solutions such as Active IQ, which leverages artificial intelligence to anticipate possible failures and work with the customer before they impact operations.

A significant part of the presentation was dedicated to distinguishing between the “data era” and the “smart data era.” Looking ahead to 2030, he cited figures from studies such as McKinsey’s that predict an enormous volume of information and a productivity-improvement potential of between 6.1 and 7.9 trillion dollars, but warned that this potential will only materialize if companies are able to properly prepare their data. To illustrate the point, he recounted a recent anecdote from a round table on AI with Ibex board members held at a prestigious Spanish business school, where the debate focused on whether artificial intelligence was good or bad and on the impact it will have on employment, but almost no one had delved into what lies behind a successful AI project. In his opinion, many executives focus only on how many intelligent agents they can deploy to cut costs, without understanding that those agents need a solid, well-governed data foundation.

Why so many AI projects fail

For César Cernuda, therein lies the explanation for why so many AI projects never come to fruition: they are approached from the top down, with aggressive deadlines and high expectations, but without first doing the “homework” of preparing the platform and data infrastructure. Against this approach, he defended the importance of recognizing and supporting the work of the technology and infrastructure teams, to whom he expressed “maximum respect” from the stage. He stressed that when organizations take the time to prepare their data environment and build a solid business case, results do come and AI projects can deliver the expected return. In this sense, he defined one of his main tasks as president of NetApp: helping the business, beyond the technology department, understand that the success of artificial intelligence starts with preparing the data.

The executive also outlined the steps that, in his opinion, organizations should follow to move towards this “smart data.” First, understand where the data is, both structured and unstructured, and build a catalog that allows it to be classified and treated as a coherent whole. From there, establish clear governance: who accesses what, with what security and privacy requirements, under which regulatory frameworks and with what control policies. Once this platform is in place, the challenge is for the data to stop being passive and become dynamic assets, capable of learning from context and integrating diverse sources such as video, audio or text into the same architecture. This process, he warned, is not static: each new piece of data that enters the system must go through all these phases again, making data management a continuous challenge for organizations.

Cernuda framed all these steps within NetApp’s evolution from a storage provider into a company that offers a complete platform for building intelligent data infrastructures. He said his responsibility is no longer just to supply storage arrays or storage services, but to provide an architecture that combines security, advanced management and data preparation for the world of artificial intelligence. All of this, he stressed, in a hybrid scenario where the corporate data center coexists with the public clouds of the hyperscalers, with Microsoft as a prominent ally, and where interoperability and control over data must remain in the hands of the customer.