NetApp is not only expanding its solutions portfolio but also strengthening its data platform business, with products that aim to turn information management into an asset directly linked to value creation and to the acceleration of AI-driven innovation.

The move coincides with AI's global transition from pilot projects to mission-critical applications driven by intelligent agents, where data preparation, traceability and security determine the viability of any production deployment. In this context, NetApp AFX and NetApp AI Data Engine (AIDE) represent two complementary pillars that unify storage, performance and governance under a single operational plane.

Disaggregated architecture and AI-ready data

NetApp AFX introduces an all-flash, disaggregated architecture that decouples performance from capacity, a key step toward linear scalability in AI environments. Built on ONTAP, the new system extends the capabilities of a long-standing leader in data management into an infrastructure ready to support large-scale training, inference, and retrieval-augmented generation (RAG) workloads.
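To make the RAG workload concrete, the sketch below shows the basic retrieve-then-prompt flow in plain Python. The corpus, the toy word-overlap retriever and the prompt template are illustrative assumptions, not NetApp or NVIDIA APIs; a production system would retrieve over vector embeddings and pass the prompt to a generative model.

```python
# Minimal sketch of the RAG flow: retrieve relevant context for a query,
# then prepend it to the prompt sent to a language model.
# Corpus contents and scoring are illustrative assumptions.

CORPUS = [
    "ONTAP provides unified data management across file, block and object.",
    "Disaggregated architectures scale performance and capacity independently.",
    "Snapshots enable fast point-in-time recovery of datasets.",
]

def retrieve(query, corpus, top_k=1):
    """Score each chunk by word overlap with the query (toy retriever)."""
    q = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[:top_k]

def build_prompt(query, corpus):
    """Assemble the augmented prompt: retrieved context, then the question."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

prompt = build_prompt("How do disaggregated architectures scale?", CORPUS)
```

In a real deployment, the retrieval step is exactly where the storage layer matters: the corpus lives on the data platform, and retrieval latency is bounded by how fast it can be read.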

AFX acts as the core of the so-called AI Factories: the data-intensive infrastructures that feed machine learning processes, continuous model integration and high-demand hybrid environments. Certified for NVIDIA DGX SuperPOD, the system maintains cyber-resilience guarantees and advanced metadata management, with optional DX50 nodes that enable a real-time global catalog of enterprise data.

Intelligent and native data management for AI

The second major component, NetApp AI Data Engine, positions data as a comprehensive service covering the entire artificial intelligence lifecycle, from ingestion and preparation to the delivery of generative applications. Its value proposition is strategic rather than merely technical: moving intelligence to the data instead of moving data to the intelligence.

Integrated with the NVIDIA AI Data Platform reference design, AIDE combines accelerated computing and NVIDIA AI Enterprise software to deliver vectorization, semantic search, automated synchronization, and information protection in a governed environment that can be updated in real time. This approach reduces friction between data engineering and data science teams, making the infrastructure a direct enabler of digital agility.
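The vectorization and semantic search that AIDE automates can be sketched in a few lines of standard-library Python: documents become vectors, and a query is answered by ranking those vectors by cosine similarity. The toy three-dimensional embeddings below are assumptions for illustration; a real pipeline would produce high-dimensional vectors with an embedding model.

```python
import math

# Toy embeddings standing in for vectorized documents (hypothetical values;
# in practice an embedding model generates these from the raw content).
DOCS = {
    "storage": [0.9, 0.1, 0.0],
    "network": [0.1, 0.9, 0.1],
    "gpu":     [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def semantic_search(query_vec, corpus, top_k=1):
    """Return the top_k (name, vector) pairs most similar to the query."""
    ranked = sorted(corpus.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return ranked[:top_k]

# A query vector that sits close to the "storage" embedding.
hits = semantic_search([0.8, 0.2, 0.1], DOCS)
```

Keeping this index synchronized as the underlying data changes is precisely the "automated synchronization" the platform takes on, so teams query a catalog that is always current.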

Ecosystem convergence on the data platform

NetApp is also expanding its multicloud strategy with new capabilities in Azure NetApp Files, including an Object API that simplifies access to AI services and data from Microsoft Fabric, Azure OpenAI, Azure Synapse and Azure AI Search without the need to replicate data sets.

The new global namespace, powered by FlexCache, offers visibility and editing across multiple on-premises and cloud environments, supporting seamless migrations, disaster recovery, and workload balancing through SnapMirror. This model reinforces the principle of full data portability, a key requirement in the deployment of distributed and multicloud AI.

A new cognitive infrastructure paradigm

The launch confirms a broader market transition: the move from traditional storage infrastructure to cognitive infrastructure, where the data platform is no longer passive support but a living environment that accelerates the flows of transformation, learning and decision-making.

As Justin Boitano, vice president of enterprise AI at NVIDIA, summarizes, "NetApp's data platform has become a native environment for AI by integrating accelerated computing, intelligent software and advanced models." Michael Leone, principal analyst at Omdia, reinforces this vision by noting that "NetApp's strategy demonstrates a clear understanding of the scalability, governance and efficiency challenges that organizations face on their path to operational artificial intelligence."