DataDirect Networks (DDN) is the only independent data intelligence provider Nvidia has relied on internally for the past eight years, and its technology has been central to driving GPU efficiency inside Nvidia’s own AI infrastructure. A provider of AI and data solutions, DDN has established a significant and growing presence in India; its customers include Yotta, IIT Bombay, CDAC and Mercedes.

“Our R&D teams in Pune and Bengaluru represent roughly a quarter of our global workforce. India is an innovation hub that influences what we build for the world,” Paul Bloch, president & co-founder, DDN, told Sudhir Chowdhary in an interview. Excerpts:

India is investing billions into AI compute and data centres. Is that enough without world-class storage infrastructure?

India’s investment in AI compute is the right move – but GPUs alone don’t create AI capability. They create AI potential.

What’s happening globally right now is an industrial buildout. We’re not just training models – we’re manufacturing intelligence at scale. And like any industrial system, performance depends on the whole line working together.

The reality is simple: the most expensive part of an AI factory is the GPU – and the easiest way to waste that investment is to starve it of data. If the storage and data layer can’t keep up, the GPUs idle, the economics break, and the system never reaches its intended output. That’s why world-class storage infrastructure isn’t optional. It’s the difference between owning GPUs and operating an AI factory.

What should Indian cloud providers and government platforms get right if they want to run national AI models?

National AI models can’t be built like lab projects. They need to be built like national infrastructure. That means three things from day one: sovereign data control, strong multi-tenant security, and predictable performance at scale.

And it’s important to recognise: training is only one moment in the life cycle. Inference is the real workload – it runs continuously. It shows up inside banking systems, healthcare platforms, agriculture services, multilingual citizen applications, and every regulated industry that needs AI to operate reliably.

The countries that win won’t be the ones that run the biggest single training job. They’ll be the ones that can run AI in production, safely and efficiently, every day.

Does data localisation help or hurt India’s ability to build powerful AI?

Done well, it helps – and it can be a strategic advantage. Data localisation strengthens control, governance, and trust. Those are not “policy details.” They are the foundation of sovereign AI.

But localisation should not mean isolation. The best model is one where India retains control over its data and AI outcomes, while still remaining interoperable with global ecosystems and innovation.

India also has a unique advantage here: localisation enables models grounded in India’s own languages, industries, and datasets. That’s how you build AI that serves the country – not just AI that runs in the country.

Is AI today more compute-starved or data-starved?

In many production environments, AI is becoming data-starved. We see this pattern repeatedly at scale: organisations acquire large amounts of compute, but then discover the system can’t keep GPUs productive because the data layer wasn’t designed for sustained throughput, concurrency, and governance.

When that happens, the problem isn’t solved by adding more GPUs. You’re just buying more idle capacity. The winners build balanced systems – where the data layer is engineered to match the pace of the compute. That’s how you turn AI investment into AI output.

Do you believe the next AI breakthrough will come more from better data pipelines rather than better algorithms?

Algorithms will keep advancing – that’s the nature of this field. But the bigger breakthrough over the next few years will come from operationalising AI at scale. Not just training a model, but running AI as a reliable, continuously improving system across industries.

The real leap is moving from experimentation to execution – where AI becomes a durable capability, not a series of isolated projects. That’s where pipelines, governance, and utilisation matter. AI doesn’t change the world when it’s impressive. It changes the world when it’s dependable.

In five years, what will matter more for winning the AI race: chips, data, or storage architecture?

Chips will improve – that’s a given. The harder question is: who will be able to convert compute into sustained national and economic output? In five years, the winners won’t simply be the ones with the most advanced silicon. They’ll be the ones who can run AI factories efficiently at scale – keeping systems productive, secure, and cost-effective over time.

That comes down to data: how it’s stored, moved, governed, and delivered – and how well the system is integrated end-to-end. And for sovereign AI, there’s an added dimension: performance matters, but control and independence matter just as much.

What are the opportunities you see in the Indian market?

India is at a defining moment in its AI journey. It’s moving from adoption to building long-term national capability, and that shift creates enormous opportunity. We see strong demand across sovereign AI platforms, financial services, healthcare, and large-scale multilingual digital services. These aren’t isolated deployments. They are long-horizon infrastructure programmes, and they require industrial-grade data foundations.

India is also strategic for DDN in three very real ways. First, we have deep talent here – our teams in Pune and Bengaluru represent roughly a quarter of our global workforce. Second, we align with national priorities through local manufacturing and partnerships. And third, we are supporting sovereign-scale deployments, including Yotta’s Shakti Cloud under the IndiaAI Mission with 8,000 Nvidia B200 GPUs. For DDN, India is not just a market. It’s a growth engine and an innovation hub that influences what we build for the world.