Why Blackstone’s $1.2 billion bet on Neysa matters for India’s AI future



The deal comprises about $600 million in primary equity from Blackstone and co-investors, with plans to add up to another $600 million in debt financing for expansion.

Unlike consumer-facing AI tools, Neysa occupies the layer underneath: the compute infrastructure, essentially the Graphics Processing Unit (GPU)-powered data centres that train and run AI models.

The company plans to deploy 20,000 GPUs across India as part of a structural response to the country’s broader push for sovereign AI compute that is local, performant, and policy-aligned.

In an interview, Sharad Sanghi, chief executive of Neysa, explains why this layer matters, where the startup fits in, how India’s policy environment is shaping demand, and what the future could look like. Edited excerpts:

Where does Neysa fit in India’s AI stack?

At its core, Neysa provides the infrastructure layer that powers modern AI. Think of it as the engine room.

AI models, whether they are performing compliance automation or fraud detection, require significant compute to train, fine-tune, and operate. That compute resides in data centres built around specialized hardware called GPUs. We provide GPU compute, but also a platform layer where enterprises can run inference, build models with notebooks, use orchestration tools, and integrate third-party applications: essentially, AI infrastructure and platform as a service. This lets organizations deploy AI workloads without owning and managing the physical hardware themselves.

How does this infrastructure layer connect with India’s larger sovereign AI debate?

When people talk about indigenous models or DeepSeek, they often miss the compute backbone needed to actually run those models. There was a narrative that efficiency improvements might mean less compute. That's partly true, but frontier models are also getting larger, so demand is moving in both directions. DeepSeek and other initiatives spurred urgency, and the government accelerated the India AI Mission, funding domestic model efforts. But trained models still need GPUs, and lots of them, ideally within the country for data sovereignty, compliance, and performance. India has 22 official languages and hundreds of dialects; you can't serve that without local data and local compute. All of that requires infrastructure.

Hyperscalers like AWS and Azure already operate in India. Why would enterprises choose an Indian AI cloud?

Investors asked us that repeatedly.

There are a few differentiators. One, flexibility. Hyperscalers have standardized products that are hard to customize at scale. We can tailor better.

Second is security. Many Indian enterprises want air-gapped environments, not multi-tenant infrastructure.

Third is pricing. Focusing solely on AI workloads tends to be more cost-efficient.

Fourth is local GPU capacity. There has not been enough GPU capacity in India, which pushed companies to use overseas clouds. As data localization frameworks like the Digital Personal Data Protection Act, 2023 (DPDP Act) become enforceable, local infrastructure becomes compelling.

And fifth is support. Enterprises often have application teams but lack infrastructure expertise. We bridge that gap.

We’ve migrated customers from hyperscalers without losing them.

How do data localization and policy affect your strategy?

Data localization will help, but we didn’t build Neysa because of policy.

As India’s DPDP Act regime takes shape, certain data must remain within the country. That naturally increases demand for localization-aligned compute.

Our goal is to be globally competitive, not just local. Independent global analysts have rated us alongside international AI clouds, the only Indian entrant to get such recognition. That’s a benchmark we’re striving for.

Which sectors are currently driving real demand for your services?

In the enterprise space, financial services lead. Then manufacturing, healthcare and media.

Among startups, fintech is the most active. In research, seven IITs and IISc use our infrastructure. On the government side, we’re engaged with centres of excellence in agriculture, education, and manufacturing, and soon, health. We’re also talking to global labs that need capacity, regardless of sovereignty concerns.

Deploying 20,000 GPUs in India is ambitious. What bottlenecks do you anticipate?

Two main constraints:

One is data-centre space and energy. AI workloads are power-intensive. But with partners and Blackstone’s portfolio, we’ve secured access to capacity.

Second is GPU supply. Lead times from Nvidia remain around four to five months. High-bandwidth memory remains scarce globally, with most capacity sold out through 2026. We’ll bring smaller clusters online sooner, and the larger ones will take around six to nine months.

Looking ahead, how do you see the future of AI compute and infrastructure in India evolving?

The space is moving very fast. New models are emerging, and enterprise demand is evolving with them. Sovereign AI isn't just a policy slogan anymore. It's about production-grade infrastructure that can support research, enterprise, and government at scale. If India is to move from model announcements to operational deployments, the decisive layer will be compute: where data resides, how performant it is, and how scalable the infrastructure can be.

That’s where we see the next phase of the AI stack being built.


