
Big Pharma’s New Power Plant: Inside Lilly + Nvidia’s “AI Factory” for Faster Drugs

Lilly Chair and CEO Dave Ricks speaks during a press conference for Eli Lilly and Company in Houston, Texas, US, Sept. 23, 2025 (Antranik Tavitian / Reuters)

CNBC, Reuters, and Nvidia contributed to this report.

Eli Lilly and Nvidia are teaming up on a moonshot: build what they say will be the most powerful supercomputer in the pharma industry and use it as an “AI factory” to speed up how medicines are discovered, designed, tested, and ultimately made. It’s part of a bigger bet sweeping biotech right now — that industrial-scale AI can shave years off development timelines and slash costs along the way.

Lilly says the build is wrapping in December and will switch on in January. Nobody’s pretending miracle cures arrive next spring; executives are candid that the real dividends won’t show up until late in the decade. As Lilly’s chief information and digital officer Diogo Rau puts it, the kinds of discoveries enabled by this much compute are more likely to hit around 2030. Still, the scope here is eye-popping — and the ripple effects could reach far beyond Lilly’s own pipeline.

At the heart of the project is a customized Nvidia DGX SuperPOD, the first in pharma built on DGX B300 systems, stacked with 1,016 Nvidia Blackwell Ultra GPUs, lashed together with Nvidia Spectrum-X high-speed Ethernet, and run on Nvidia's full-stack AI software. Lilly will own and operate the system itself. For context, Lilly notes that a single Blackwell Ultra can outrun what used to take millions of old Cray machines. In aggregate, the setup is rated at over 9,000 petaflops of AI performance, north of 9 quintillion operations per second. Translation: a staggering amount of number-crunching for biology.
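
For readers who want to sanity-check that figure, the unit math below works it out, assuming the standard convention that one petaflop is 10^15 operations per second; the rated number itself comes from Lilly and Nvidia.

```python
# Unit check on the quoted figure, assuming "petaflop" = 10**15 operations per second.
PETAFLOP = 10**15
rated_petaflops = 9_000                  # "over 9,000 petaflops", per Lilly and Nvidia
ops_per_second = rated_petaflops * PETAFLOP
print(f"{ops_per_second:e}")             # 9.000000e+18, i.e. roughly 9 quintillion ops/s
```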

Nvidia’s Mission Control will orchestrate workloads across the thousand-plus GPUs, while the software stack plugs straight into healthcare staples: Nvidia BioNeMo for large-scale biomolecular models, Nvidia Clara open foundation models for imaging and clinical data, and MONAI for medical imaging research. The goal isn’t just to “run AI,” it’s to standardize how Lilly trains, tunes, and deploys huge models in a highly regulated environment — securely and repeatably.
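
To make the "plugs straight into healthcare staples" point concrete, here is a minimal sketch of what building on MONAI typically looks like for a 3D imaging model. It is a generic open-source example with placeholder inputs, not Lilly's pipeline.

```python
# Minimal MONAI sketch: a standard preprocessing chain feeding a 3D image classifier.
# Generic illustration only; the scan path and the two output classes are placeholders.
from monai.transforms import Compose, LoadImage, EnsureChannelFirst, ScaleIntensity, Resize
from monai.networks.nets import DenseNet121

preprocess = Compose([
    LoadImage(image_only=True),      # read a NIfTI/DICOM volume from disk
    EnsureChannelFirst(),            # shape -> (channels, H, W, D)
    ScaleIntensity(),                # normalize voxel intensities to [0, 1]
    Resize((96, 96, 96)),            # resample to a fixed grid for the network
])

model = DenseNet121(spatial_dims=3, in_channels=1, out_channels=2)  # e.g., progression yes/no

volume = preprocess("example_scan.nii.gz")    # placeholder path, not a real file
logits = model(volume.unsqueeze(0))           # add a batch dimension
print(logits.shape)                           # torch.Size([1, 2])
```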

Thomas Fuchs, Lilly’s chief AI officer, calls the supercomputer “a novel scientific instrument… like an enormous microscope for biologists.” The idea is to train frontier models on millions of experiments, then use them to design and prioritize better candidates — antibodies, nanobodies, small molecules — far faster than traditional discovery can.

Discovery is the obvious headline. Feed vast piles of real experimental results (plus public research) into foundation models; generate new candidates that match target biology; rank and refine them; and repeat. Lilly says BioNeMo-powered models will help chemists hunt for new “motifs and configurations of atoms” that were essentially invisible with old methods. If that sounds abstract, think of it as turning drug hunting from artisanal craft into high-throughput, model-guided search.
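
As a rough illustration of that generate, rank, refine, and repeat loop, here is a toy model-guided search in plain Python. The propose_variants and score_affinity functions are hypothetical stand-ins for calls into trained biomolecular models; nothing here is Lilly's or BioNeMo's actual code.

```python
# Illustrative sketch of a model-guided candidate search loop. propose_variants() and
# score_affinity() are hypothetical stand-ins for calls into trained biomolecular models.
import random

def propose_variants(seed: str, n: int) -> list[str]:
    """Hypothetical generator: mutate one position of a candidate sequence."""
    alphabet = "ACDEFGHIKLMNPQRSTVWY"
    variants = []
    for _ in range(n):
        i = random.randrange(len(seed))
        variants.append(seed[:i] + random.choice(alphabet) + seed[i + 1:])
    return variants

def score_affinity(candidate: str) -> float:
    """Hypothetical scorer: in practice a learned model would rank target binding."""
    return (sum(ord(c) for c in candidate) % 100) / 100.0

def guided_search(seed: str, rounds: int = 5, beam: int = 8) -> str:
    """Generate -> rank -> refine -> repeat, keeping the top-ranked candidate each round."""
    best = seed
    for _ in range(rounds):
        pool = propose_variants(best, beam) + [best]
        best = max(pool, key=score_affinity)
    return best

print(guided_search("MKTAYIAKQR"))
```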

But Lilly’s not stopping at the lab bench. The same factory will touch development and operations:

  • Clinical development: Large language models to accelerate medical writing, trial document prep, and more; imaging models to discover new biomarkers and track disease progression; patient stratification to target the right therapy to the right person — core precision-medicine building blocks.
  • Manufacturing & supply chain: Using Nvidia Omniverse and RTX PRO Servers, Lilly plans digital twins of production lines, virtual replicas used to model bottlenecks, stress-test quality changes, and optimize throughput before touching the physical plant (see the sketch after this list). Less downtime, faster scale-up.
  • Robotics: With Nvidia Isaac and Isaac Sim, the company can prototype and deploy smarter robots for inspection and materials movement, helping keep factories humming. In a business where line stoppages delay medicine for patients, even marginal gains matter.
  • Agentic AI in the lab: With Nvidia NeMo, Lilly wants AI agents that can reason and plan across digital and physical workflows — proposing molecules in silico, coordinating experiments in vitro, and tightening the loop between them. As Rau puts it, the point isn’t machine learning for its own sake; it’s “machines helping make humans smarter” by surfacing ideas we might never test otherwise.
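
On the digital-twin item above, the practical value is being able to ask "what happens to throughput if we change this?" in software before touching hardware. The toy queue simulation below illustrates that question in plain Python; it is not Omniverse and not Lilly's plant model, just a sketch with invented rates.

```python
# Toy "digital twin" of a two-stage line (fill -> pack): compare a proposed change in
# software before touching the physical plant. Rates are invented for illustration; a
# real twin (e.g., built in Omniverse) would be far richer than this sketch.
import random

def simulate_line(fill_time: float, pack_time: float, units: int = 1000, seed: int = 0) -> float:
    """Return throughput in units/hour for a serial line with exponential stage times (minutes)."""
    rng = random.Random(seed)
    fill_done = 0.0   # when the latest unit finishes filling
    pack_done = 0.0   # when the latest unit finishes packing
    for _ in range(units):
        fill_done += rng.expovariate(1.0 / fill_time)
        pack_done = max(fill_done, pack_done) + rng.expovariate(1.0 / pack_time)
    return units / (pack_done / 60.0)

baseline = simulate_line(fill_time=1.0, pack_time=1.4)   # packing is the bottleneck
proposal = simulate_line(fill_time=1.0, pack_time=1.1)   # hypothetical faster packer
print(f"baseline: {baseline:.0f} units/h, proposal: {proposal:.0f} units/h")
```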

A standout twist is Lilly TuneLab, the company’s AI/ML platform that gives biotech partners access to select Lilly-trained models — including ones built on roughly $1 billion worth of proprietary data. It’s arranged as federated learning (via Nvidia FLARE), which means startups can benefit from Lilly’s models without either party sharing raw data. Contribute your own results back to improve the models, keep your data private, and skip years of expensive bootstrapping.
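
For a sense of how the federated arrangement keeps data private, here is a minimal federated-averaging sketch: each site updates a shared model on data that never leaves it, and only the updated weights are averaged centrally. This is the generic FedAvg idea in NumPy for illustration, not Nvidia FLARE's actual API and not how TuneLab is implemented.

```python
# Minimal federated-averaging (FedAvg) sketch: partners train on private data locally
# and only model weights are shared and averaged. Generic NumPy illustration, not
# Nvidia FLARE's API and not TuneLab's implementation.
import numpy as np

def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """One gradient step of linear regression on data that never leaves the site."""
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(global_w: np.ndarray, sites: list) -> np.ndarray:
    """The server averages locally updated weights; raw data is never pooled."""
    return np.mean([local_update(global_w.copy(), X, y) for X, y in sites], axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
sites = []
for _ in range(3):                                   # three partners with private datasets
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    sites.append((X, y))

w = np.zeros(2)
for _ in range(200):
    w = federated_round(w, sites)
print(w)                                             # approaches [2, -1] without pooling raw data
```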

That's a big deal for cash-constrained biotechs: Nvidia's healthcare VP Kimberly Powell calls it an "extra starting point" that can save startups from "burning their capital" just to reach baseline. In theory, the more participants, the smarter the models, and the better the odds that new therapies emerge from unlikely corners of the ecosystem.

Let's level-set. There are still no drugs on the market designed using AI (yet). But the pipeline is swelling: more AI-generated candidates are entering preclinical and early clinical phases, big pharmas are striking platform deals, and infrastructure is finally catching up to the hype. Analysts at Jefferies estimate AI-related R&D spend could reach $30–$40 billion by 2040, which sounds wild until you remember that traditional drug development can top $2 billion per successful asset and take 10-plus years from first-in-human to launch.

Rau's timeline, with real business impact closer to 2030, is both sober and encouraging. Discovery isn't a sprint to press-release milestones; it's a grind of designing, testing, failing, and iterating. The bet behind the AI factory is that you can iterate vastly faster and fail smarter, steering capital to winners earlier and recycling the learnings at industrial scale.

For Nvidia, this is a flagship use case that stitches together nearly every piece of its healthcare stack: accelerated computing (Blackwell Ultra GPUs), networking (Spectrum-X), orchestration (Mission Control), foundational science platforms (BioNeMo, Clara, MONAI), industrial simulation (Omniverse), robotics (Isaac), and LLM tooling (NeMo). If it works, it’s a blueprint other pharmas will copy.

For Lilly, it’s about becoming AI-native, not AI-curious. The company’s already been tagged by CB Insights as the most “AI-ready” pharma. Pair that with a stated $50 billion expansion in US manufacturing and R&D — including four new facilities and a proposed $4.5 billion Lilly Medicine Foundry in Indiana — and you start to see a vertically integrated plan: generate better candidates, prove them faster, make them at scale, and do it all under one AI-enabled roof.

Jobs? Lilly projects about 13,000 high-wage roles tied to the broader build-out, with roughly 500 at the Medicine Foundry alone. Those are economic development numbers, yes, but they also hint at the scope of what “AI factory” really means: not just a data center, but an operating model.

The big “ifs” that still matter:

  • Regulatory alignment: Running powerful models is great. Getting them accepted by regulators — especially for decision-critical steps like patient selection or adaptive trial designs — requires rigorous validation, transparency, and audit trails. The good news: the FDA’s push to reduce animal testing opens the door to AI-augmented safety science, but the bar for evidence will remain high.
  • Data governance: Federated learning is privacy-preserving by design, but it doesn’t eliminate governance headaches. Contracts, provenance, bias checks, and monitoring for model “drift” are table stakes when real patients are at the end of the pipeline.
  • Talent and culture: AI doesn’t replace chemists, biologists, or manufacturing pros — it changes their toolset. Upskilling thousands of people across discovery, development, quality, and supply will be as decisive as the hardware.

That’s the bet. The supercomputer gives Lilly the raw horsepower; the AI factory provides the playbook; TuneLab spreads the benefits across the ecosystem; and the manufacturing twins + robotics tackle the back half of the value chain. Even if timelines to market don’t collapse overnight, shaving months in discovery, quarters in development, and weeks in scale-up adds up — especially across a large portfolio.

As Fuchs puts it, we’re “just scratching the surface” of AI in medicine design. With a thousand Blackwell Ultras at its back, Lilly now has the chance to find out how deep that surface really goes.

Wyoming Star Staff
