
NVIDIA Launches DGX Cloud Lepton to Link Developers Globally


NVIDIA introduced NVIDIA DGX Cloud Lepton™, an AI platform with a compute marketplace that connects developers building agentic and physical AI applications to tens of thousands of GPUs from a global network of cloud providers.

To support soaring demand for AI compute, NVIDIA Cloud Partners (NCPs) such as CoreWeave, Crusoe, Firmus, Foxconn, GMI Cloud, Lambda, Nebius, Nscale, SoftBank Corp., and Yotta Data Services will offer NVIDIA Blackwell and other NVIDIA architecture GPUs through the DGX Cloud Lepton marketplace.

The platform gives developers access to GPU compute capacity in specific regions, for both on-demand and long-term use, to meet operational, strategic, and sovereign AI requirements. Leading cloud service providers and GPU marketplaces are also expected to join the DGX Cloud Lepton ecosystem.

“NVIDIA DGX Cloud Lepton connects our network of global GPU cloud providers with AI developers,” said Jensen Huang, founder and CEO of NVIDIA. “Together with our NCPs, we’re building a planetary-scale AI factory.”


Addressing the critical challenge of securing dependable, high-performance GPU resources, DGX Cloud Lepton streamlines access to cloud AI services and GPU capacity throughout the NVIDIA compute ecosystem. The platform seamlessly integrates with NVIDIA’s software stack — including NVIDIA NIM™, NeMo™ microservices, NVIDIA Blueprints, and NVIDIA Cloud Functions — to accelerate and simplify AI application development and deployment.
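
As a rough illustration of what that stack integration can look like from a developer's side, the sketch below queries a NIM microservice through its documented OpenAI-compatible API. The endpoint URL, API key, and model name are placeholders and depend on the specific NIM deployment, not values tied to DGX Cloud Lepton itself.

```python
# Minimal sketch: calling an NVIDIA NIM microservice via its OpenAI-compatible API.
# The base_url, api_key, and model id are placeholders for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # placeholder NIM endpoint
    api_key="not-used-for-local-nim",     # placeholder credential
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",   # example NIM model id
    messages=[{"role": "user",
               "content": "Summarize what an AI compute marketplace does."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```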

For cloud providers, DGX Cloud Lepton delivers advanced management software that offers real-time GPU health diagnostics and automates root-cause analysis, significantly reducing manual operations and minimizing downtime.

Key Benefits of NVIDIA DGX Cloud Lepton:

  • Enhanced Productivity & Flexibility: Developers get a unified experience across development, training, and inference. They can purchase GPU capacity directly from cloud providers in the marketplace or bring their own compute clusters, giving them greater control and flexibility.

  • Seamless Deployment: AI applications can be deployed effortlessly across multi-cloud and hybrid environments with minimal operational overhead, utilizing integrated services for inference, testing, and training workloads.

  • Agility & Sovereignty: Quick access to GPU resources in specific regions ensures compliance with data sovereignty laws and meets low-latency requirements for sensitive AI workloads.

  • Consistent, Enterprise-Grade Performance: Participating cloud providers deliver reliable, secure, and high-performance GPU services to ensure a predictable and smooth user experience.

Setting a New Standard for AI Cloud Performance

Alongside DGX Cloud Lepton, NVIDIA announced NVIDIA Exemplar Clouds — a program designed to help NCPs enhance security, usability, performance, and resiliency by leveraging NVIDIA’s expertise, reference hardware and software, and operational tools.

Powered by NVIDIA DGX™ Cloud Benchmarking, this initiative provides an extensive suite of tools and performance-optimization recipes that enable workload fine-tuning on AI platforms and help quantify the balance between cost and performance.
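
As a simple, hypothetical illustration of the kind of cost-versus-performance comparison such benchmarking can inform, the sketch below converts an aggregate throughput and an hourly GPU price into cost per million tokens. All figures are invented for illustration and are not benchmark results.

```python
# Hypothetical illustration of the cost-vs-performance trade-off that
# benchmarking recipes help quantify. All numbers below are invented.

def cost_per_million_tokens(cluster_tokens_per_sec: float,
                            dollars_per_gpu_hour: float,
                            num_gpus: int) -> float:
    """Dollars spent per one million tokens at the given aggregate throughput."""
    cost_per_hour = dollars_per_gpu_hour * num_gpus
    tokens_per_hour = cluster_tokens_per_sec * 3600
    return cost_per_hour / (tokens_per_hour / 1_000_000)

# Two made-up configurations: fewer, pricier GPUs vs. more, cheaper ones.
print(cost_per_million_tokens(cluster_tokens_per_sec=120_000,
                              dollars_per_gpu_hour=6.00, num_gpus=8))
print(cost_per_million_tokens(cluster_tokens_per_sec=200_000,
                              dollars_per_gpu_hour=4.00, num_gpus=16))
```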