Domino Data Lab, provider of the leading Enterprise MLOps platform trusted by over 20% of the Fortune 100, announced its new Nexus hybrid Enterprise MLOps architecture that will allow companies to rapidly scale, control and orchestrate data science work across different compute clusters — in different geographic regions, on premises, and even across multiple clouds.
Despite the attention paid to cloud migration, concerns over cost, security and regulations compel a growing majority of enterprises to adopt AI infrastructure strategies that straddle on-premises data centers and the cloud. 66% of IT decision makers have already invested in hybrid support for AI workload development, Forrester Consulting found, and 91% plan to do so within two years.[1] First previewed at Domino’s Rev 3 conference, the new Nexus architecture enables Enterprise MLOps for this new reality. It delivers the portability and cost management that enterprises require for AI development and deployment, and the flexibility that data science teams need, to accelerate breakthrough innovation at scale.
“Though the shift to cloud is on, a growing number of enterprises have some type of on-premises and cloud-based architecture currently in place,” said Melanie Posey, Research Director for Cloud & Managed Services Transformation at 451 Research, a part of S&P Global Market Intelligence. “The reality is that cost optimization persists as an ongoing issue for both cloud veterans and cloud beginners.”[2]
Nexus is a highly scalable hybrid Enterprise MLOps platform architecture delivering enterprises the best of both worlds: the cost benefits of on-premises infrastructure and the flexibility to quickly scale to the cloud using a single control point. Customers gain maximum cost optimization by leveraging owned on-premises NVIDIA GPUs, and the ability to move workloads to cloud-based GPUs when additional capacity is needed – all without sacrificing reliability, security or usability.

[1] “Solving The Challenge Of Enterprise AI Infrastructure: IT Platform Trends That Are Scaling AI Success,” a commissioned study conducted by Forrester Consulting on behalf of NVIDIA and Advanced Micro Devices, September 2021
[2] 451 Research, a part of S&P Global Market Intelligence, “Public cloud pushes further into IT estates, but the future is hybrid – Highlights from VotE: Cloud, Hosting & Managed Services,” June 4, 2021
“Enterprise data science and IT organizations are consistently asking for more infrastructure flexibility to optimize compute spend, strengthen data security, and avoid vendor lock-in,” said Nick Elprin, CEO and co-founder of Domino Data Lab. “Our Nexus architecture will help our customers unleash data science while future-proofing their infrastructure investments.”
Domino Expands Collaboration with NVIDIA as First Nexus Launch Partner
Domino has already begun development of Nexus with NVIDIA as a launch partner, an effort that will include specific solution architectures validated for NVIDIA technologies, with a release targeted for later this year. Today, enterprise IT teams can learn how to scale data science workloads by taking a free, immediately available hands-on lab that includes the Domino Enterprise MLOps Platform and the NVIDIA AI Enterprise software suite, accessed on NVIDIA LaunchPad.
To unlock further competitive advantages through innovative AI-enabled use cases, Domino has also joined the NVIDIA AI Accelerated program, which enables software and solution partners to leverage the NVIDIA AI platform and its expansive libraries and SDKs to build accelerated AI applications for customers. Domino continues to collaborate with NVIDIA on streamlining the development, deployment, and management of GPU-trained models across a variety of computing platforms, from on-premises infrastructure to edge devices, leveraging Domino and the NVIDIA AI platform, which includes NVIDIA AI Enterprise and NVIDIA Fleet Command. Hybrid MLOps continues Domino’s vision of designing the most innovative, flexible solutions that balance customer needs and enable the most effective data science work.