Reported $700 million acquisition is aimed at helping AI users with workload management and resource allocation.
Nvidia is acquiring Run:ai, an Israeli software firm best known for creating an efficient orchestration layer between AI hardware and workloads.
Run:ai’s Kubernetes-based software layer provides complex scheduling and performance optimization technology, which can give customers better control over resource allocation in their AI computing infrastructure, according to Nvidia.
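For illustration only, here is a minimal sketch of how a GPU workload might be submitted to a Kubernetes cluster that relies on a custom scheduling layer of the kind Run:ai provides. It uses the standard Kubernetes Python client and the stock nvidia.com/gpu device-plugin request; the scheduler name, project label, and container image are illustrative assumptions, not Run:ai's documented configuration.

```python
# Minimal sketch: submitting a GPU pod to a Kubernetes cluster whose scheduling
# is handled by a custom scheduler layer. The scheduler name and project label
# below are illustrative assumptions, not Run:ai's documented configuration.
from kubernetes import client, config

config.load_kube_config()  # authenticate using the local kubeconfig

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(
        name="training-job",
        labels={"project": "team-a"},  # hypothetical project/team label
    ),
    spec=client.V1PodSpec(
        scheduler_name="runai-scheduler",  # assumed name of the custom scheduler
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="nvcr.io/nvidia/pytorch:24.03-py3",  # example image
                command=["python", "train.py"],
                resources=client.V1ResourceRequirements(
                    # Standard Kubernetes device-plugin request for one GPU;
                    # the scheduling layer decides when and where it runs.
                    limits={"nvidia.com/gpu": "1"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```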
“Customer AI deployments are becoming increasingly complex,” Nvidia said in its blog post announcing the acquisition. “Managing and orchestrating generative AI, recommender systems, search engines and other workloads requires sophisticated scheduling to optimize performance at the system level and on the underlying infrastructure.”
Run:ai’s orchestration layer will work with any physical configuration, Nvidia said, whether that’s on premises, in the cloud, or in a hybrid environment.
IDC research manager Madhumitha Sathish said that the layer can help ease GPU resource constraints through techniques like batch scheduling and advanced queuing mechanisms. “Run:ai’s platform also maintains tool flexibility as researchers and data scientists use different data science tools across different regions and teams,” she said.
The two companies have worked closely together since 2020, according to Nvidia, which said that it will continue to offer Run:ai’s products through its current subscription-based model for the “immediate future.”
Run:ai’s orchestration technology is important for modern AI users, according to Peter Rutten, a research vice president at IDC, but the company’s existing close relationship with Nvidia means that the deal doesn’t appreciably change the nature of Nvidia’s offerings to the AI market.
“Run:ai is a great solution for managing and orchestrating GPU workloads,” he said. “But I would say this is an incremental improvement, not a transformative one.”
For one thing, Rutten said, Nvidia already has a container environment, as well as its AI Enterprise suite designed to help manage AI workloads. Run:ai handles that task better, allowing for “much more control” over a given pool of computing resources, but it’s “not anything game-changing,” he said.
The move does advance Nvidia’s apparent longer-term AI strategy, however, according to Sathish. The idea is to bring the AI vendor ecosystem closer to Nvidia, with the end goal of creating a more complete software stack that it can offer to the market.
“Run:ai’s platform is a good add-on and also an essential for companies that are in the business of scaling AI or running data science at scale,” Sathish said. “There is always a need for faster model iteration and deployment for better business value.”
Gartner distinguished VP analyst Arun Chandrasekaran said that the acquisition positions Nvidia’s AI offerings well for a changing AI marketplace.
“As more smaller and open models evolve in the marketplace, Nvidia envisions more customization and self-hosting of these models, where Run:ai can add value in terms of infrastructure efficiency and collaborative development,” he said.
Terms of the deal were not disclosed, but Israeli newspaper Calcalist placed its value at $700 million.