Other updates to the service include new racks, Apigee integration, and survivability features.

Google has added new features to its distributed cloud offering of hardware that customers can run in their own data centers as part of their Google Cloud landscape. Google Distributed Cloud (GDC) is primarily aimed at customers with unusual data sovereignty, latency, or local data-processing requirements, and the new features unveiled at the Google Cloud Next conference run the gamut from flashy new generative AI tools to simple things such as being able to decide whether to dedicate more space in a rack to storage or to compute.

AI search

The new AI search feature, powered by Gemma, will enable organizations to retrieve and analyze data on premises or at the network edge, the company said. Under the hood, AI search uses the Gemma 7B model, Vertex AI for LLM serving, and pre-trained APIs (speech-to-text, translation, and optical character recognition) for data ingestion. It also supports the AlloyDB Omni pgvector extension for enterprises deploying a vector database inside GDC, the company said. AI search will be in preview in the second quarter of this year.

Other updates to Distributed Cloud include a sandbox environment, new racks, storage flexibility, survivability enhancements, and Apigee integration.

The sandbox added to GDC is a managed experience that helps application developers build and test services designed for GDC in a Google Cloud environment, without needing to navigate the air gap and physical hardware, the company said, adding that developers can use GDC services including virtual machines, containers, databases, and Vertex AI.

The cloud services provider has also added storage flexibility to Distributed Cloud, which will enable enterprises to increase their storage independent of compute.
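Google has not published GDC's AI-search internals, but the retrieval step a pgvector-style vector database performs is straightforward to illustrate: stored document embeddings are ranked by similarity to a query embedding. The sketch below is a minimal, self-contained illustration of that ranking in plain Python; the function names, the toy three-dimensional vectors, and the document IDs are all invented for the example, standing in for real embedding-model output.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, documents, k=2):
    """Rank stored (id, embedding) pairs by similarity to the query,
    most similar first, and return the top k document IDs."""
    ranked = sorted(documents,
                    key=lambda d: cosine_similarity(query_vec, d[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy embeddings standing in for real model outputs.
docs = [
    ("doc-a", [1.0, 0.0, 0.0]),
    ("doc-b", [0.9, 0.1, 0.0]),
    ("doc-c", [0.0, 1.0, 0.0]),
]
print(top_k([1.0, 0.05, 0.0], docs))  # → ['doc-a', 'doc-b']
```

In an actual AlloyDB Omni deployment with the pgvector extension, this ranking would be done in SQL rather than application code, using pgvector's cosine-distance operator, along the lines of `SELECT id FROM docs ORDER BY embedding <=> $1 LIMIT 2;` (table and column names here are hypothetical).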
This will help support large analytics and AI workloads, the company said, adding that it is providing options across block, file, and object storage.

New racks

Distributed Cloud, according to the company, will now have access to racks specially optimized for AI and general compute workloads. These racks offer the added flexibility of choosing network- or storage-optimized nodes for a deployment, the company said.

Google’s partnership with Nvidia will also provide enterprises with access to a new server for Google Distributed Cloud equipped with an energy-efficient Nvidia L4 Tensor Core graphics processing unit (GPU). “A single server can be deployed as a standalone appliance for use cases where high availability is not necessary. This offering is an addition to our AI-optimized servers with Nvidia H100 Tensor Core GPUs to accelerate your AI innovation on-prem,” the company said in a statement.

The updates to survivability include a disconnected mode with support for up to seven days, and a suite of offline management features to help ensure that deployments and workloads remain accessible and in working condition while disconnected.

Google will also support Apigee, its API management offering, on Distributed Cloud, it said.