At Mobile World Congress, Qualcomm is demonstrating its vision for better exploiting artificial intelligence with the help of on-device computing and connectivity.

Credit: Irene Iglesias / Computerworld España

While last year it was still a matter of tapping into the potential of ChatGPT and similar services via browsers and apps, efforts are now being made to run generative AI directly on the end device, or in a hybrid mode. From the device manufacturers' point of view, this newly awakened interest is understandable: AI offers an opportunity to get the somewhat dormant PC and smartphone market back on track. But do users in the enterprise also benefit?

On-device AI from Qualcomm

One of these players is Qualcomm. The company laid the foundation for AI-optimized silicon with the Snapdragon 8 Gen 3 smartphone chipset and the Snapdragon X Elite notebook processor, both unveiled at the end of 2023 and both with an integrated NPU (Neural Processing Unit). While notebooks equipped with the Snapdragon X Elite are not expected until the summer, the newly unveiled Xiaomi 14, Xiaomi 14 Ultra, and Honor Magic 6 Pro are recent examples of smartphones with the Snapdragon 8 Gen 3 that already use on-device AI in different ways.

At Mobile World Congress in Barcelona, the company demonstrated the performance of the Snapdragon SoC on an Android smartphone using the Large Language and Vision Assistant (LLaVA). According to Qualcomm, this is the first multimodal large language model (LLM) to run on a smartphone. The model, which reportedly has more than seven billion parameters, accepts not only text but also images and speech as prompts. In one of the demonstrations, images of different ingredients are shown, after which a recipe for those ingredients is created offline and the calorie count of the resulting meal is estimated.
In another demo, the open-source graphics program GIMP runs with a Stable Diffusion plugin on a Snapdragon X Elite laptop alongside an x86 laptop with an Intel Core Ultra 7, to show the benefits of hardware designed for generative AI. Qualcomm claims that the Snapdragon machine generates images three times faster than the system without an NPU.

A developer platform for AI models

In addition, Qualcomm unveiled the AI Hub. The platform offers a library of pre-optimized AI models for seamless deployment on devices powered by Snapdragon and Qualcomm platforms, including smartphones, PCs, and AR/VR devices. The list includes not only well-known generative AI models such as Stable Diffusion, Llama, and ControlNet, but also various models for speech recognition, image classification, object detection, image upscaling, and the like. According to Qualcomm, developers can run the models with just a few lines of code, even on cloud-hosted devices running Qualcomm platforms.

As a modem manufacturer, Qualcomm is of course not only focusing on on-device AI, but is convinced that combining it with the cloud can bring further benefits. "The future of generative AI is hybrid," Qualcomm CEO Cristiano Amon said. "The intelligence on the device works with the cloud to provide more personalization, privacy, reliability, and efficiency." Amon emphasized the importance of connectivity for scaling and extending generative AI across cloud, edge, and device.

AI-Optimized Connections

At the same time, AI can also help provide next-generation connectivity. For example, the newly introduced Snapdragon X80 5G modem-RF system, with up to six-carrier aggregation (6CA) and six receiver antennas (6Rx), uses AI to optimize the use of multiple antennas in a smartphone. According to Qualcomm, this improves signal quality, and thus data throughput, and also increases energy efficiency.
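Carrier aggregation of this kind bonds several component carriers into one logical link, so peak downlink throughput is roughly the sum of the per-carrier rates. A minimal illustrative sketch; the carrier bandwidths and spectral-efficiency figure below are assumptions for illustration, not Qualcomm specifications:

```python
# Simplified illustration of downlink carrier aggregation (e.g. 6CA):
# peak throughput is approximately the sum of the component-carrier rates.

def aggregated_throughput_mbps(carriers):
    """carriers: list of (bandwidth_mhz, spectral_efficiency_bps_per_hz)."""
    return sum(bw_mhz * 1e6 * eff for bw_mhz, eff in carriers) / 1e6

# Hypothetical 6CA configuration: six 100 MHz carriers at an assumed
# average spectral efficiency of 10 bit/s/Hz (MIMO plus high-order QAM).
six_ca = [(100, 10.0)] * 6
print(aggregated_throughput_mbps(six_ca))  # 6000.0 (Mbps), i.e. 6 Gbps
```

In practice the achievable rate also depends on signal quality per carrier, which is exactly where the AI-driven antenna management described above is supposed to help.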
The Snapdragon X80 also supports NB-NTN (Narrowband Non-Terrestrial Network) satellite connections.

The latest version of Qualcomm's radio chip, the FastConnect 7900, also uses AI to increase performance and improve energy efficiency. The main idea is to detect what the Wi-Fi connection is being used for (watching videos, listening to music, or online meetings, for example) and to optimize the connection accordingly. According to Qualcomm, this can cut energy consumption by up to 30 percent compared to operation without AI. In addition, many Wi-Fi parameters can be tuned to provide an optimal user experience. The FastConnect 7900 supports Wi-Fi 7, Wi-Fi 6E, and Wi-Fi 6 with peak speeds of up to 5.8 Gbps, and also integrates Bluetooth and ultra-wideband (UWB) on a single chip. It is also said to consume less power than the previous generation.
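The traffic-aware tuning described above can be pictured as mapping a detected usage class to a set of link parameters. A toy sketch of that mapping; the classes, parameter names, and values are invented for illustration and do not reflect the FastConnect 7900's actual firmware behavior:

```python
# Toy illustration of traffic-aware link tuning: a classifier (here just a
# lookup) decides the usage class, and the radio adopts a matching profile.
# Latency-sensitive traffic (meetings) polls often; buffered traffic
# (video, music) can sleep longer between wake-ups to save power.

PROFILES = {
    "video_streaming": {"poll_interval_ms": 100, "frame_aggregation": "high"},
    "music":           {"poll_interval_ms": 200, "frame_aggregation": "medium"},
    "online_meeting":  {"poll_interval_ms": 10,  "frame_aggregation": "low"},
}

def tune_link(usage_class):
    # Unknown traffic falls back to a conservative default profile.
    default = {"poll_interval_ms": 50, "frame_aggregation": "medium"}
    return PROFILES.get(usage_class, default)

print(tune_link("music")["poll_interval_ms"])  # 200
```

The claimed savings come from the fact that a radio idling longer between wake-ups draws less power, so correctly classifying buffered traffic directly translates into energy headroom.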