The facility, with sites in the US and South Korea, will develop chips to support the processing demands of ‘artificial general intelligence,’ which refers to AI that can perform as well as or better than humans.
Samsung has built a research lab dedicated to developing a new type of chip for the next iteration of artificial intelligence: AI that is even smarter than humans.
The Samsung Semiconductor AGI Computing Lab will be under the direction of Dong Hyuk Woo, Samsung senior vice president, and will be located both in the US and South Korea.
The lab’s sole purpose will be to build a semiconductor designed to meet the compute-intensive processing demands of what Samsung calls “artificial general intelligence,” or AGI, according to a LinkedIn post by Kye Hyun Kyung, CEO of Samsung Semiconductor.
Samsung defines AGI as AI whose models have intelligence capabilities equal to or greater than humans’ and that learn on their own, without first needing to be trained on human data. This type of technology is significantly more computationally intensive than the current large language models (LLMs) associated with today’s AI, which must be trained by humans on various data sources.
From LLM to AGI
Samsung’s lab will first focus on developing chips for LLMs, specifically on AI inference and service applications, Kyung said. This will set up the development of more sophisticated chips for AGI down the road, he said.
“To develop chips that will dramatically reduce the power necessary to run LLMs, we are revisiting every aspect of chip architecture, including memory design, light-weight model optimization, high-speed interconnect, advanced packaging, and more,” Kyung wrote in the post.
Over time, Samsung plans to release successive versions of the AGI Computing Lab’s chip designs in “an iterative model that will provide stronger performance and support for increasingly larger models at a fraction of the power and cost,” he said.
“Through the creation of the AGI Computing Lab, I am confident that we will be better positioned to solve the complex system-level challenges inherent in AGI, while also contributing affordable and sustainable methods for the future generation of advanced AI/ML models,” Kyung wrote.
Untapped market potential
Samsung’s move appears in part to be an effort to find new revenue streams in an as-yet untapped market, as its core memory business has become a commodity, noted Gaurav Gupta, VP analyst for emerging trends and technologies at Gartner.
“They are looking for another opportunity to grow,” he said. “This is where chips for inference come in.”
Indeed, most companies that build components for computer processing and memory are trying to keep pace with the rapid evolution of AI, each pursuing its own strategy for providing cost-effective computing resources.
Currently, the generative AI chip market for training models is dominated largely by Nvidia, with AMD holding some share in the space, Gupta said. But those models run on GPUs, which can be scarce and costly, and thus aren’t a long-term solution for running AI models.
Companies are looking for alternative resources to run inference-driven AI, and Samsung hopes to be an early mover in offering semiconductors for this technology, Gupta said.
“When it comes down to AI deployment at edge or endpoint for use-cases, it is expected that inference will be done on custom chips [that are] designed specifically to run fine-tuned models,” he said. “This is what Samsung wants to do – get into design of these chips and enter this market, which is yet to really take off.”