The company specializes in inferencing with its analog, in-memory processor. Credit: Dell Technologies

Just six months after unveiling its first AI inferencing processor, Mythic AI has announced a new $70 million Series C funding round to begin mass production of its chips and to develop its next generation of hardware and software products.

In November, the company announced the M1108 Analog Matrix Processor (AMP), aimed at edge AI deployments across a wide range of applications, including manufacturing, video surveillance, smart cities, smart homes, AR/VR, and drones.

For a company that is nine years old and has zero sales, it's got some heavy hitters behind it. The new investment round was led by venture fund giant BlackRock and Hewlett Packard Enterprise (HPE). Other investors include Alumni Ventures Group and UDC Ventures.

Mythic M1108

Mythic AI said it will use the latest funding to accelerate its plans to begin mass production of the M1108 while expanding its support to customers globally, building up its software offerings, and developing the next generation of its hardware platform.

Inference is the second step in machine learning, following training, and has far lower compute requirements. Training demands the massive horsepower of GPUs, FPGAs, and CPUs, but inference is essentially a yes/no comparison, and a full-blown CPU is overkill. By way of comparison, Intel's early stab at an inference processor, the Nervana (since discontinued), consumed as little as 10 watts. A CPU consumes 200 watts and a GPU up to 500 watts. The M1108 would use as little as 4 watts, so you can see why it might be ideal for a low-power edge deployment.

The M1108 chips are analog processors designed to provide high performance with low power requirements. At the heart of the M1108 is the Mythic Analog Compute Engine (ACE), which performs analog compute-in-memory, executing deep neural-network models and storing their weight parameters on-chip with no external DRAM. Each Mythic ACE is complemented by a digital subsystem that includes a 32-bit RISC-V nano processor, a SIMD vector engine, 64KB of SRAM, and a high-throughput network-on-chip router.

It uses an M.2 design, which is becoming rather popular among SSDs. M.2 is about the size of a stick of gum and plugs into the motherboard, lying flat. Depending on the motherboard, M.2 uses PCI Express Gen3 or Gen4. The Mythic processor board has a four-lane PCIe interface with up to 2GB/s of bandwidth.

Device makers and original-equipment manufacturers (OEMs) can choose from the single-chip M1108 Mythic AMP or a variety of PCIe card configurations, including the M.2 M-key and M.2 A+E-key form factors, to fit a variety of needs.

On the software side, the M1108 supports standard machine-learning frameworks such as PyTorch, TensorFlow 2.0, and Caffe.

Mythic has not said when it plans to bring its products to market.
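To make the compute-in-memory idea above a little more concrete, here is a rough, purely illustrative Python sketch: quantized weights stay resident (standing in for the analog array, with no trips to external DRAM) while activations are streamed through a matrix-vector multiply. The names and the 8-bit quantization scheme are assumptions for illustration, not Mythic's design or API.

```python
# Illustrative sketch only: models the *idea* of compute-in-memory,
# where quantized weights stay resident and activations are streamed
# through a matrix-vector multiply. This is NOT Mythic's implementation.
import numpy as np

rng = np.random.default_rng(0)

# "Weights" quantized to 8 bits and kept resident, standing in for the
# values an analog array would hold on-chip.
w_float = rng.standard_normal((256, 128)).astype(np.float32)
scale = np.abs(w_float).max() / 127.0
w_int8 = np.round(w_float / scale).astype(np.int8)

def in_memory_matvec(activations: np.ndarray) -> np.ndarray:
    """Stream activations through the resident weight array (conceptual MVM)."""
    return (w_int8.astype(np.int32) @ activations).astype(np.float32) * scale

x = rng.standard_normal(128).astype(np.float32)
print(in_memory_matvec(x)[:4])   # low-precision, in-"memory" result
print((w_float @ x)[:4])         # full-precision reference for comparison
```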
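Mythic has not detailed its compiler flow, but framework support of this kind usually means developers train in PyTorch or TensorFlow and then hand a frozen model to the vendor's tools. The snippet below is a minimal, hypothetical PyTorch example of that handoff using ONNX as the interchange format; whether Mythic's toolchain accepts ONNX is an assumption, and TinyClassifier is an invented stand-in model.

```python
# Minimal sketch, assuming a PyTorch-trained model is handed off to an
# edge-inference toolchain via ONNX. The export path is illustrative;
# Mythic's actual ingest format is not specified in this article.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    """A small CNN of the kind deployed for edge vision tasks."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = TinyClassifier().eval()         # inference mode: no gradients needed
dummy = torch.randn(1, 3, 224, 224)     # example input shape
with torch.no_grad():
    torch.onnx.export(model, dummy, "tiny_classifier.onnx", opset_version=13)
```

Using .eval() with torch.no_grad() mirrors the inference-only workload described above: no gradients or optimizer state are needed at the edge, which is exactly why the compute and power budget is so much smaller than for training.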