Memory expansion modules from Micron comply with Compute Express Link 2.0, which promises new security features and far more versatility than previous versions.

Micron has introduced memory expansion modules that support the 2.0 generation of Compute Express Link (CXL) and come with up to 256GB of DRAM running over a PCIe x8 interface.

CXL is an open interconnect standard with wide industry support, designed to connect machines so they can directly share the contents of memory. It is built on top of PCI Express for coherent memory access between a CPU and a device, such as a hardware accelerator, or between a CPU and memory. PCIe is normally used for point-to-point communication, such as between an SSD and memory, while CXL will eventually support one-to-many communication. So far, CXL is capable of simple point-to-point communication only.

Development of the CXL standard began in early 2019, but it has only recently come to market because it required a faster PCIe bus as well as native support from CPU vendors Intel and AMD; only their most recent CPUs support it. For now, the initial applications revolve around attaching DRAM to a PCIe interface.

That’s what Micron is offering with its CZ120 memory expansion modules. The modules are available in 128GB and 256GB capacities, which is large for a memory module. They use a dual-channel memory architecture capable of delivering a maximum of 36GB/s of bandwidth.

Ryan Baxter, senior director of the data center segment at Micron, said security features were paramount in this release. “There are a lot of security features in 2.0 that don’t exist or are not supported in 1.1,” Baxter said.

Security matters when servers are talking directly to each other’s memory. The CXL 2.0 standard now supports any-to-any communication encryption through hardware acceleration built into the CXL controllers, which means silicon providers do not have to build encryption into their own hardware.
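The quoted 36GB/s figure is in line with what a PCIe x8 link can carry. As a rough sanity check (assuming PCIe 5.0 signaling at 32 GT/s, which the article does not state but which matches the x8 interface and the quoted bandwidth), the link-rate arithmetic works out roughly as follows:

```python
# Back-of-the-envelope check of the CZ120's quoted 36GB/s bandwidth.
# Assumption (not from the article): the module uses PCIe 5.0 signaling.
GTRANSFERS_PER_LANE = 32          # PCIe 5.0 runs at 32 GT/s per lane
LANES = 8                         # the module uses a PCIe x8 interface
ENCODING_EFFICIENCY = 128 / 130   # PCIe 5.0 uses 128b/130b line encoding

raw_gbits = GTRANSFERS_PER_LANE * LANES              # 256 Gb/s per direction
usable_gbytes = raw_gbits / 8 * ENCODING_EFFICIENCY  # ~31.5 GB/s per direction

print(f"~{usable_gbytes:.1f} GB/s per direction")
print(f"~{2 * usable_gbytes:.1f} GB/s combined read+write")
```

Since PCIe is full-duplex, a mixed read/write workload can exceed the roughly 31.5 GB/s one-direction ceiling, so a measured 36GB/s combined figure is plausible on this sketch’s assumptions.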
Baxter said that a lack of security is why customers testing and deploying CXL 1.1 tended to experiment only with lower-capacity memory. Some customers might have deployed CXL 1.1 in internal workloads, but many avoided doing anything really ambitious while they waited for 2.0, he said.

CXL 2.0 will also support persistent memory, which stores data like NAND flash but is much faster, approaching DRAM speeds. CXL 2.0 enables distinct persistent-memory support as part of a series of pooled resources.

Micron sees two primary use cases for CXL 2.0: adding memory to a system to give a CPU extra capacity under heavy workloads, and supporting bandwidth-intensive workloads, since the PCIe spec is actually faster than memory slots. So, guess which workloads they have in mind?

“We’re seeing [interest] with AI training and inference, where these use cases are driving a much bigger memory footprint around the CPU,” said Baxter. He also cited more traditional uses, such as in-memory databases, as benefiting from CXL 2.0 memory capacity.

CXL has a lot of planned obsolescence built in. The 2.0 version is not backwards compatible with 1.1, and the 3.0 version will not be compatible with 2.0, since 3.0 uses the next generation of PCIe. Baxter doesn’t expect to see significant 2.0 use and product availability until at least next year, if not 2025.