Memory expansion modules from Micron comply with Compute Express Link 2.0, which promises new security features and far more versatility than previous versions.

Micron has introduced memory expansion modules that support the 2.0 generation of Compute Express Link (CXL) and come with up to 256GB of DRAM running over a PCIe x8 interface.

CXL is an open interconnect standard with broad industry support, designed to let machines share the contents of memory directly. It is built on top of PCI Express (PCIe) to provide coherent memory access between a CPU and a device, such as a hardware accelerator, or between a CPU and memory. PCIe is normally used for point-to-point communication, such as between an SSD and memory; CXL will eventually support one-to-many communication, but so far it is capable of simple point-to-point communication only.

Development of the CXL standard began in early 2019, but it has only recently come to market because it required both a faster PCIe bus and native support from CPU vendors Intel and AMD. Only their most recent CPUs support it.

For now, the initial applications revolve around attaching DRAM to a PCIe interface. That’s what Micron is offering with its CZ120 memory expansion modules. The modules are available in 128GB and 256GB capacities, quite large for a memory module, and use a dual-channel memory architecture capable of delivering a maximum of 36GB/s of bandwidth.

Ryan Baxter, senior director of the data center segment at Micron, said security features were paramount in this release. “There are a lot of security features in 2.0 that don’t exist or are not supported in 1.1,” Baxter said. Security is important when you have servers talking to each other. The CXL 2.0 standard supports any-to-any communication encryption through hardware acceleration built into the CXL controllers, which means silicon providers do not have to build encryption into their own hardware.
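The 36GB/s figure can be sanity-checked against the raw PCIe link rate. A rough sketch follows, assuming the CZ120's x8 interface runs at PCIe 5.0 speeds (32 GT/s per lane with 128b/130b line encoding, protocol overhead ignored); since a single direction tops out near 31.5GB/s under these assumptions, the quoted 36GB/s presumably reflects mixed read-and-write traffic rather than a single direction:

```python
# Back-of-the-envelope PCIe 5.0 x8 bandwidth estimate.
# Assumptions (not from the article): 32 GT/s per lane, 128b/130b encoding.
GT_PER_S = 32e9          # PCIe 5.0 raw transfer rate per lane (transfers/sec)
ENCODING = 128 / 130     # 128b/130b line-encoding efficiency
LANES = 8                # x8 link, as on the CZ120

per_lane_GBps = GT_PER_S * ENCODING / 8 / 1e9   # bits -> bytes, -> GB/s
one_direction = per_lane_GBps * LANES
bidirectional = one_direction * 2

print(f"per direction:  {one_direction:.1f} GB/s")
print(f"bidirectional:  {bidirectional:.1f} GB/s")
```

By this math, one direction delivers roughly 31.5GB/s and both directions combined roughly 63GB/s, which brackets the 36GB/s Micron quotes.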
Baxter said that a lack of security is why customers testing and deploying CXL 1.1 tended to experiment only with lower-capacity memory. Some customers may have deployed CXL 1.1 in internal workloads, but many avoided doing anything really ambitious while they waited for 2.0, he said.

CXL 2.0 will also support persistent memory (PMEM), which retains data like NAND flash but is much faster, approaching DRAM speeds. CXL 2.0 enables distinct PMEM support as part of a series of pooled resources.

Micron sees two primary use cases for CXL 2.0: adding memory to a system to give a CPU extra capacity under heavy workloads, and supporting bandwidth-intensive workloads, since PCIe can actually deliver more bandwidth than standard memory slots. So, guess which workloads they have in mind?

“We’re seeing [interest] with AI training and inference, where these use cases are driving a much bigger memory footprint around the CPU,” said Baxter. He also cited more traditional uses, such as in-memory databases, as benefiting from CXL 2.0 memory capacity.

The CXL roadmap also churns quickly: features introduced in 2.0 are not available on 1.1 hardware, and the 3.0 version will move to the next generation of PCIe. Baxter doesn’t expect to see significant 2.0 use and product availability until at least next year, if not 2025.