Technology
10 November 2024

Revolutionary Cooling And Storage Technologies Transform AI Workloads

Hewlett Packard Enterprise and Samsung push boundaries with fanless cooling and ultra-high-capacity NAND chips

Recently, Hewlett Packard Enterprise (HPE) made waves at its AI Day 2024 event by introducing groundbreaking cooling technology aimed at improving efficiency for artificial intelligence (AI) workloads. The innovation is the industry's first fully fanless cooling architecture built on direct liquid cooling, setting the stage for more effective thermal management as AI systems evolve.

AI systems now draw more power than conventional air cooling can dissipate. This surge has led organizations running large-scale AI operations to seek advanced solutions for managing the energy requirements of their infrastructure. HPE has responded with its direct liquid cooling technology, now recognized as one of the most efficient approaches for cooling high-performance AI systems.

According to HPE, the new fanless cooling system delivers significant improvements across the board: cooling power consumption can be reduced by as much as 90% compared with traditional air-cooling solutions, which presents substantial environmental and financial benefits for businesses heavily invested in AI technologies.
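To get a feel for what a 90% cut in cooling power could mean in practice, the quick back-of-envelope calculation below is illustrative only: the rack cooling draw and electricity price are hypothetical assumptions, not figures published by HPE.

```python
# Back-of-envelope estimate of savings from a 90% reduction in cooling
# power. All inputs are hypothetical illustrations, not HPE figures.

AIR_COOLING_KW = 200.0   # assumed cooling draw of an air-cooled AI rack row
REDUCTION = 0.90         # 90% reduction claimed for direct liquid cooling
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.12     # assumed electricity price in USD

liquid_cooling_kw = AIR_COOLING_KW * (1 - REDUCTION)
saved_kwh = (AIR_COOLING_KW - liquid_cooling_kw) * HOURS_PER_YEAR
saved_usd = saved_kwh * PRICE_PER_KWH

print(f"Liquid cooling draw: {liquid_cooling_kw:.0f} kW")
print(f"Energy saved per year: {saved_kwh:,.0f} kWh")
print(f"Cost saved per year: ${saved_usd:,.0f}")
```

Even under these modest assumptions, the saved energy runs into the millions of kilowatt-hours per year, which is why the claim matters for data-center operators.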

The system comprises four foundational elements. The first is a comprehensive cooling design built around an 8-element system that manages heat for CPUs, GPUs, local storage, network fabric, server blades, and more. The design is intended to deliver optimal cooling without relying on fans, which are noisy and power-hungry.

The second key component focuses on high-density performance, supporting compact configurations backed by rigorous testing, monitoring software, and on-site services to ensure smooth deployments. Third, the system's integrated network fabric enhances large-scale connectivity while lowering costs and energy usage, which will appeal to environmentally conscious organizations. Finally, the cooling system's open architecture lets businesses choose the specific accelerators and solutions they require.

According to Antonio Neri, HPE's President and CEO, as companies move to capitalize on generative AI's possibilities, they also face pressing needs to bolster sustainability efforts, address rising energy demands, and reduce operational costs. He emphasized the architecture's energy efficiency, stating, "The architecture we unveiled today uses only liquid cooling, delivering greater energy and cost-efficiency advantages than the alternative solutions on the market. This direct liquid cooling architecture yields a 90% reduction in cooling power consumption as compared to traditional air-cooled systems."

While HPE's advancements take center stage, the broader memory-technology ecosystem is also undergoing transformative change. Samsung, another key player, is preparing its highly anticipated 400-layer NAND chip. Expected by 2026, the new chip signals the company's intent to push the limits of storage capacity for AI applications. It is touted as the technology that will break the 200TB barrier for ultra-large AI hyperscaler solid-state drives (SSDs).

The innovations surrounding Samsung's upcoming NAND technology are noteworthy. The company’s Device Solutions division has plans to utilize bonding technology to build memory cells and the supporting circuitry separately before merging them, which minimizes heat retention and enhances performance. This cutting-edge technology, referred to as bonding vertical NAND (BV NAND), will significantly augment data density per unit area, making it ideal for AI applications.

Samsung's push for high-capacity storage doesn't stop at the 400-layer V10 chip; the company has even loftier ambitions, aiming for chips with more than 1,000 layers by 2030. This enhanced capacity is particularly pertinent as AI data centers need to process vast amounts of information quickly and efficiently. The current 286-layer V9 NAND chips have already marked major progress, but Samsung's new models promise to set new records.
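Treating die density as roughly proportional to layer count gives a quick sense of the headroom Samsung is chasing. This is a deliberate simplification: real density gains also depend on cell size, bits per cell, and the bonding improvements of BV NAND, so layer count alone understates the jump. Only the layer figures quoted above go into the arithmetic.

```python
# Back-of-envelope scaling of 3D NAND density with layer count.
# Simplified model: density assumed proportional to layers; actual
# gains also come from cell design and BV NAND bonding.

V9_LAYERS = 286        # current V9 NAND generation
V10_LAYERS = 400       # upcoming V10 generation
ROADMAP_LAYERS = 1000  # Samsung's stated 2030 ambition

v10_scale = V10_LAYERS / V9_LAYERS
roadmap_scale = ROADMAP_LAYERS / V9_LAYERS

print(f"V9 -> V10 layer scaling: {v10_scale:.2f}x")        # ~1.40x
print(f"V9 -> 1,000-layer scaling: {roadmap_scale:.2f}x")  # ~3.50x
```

The roughly 1.4x step from V9 to V10, compounded with per-area density gains from bonding, is what underpins the projected jump past the 200TB SSD mark.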

This ambitious roadmap isn’t solely confined to NAND technology; Samsung is also setting its sight on next-generation DRAM releases. They are expected to unfurl sixth-generation 1c DRAM and seventh-generation 1d DRAM by the end of 2024, targeting applications for high-performance AI chips. The goal is to pair capacity with performance to meet the relentless demands of cutting-edge AI technologies.

Both HPE and Samsung are spearheading shifts within the tech space, anticipating and meeting the soaring demands of AI workloads as they seek to balance performance with efficiency. Their efforts aim not just at improving technology for the present but paving the way for future innovations where cooling technologies and massive storage capacities work harmoniously to support the fast-increasing power needs of AI.

Advancements such as fanless cooling systems and high-capacity NAND chips mark just the beginning of what could be considered the dawn of new technology. With the current trajectories, stakeholders and consumers can look forward to more sustainable solutions and exceptional operational efficiencies across data centers globally.