
AI drives power consumption, Micron drives power efficiency

Larry Hart | May 2024

AI, AI, AI. I’m not trying to draw a bunch of clicks with word-stuffing, but it’s difficult in the technology world not to hear that two-letter acronym repeated over and over in meetings, press briefings, media articles, and even in discussions with family. Is it largely bluster and hype? My friend and corporate vice president, Jeremy Werner, contends that it’s not, and he predicts various ways in which it will prove valuable to all of us. (Read 2024 predictions in storage, technology, and the world, part 1 and part 2.) 

Despite Jeremy’s compelling narrative, some still scoff at the promises of AI. But take a moment to reflect on the many technologies of the past 20 years that have transformed how we live and work. We use search engines almost every day to find important (and sometimes unimportant) information. Smartphones are so smart that we rarely use them as phones anymore. Who drives anywhere new without a GPS map? Raise your hand if you use GPS to get home by the fastest route. Need something from the store? Many of us don’t bother going when we can order it and have it delivered straight to our house. We’ve largely pivoted from watching TV series and movies on linear TV (what many call “live TV”) to streaming services like Netflix. We’re now in control of what we watch rather than relying on the networks to serve us whatever they want. Social media lets us stay connected with friends and family no matter where they are. The list of transformative technologies that have improved our lives is long. What a world we live in!

Like these, AI is likely to deliver on many of its promises, especially if we consider its future impact over the next 20 years. Even if I’m wrong on this, one thing is certain: our customers have embraced AI as a core tenet of their technology ecosystem. The global AI server market is expected to grow at a 25% compound annual growth rate (CAGR) in revenue from 2024 to 2029.1 Our customers are continuously scaling their AI workloads and expanding their AI use cases, which is driving this increase in AI server purchases. 
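For a sense of what that growth rate implies, here is a minimal back-of-the-envelope sketch; it simply compounds the cited 25% CAGR across the five-year forecast window and assumes the rate holds throughout.

```python
# Compounding the cited 25% revenue CAGR over the 2024-2029 forecast window.
cagr = 0.25
years = 5  # 2024 through 2029
growth_multiple = (1 + cagr) ** years
print(f"Implied revenue multiple over {years} years: {growth_multiple:.2f}x")  # ~3.05x
```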

One of the fundamental components of an AI server solution is the GPU or accelerator. These GPUs consume a tremendous amount of power. For example, an NVIDIA H200 GPU can consume up to 700 watts of power. If you have eight H200 GPUs in a server, that’s 5,600 watts. Consider an SSD with a max power consumption of 25 watts. Even with 24 SSDs in a server, the total max power consumption is only 600 watts.  

I know. I know. I’m comparing max power consumption, not typical consumption. But the comparison clearly shows that the GPUs in a system can theoretically consume about 10 times more power than the SSDs. The same generally holds true for memory, CPUs and DPUs when compared to GPUs. Customers now tell us that the power needed to run AI workloads on their servers has become a critical consideration. 
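To make that comparison concrete, here is a minimal back-of-the-envelope sketch using the max-power figures quoted above; these are rated maximums, not measured typical draw.

```python
# Max-rated power budgets in a single AI server, using the figures quoted above.
GPU_MAX_WATTS = 700   # NVIDIA H200, max rated power
SSD_MAX_WATTS = 25    # example SSD max power draw

gpu_budget = GPU_MAX_WATTS * 8    # eight GPUs -> 5,600 W
ssd_budget = SSD_MAX_WATTS * 24   # twenty-four SSDs -> 600 W

print(f"GPU max power budget: {gpu_budget:,} W")
print(f"SSD max power budget: {ssd_budget:,} W")
print(f"GPUs can draw roughly {gpu_budget / ssd_budget:.1f}x the SSD budget")  # ~9.3x
```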

Power consumption is a critical infrastructure consideration

Putting GPUs in systems that serve more users and more content seems like an obvious recipe for consuming more power. And it is, but power consumption for data centers has remained surprisingly constant as a percentage of global electricity use. As the International Energy Agency (IEA) notes:

Demand for digital services is growing rapidly. Since 2010, the number of internet users worldwide has more than doubled, while global internet traffic has expanded 20-fold. 

Rapid improvements in energy efficiency have helped limit growth from data centres and data transmission networks, which each account for 1-1.5% of global electricity use.2

Wait. Double the internet users and a 20-fold increase in traffic, yet each accounts for just 1% to 1.5% of global electricity use? Why has that percentage held steady after all these years? In part because Micron continues to push the boundaries of power efficiency. We recently announced that NVIDIA is using our HBM3E memory in its H200 Tensor Core GPU.3 According to NVIDIA, its new H200 GPU uses half the power of its predecessor for large language model (LLM) calculations. Much of that reduction is due to our efforts to bring power efficiency to power-hungry devices.

 

Source: NVIDIA H200 Tensor Core GPU datasheet

Micron is at the forefront of sustainability

At Micron, we are committed to a sustainable future for all, making us a leader in reducing environmental impacts through more power-efficient and more performant AI solutions. Here are a few examples:

  • Micron HBM3E delivers tremendously improved power efficiency, with approximately 30% lower power consumption than competitive offerings.3 It also provides 1.2 terabytes per second of bandwidth.
  • Micron DDR5 delivers up to 48% lower power consumption in AI inference. Impressively, it also offers 28% faster training time.4
  • Micron 6500 ION SSD delivers power savings along with longer life. It consumes up to 20% less power than competitor QLC SSDs to reduce operating costs, and it delivers up to 10 times more 4KB random write endurance than competitive QLC SSDs.5
  • Micron 7450 SSD delivers twice the throughput of the 7300 SSD at approximately the same power consumption, yielding a 50% improvement in power efficiency (watts per input/output operation per second [IOPS]).6
  • Micron 9400 SSD delivers up to 77% better power efficiency than its predecessor,7 as the short calculation after this list illustrates.
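As a rough illustration of where figures like these come from, the sketch below recomputes two of them from numbers already published here: the 9400’s IOPS-per-watt gain uses the values in footnote 7, and the 7450’s gain simply follows from doubling throughput at roughly the same power. This is arithmetic on the quoted figures, not a benchmark.

```python
# Recomputing two of the efficiency claims above from the published figures.

# Micron 9400 vs. prior-generation 9300 (7.68TB, 4K random read; footnote 7)
iops_per_watt_9400 = 94_118
iops_per_watt_9300 = 53_100
gain_9400 = iops_per_watt_9400 / iops_per_watt_9300 - 1
print(f"9400 vs. 9300 efficiency gain: {gain_9400:.0%}")  # ~77%

# Micron 7450 vs. 7300: roughly 2x the throughput at about the same power
# means watts per IOPS drop by about half.
relative_throughput = 2.0   # 7450 throughput relative to 7300
relative_power = 1.0        # approximately unchanged power draw
watts_per_iops_change = relative_power / relative_throughput - 1
print(f"7450 vs. 7300 watts-per-IOPS change: {watts_per_iops_change:.0%}")  # -50%
```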

We take pride in raising the bar on every new product we deliver and ensuring that we’re doing our part to reduce power consumption. We also refuse to compromise on performance, security and other features. Why? Because we believe in delivering the best technologies while focusing on sustainability. As Micron CEO Sanjay Mehrotra said, “… you’ll see that sustainability is not just central to Micron’s vision, mission and values, it is also integral to our long-term strategic plans. We believe we also have a responsibility to help lead sustainability improvements across our industry.”8 Well said, Sanjay! 

AI will bring many opportunities and benefits to humanity, but at Micron, we won’t take our eyes off our sustainability efforts. It’s integral to all that we do because we take our impact on our society and our planet seriously. 

1Global artificial intelligence server market (2023 edition): Analysis by value and unit shipment, server type (data, training, inference, others), AI server infrastructure, hardware architecture, end-use, by region, by country: Market insights and forecast (2019-2029) | Research and Markets | researchandmarkets.com
2Data centres and data transmission networks | International Energy Agency | iea.org
3Micron commences volume production of industry-leading HBM3E solution to accelerate the growth of AI | Micron Technology | micron.com
4128GB_DDR5 RDIMM product brief | Micron Technology | micron.com
5Micron scales storage to new heights with launch of two data center drives | Micron Technology | micron.com
6Steady state as defined by SNIA Solid State Storage Performance Test Specification Enterprise v1.1; Drive write cache enabled; NVMe power state 0; Sequential workloads measured using a flexible input/output (FIO) with a queue depth of 32
7Comparison of 7.68TB SSDs — Micron 9400 SSD: 94,118 4K random read IOPS/watt vs. 53,100 IOPS/watt for prior-generation Micron 9300 NVMe SSD
8A message from our CEO | 2023 Micron Sustainability Report | Micron Technology | micron.com

Larry Hart

Sr. Director, Solutions Marketing

As the Senior Director of Solutions Marketing for Micron’s Storage Business Unit, Larry Hart is deeply committed to creating and marketing impactful technology solutions. With a multifaceted background spanning pricing, product marketing, outbound marketing, product management, and ecosystem development, he leads our strategic efforts to drive better technological alignment within our ecosystem, communicate our solutions in the voice of our customers, and deliver maximum total business value.