
Amazon Web Services is building equipment to cool Nvidia GPUs as AI boom accelerates

The letters AI, which stands for "artificial intelligence," rise at the Amazon Web Services booth at the Hannover Messe industrial trade fair in Hannover, Germany, on March 31, 2025.

Julian Stratenschulte | Picture Alliance | Getty Images

Amazon said Wednesday that its cloud division has developed hardware to cool down next-generation Nvidia graphics processing units that are used for artificial intelligence workloads.

Nvidia's GPUs, which have powered the generative AI boom, require massive amounts of energy. That means companies using the processors need additional equipment to cool them down.

Amazon considered building data centers that could accommodate widespread liquid cooling to make the most of these power-hungry Nvidia GPUs. But that process would have taken too long, and commercially available equipment wouldn't have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube.

"They would take up too much data center floor space or increase water usage substantially," Brown said. "And while some of these solutions could work for lower volumes at other providers, they simply wouldn't be enough liquid-cooling capacity to support our scale."

Instead, Amazon engineers conceived of the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers. More traditional air cooling was sufficient for previous generations of Nvidia chips.

Customers can now access the AWS service as computing instances that go by the name P6e, Brown wrote in a blog post. The new systems accompany Nvidia's design for dense computing power. Nvidia's GB200 NVL72 packs a single rack with 72 Nvidia Blackwell GPUs that are wired together to train and run large AI models.

Computing clusters based on Nvidia's GB200 NVL72 have previously been available through Microsoft or CoreWeave. AWS is the world's largest supplier of cloud infrastructure.

Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and has designed its own storage servers and networking routers. By running homegrown hardware, Amazon relies less on third-party suppliers, which can benefit the company's bottom line. In the first quarter, AWS delivered the widest operating margin since at least 2014, and the unit is responsible for most of Amazon's net income.

Microsoft, the second-largest cloud provider, has followed Amazon's lead and made strides in chip development. In 2023, the company designed its own systems called Sidekicks to cool the Maia AI chips it developed.

WATCH: AWS announces latest CPU chip, will deliver record networking speed


