Microsoft is known to get creative when it comes to data center cooling. Back in 2016, the company released details about Project Natick, an initiative to dissipate data center heat by submerging servers at the bottom of the ocean. Last year, Microsoft demonstrated the system's efficacy by tasking Project Natick's 864 servers with COVID-19 vaccine research.
While undersea data centers are certainly effective for thermal management, they aren't a scalable option for many cloud providers. Meanwhile, traditional air cooling methods are falling flat as Moore's law drives up chip power densities and computing demands intensify.
While computing companies have a number of cooling options at their disposal—for instance, cold plates and heat sinks—Microsoft has turned to another established cooling technique called immersion cooling.
Microsoft employee extracts a server blade from an immersion cooling tank. Image used courtesy of Gene Twedt and Microsoft
Two-phase Immersion Cooling
Microsoft has tapped two-phase liquid immersion cooling—specifically for its remote data centers.
The technique works by immersing the electronic components in a specially engineered dielectric liquid. The liquid is harmless to the electronics, and its boiling point is significantly lower than that of water (~60°C compared to 100°C). As the components heat up, they transfer their heat into the surrounding liquid, which eventually boils, carrying the heat away as vapor.
The two-phase immersion cooling cycle. Image used courtesy of Gigabyte
This vapor rises to the top of the chamber, where it is cooled, often by a condenser coil. The cooled vapor then returns to the liquid phase, falling back into the vat like raindrops and completing a continuous cycle of cooling.
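The cycle described above can be reduced to a simple steady-state heat balance: every watt the servers dissipate must leave the tank as latent heat carried by boiling fluid. The sketch below illustrates this, using assumed, order-of-magnitude figures (a 2 kW server blade and a latent heat of ~100 kJ/kg) rather than any vendor's actual fluid specifications.

```python
# Back-of-the-envelope model of the two-phase cycle: at steady state,
# all server heat leaves the tank as latent heat of vaporization,
# and the condenser returns the same mass of fluid per second.
# Fluid properties below are illustrative assumptions, not vendor specs.

def vapor_mass_flow(heat_load_w: float, latent_heat_j_per_kg: float) -> float:
    """Mass of dielectric fluid (kg/s) boiled off to absorb heat_load_w."""
    return heat_load_w / latent_heat_j_per_kg

# Assumed values: a 2 kW server blade and a fluid with a latent heat of
# roughly 100 kJ/kg (a plausible order of magnitude for engineered
# dielectric fluids; water, by comparison, is about 2,260 kJ/kg).
flow = vapor_mass_flow(2_000, 100_000)
print(f"{flow * 1000:.0f} g of fluid boils off per second")  # → 20 g
```

The low latent heat relative to water is one reason the fluid churns visibly: a substantial mass of liquid must boil off each second to keep up with the heat load.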
While companies like Submer and LiquidStack have been using two-phase liquid immersion cooling for years now, Microsoft is the first big computing player to implement this technique in a production environment.
Advantages to Immersion Cooling
Microsoft was drawn to this cooling method thanks to its many advantages in data center applications.
Immersion cooling is significantly more efficient at heat transfer than conventional air cooling, offering up to 90% energy savings. This is because the engineered liquids move heat far more effectively than air, allowing the same heat load to be removed with much less energy.
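One common way to frame such savings is power usage effectiveness (PUE), the ratio of total facility power to IT power. The sketch below compares cooling overhead under assumed PUE values (1.5 for a typical air-cooled facility, 1.05 for an immersion-cooled one); these are illustrative figures, not measured Microsoft numbers.

```python
# Illustrative comparison of cooling overhead using Power Usage
# Effectiveness (PUE = total facility power / IT power). The PUE
# values below are assumptions for the example, not measured data.

def cooling_overhead_kw(it_load_kw: float, pue: float) -> float:
    """Non-IT (mostly cooling) power for a given IT load and PUE."""
    return it_load_kw * (pue - 1)

it_load = 1_000  # 1 MW of servers (assumed)
air = cooling_overhead_kw(it_load, 1.5)         # assumed air-cooled PUE
immersion = cooling_overhead_kw(it_load, 1.05)  # assumed immersion PUE

savings = (air - immersion) / air
print(f"Overhead drops from {air:.0f} kW to {immersion:.0f} kW "
      f"({savings:.0%} less cooling energy)")  # → 500 kW to 50 kW (90%)
```

Under these assumptions, the cooling overhead falls by 90%, which is the order of savings cited for immersion cooling.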
Another result of this improved efficiency is that the data centers can afford to become even denser without concerns of overheating.
Boiling liquid carries away heat created by computer servers. Image used courtesy of Microsoft
The cooling loop itself is also passive: like a self-enclosed ecosystem, the liquid continually boils, condenses, and rains back down in an ongoing cycle. A passive loop requires little maintenance and upkeep, saving time and money, and it can be deployed in remote locations.
Microsoft Brings the Sea to the Servers
Microsoft’s decision to implement two-phase immersion cooling is backed by research and experimentation. After implementing the system in one of its own data centers, Microsoft found that server power consumption dropped by 5% to 15%. The company’s hope is that this method will make data centers more self-sufficient while lowering failure rates and maintenance requirements.
As a result of the power savings and improved cooling capacity, Microsoft hopes to create more densely packed servers. This could improve computing capabilities, specifically by extending data centers’ ability to serve more users at a given time.
Comparing the performance improvements of immersion cooling to the success of Project Natick, Husam Alissa, a principal hardware engineer on Microsoft’s data center team, explains, “We brought the sea to the servers rather than put the datacenter under the sea.”
Project Natick servers had one-eighth the failure rate compared to a replica server in a land data center. Image used courtesy of Microsoft
Plans to Scale Immersion Cooling
Microsoft currently has one tank running workloads in a hyperscale data center. Over the coming months, the company plans to perform tests proving the technique's viability, with the hope of eventually deploying it at scale.