Intel Embraces Oil Immersion Cooling For Servers

Intel sees 20 percent annual growth for the technical computing market from 2011 to 2016, and that growth projection for the high-performance computing sector is driving its interest in alternative cooling designs. Will the appetite for ever more powerful computing clusters push users to new cooling technologies, such as submerging servers in liquid coolant? If so, Intel will be ready.

The Intel Embraces Oil Immersion Cooling For Servers posting on Slashdot reports that Intel has just concluded a year-long test in which it immersed servers in an oil bath, and has affirmed that the technology is highly efficient and safe for servers. The chipmaker is now working on reference designs, heat sinks and boards that are optimized for immersion cooling.

The Intel Gives Oil-Based Cooling Thumbs Up article reports that Intel has given its seal of approval to dunking a server full of electronic components in a bath of nonconductive dielectric oil, finding that it may be an ideal cooling solution. This approach can lower the PUE to an eye-catching 1.02. After a year’s immersion in the oil bath, all the hardware involved in the test—microprocessors, I/O chips, connectors, hard drives, and server housing—withstood the oil just fine.
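
For reference, PUE (power usage effectiveness) is the ratio of total facility power to the power delivered to the IT equipment, so 1.02 means only about 2 percent overhead for cooling and power distribution. Here is a minimal sketch of the arithmetic, using made-up facility figures rather than numbers from Intel’s test:

```python
# PUE = total facility power / IT equipment power.
# The kW figures below are illustrative only, not from Intel's test.

it_power_kw = 1000.0      # power drawn by the servers themselves
overhead_kw = 20.0        # pumps, power distribution losses, lighting, etc.

pue = (it_power_kw + overhead_kw) / it_power_kw
print(f"PUE = {pue:.2f}")  # -> PUE = 1.02
```

By comparison, a conventional air-cooled data center typically lands somewhere around PUE 1.5 to 2.0, meaning a large share of the facility’s power goes to moving and chilling air rather than to computing.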

Of course, this “new” way of cooling a server isn’t exactly novel. Power substations, for example, have used oil cooling to carry heat away from transformers for ages. Dunking heat-generating microprocessors and graphics cards in an oil bath has served as an alternative cooling solution for PC overclockers for many years, and early supercomputers also used liquid cooling.

The Data Center Knowledge article Intel Embraces Submerging Servers in Oil reports that Intel is optimizing its technology for servers immersed in oil, an approach that may soon see broader adoption in the high performance computing (HPC) sector. Mineral oil has been used in immersion cooling because it is not hazardous and transfers heat almost as well as water, but doesn’t conduct an electric charge. Mike Patterson, senior power and thermal architect at Intel, says that immersion cooling can change the way data centers are designed and operated; it can even eliminate the need for chillers and raised floors.

Austin-based Green Revolution Cooling says its liquid-filled enclosures can cool high-density server installations for a fraction of the cost of air cooling in traditional data centers. The company says its approach can produce large savings on infrastructure, allowing users to operate servers without a raised floor, computer room air conditioning (CRAC) units or chillers. Fluid temperature is maintained by a pump with a heat exchanger using a standard water loop.

In essence, the company’s product, called CarnotJet, houses servers in a specialized coolant oil that absorbs the heat they give off; the oil is then sent to a radiator where it is cooled before being recycled back into the server housing. Each 13U rack can handle between 6 kW and 8 kW of heat, depending on whether the heat pulled away from the servers is exchanged via a traditional radiator or a water loop. The company claims it can install 100 kW or more of compute in each 42U rack.

The downside is that mineral oil-style coolants can be messy to maintain: a little mineral oil spreads a long way. If you plan to minimize hands-on hardware maintenance and keep spare clothes at hand when working with the servers, the messiness might not be a very big issue compared to the potential gains.

20 Comments

  1. Tomi Engdahl says:

    How to cool a PC with toilet water
    http://www.extremetech.com/extreme/124677-how-to-cool-a-pc-with-toilet-water

    Hot on the heels of news that Google uses toilet water to cool one of its data centers, it has emerged that an enterprising hardware hacker had the same idea some seven years ago. As you will see in the following pictures, though, Jeff Gagnon’s computer is much more than a toilet-cooled rig — it’s a case mod tour de force.

    Then there’s the CPU waterblock, which has been handmade from a lump of copper and, I presume, an arc welder or a soldering iron. But where does that tubing go, I hear you ask? Where’s the water reservoir, the pump, the radiator?

    Well, it just so happens that there’s a toilet on the other side of the wall

  2. Tomi Engdahl says:

    Cool technology: Submerged blade servers escape the heat
    http://www.theregister.co.uk/2014/12/12/blade_cooling/

    Keeping servers cool is a challenge, even in a purpose-built data centre.

    Outdoor equipment in Canada often lives in horrible little metal boxes ironically called “sheds” that bear no resemblance to any structure so spacious.

    They are basically a full depth 12U 19in rack bolted to the side of some steel monstrosity made up of nightmares and solar absorption. Inside the box sit 4U of server, 4U of networking and 4U of heating, ventilation and air conditioning.

    With the chiller going flat out the temperature doesn’t drop below 60°C during the hot days, and stays around 50°C for about four months of the year.

    At first glance it would seem that the obvious solution to these problems is to replace the horrible little shed things with something better. That is far easier said than done.

    For reasons involving bureaucrats (and, I am convinced, demonic pacts) getting external enclosures certified for use with telecoms equipment, on oil pipelines or with various other utilities is way harder than it should be.

    Networking along 6,000 kilometres of oil pipeline is spotty at best so cloud computing is not an option,

    So, if the ovens we kindly refer to as shelters are not likely to get any cooler we need servers that can handle higher temperatures.

    The first answer that comes to mind is from LiquidCool Solutions, which does not manufacture IT equipment but licenses patents

    LiquidCool claims to be able to run servers “in ambient temperatures of 50°C or higher while maintaining core junction temperatures 30°C cooler than fan/air based cooling”. That has got my attention, yes indeedy.

    LiquidCool pitches “harsh environments” as a major use case

    LiquidCool licenses patents to build fully enclosed submerged server technology. Put your server in a box of oil, seal it and pump the oil out to a radiator. It is a great concept, but I have a few issues.

    Assuming that there is absolutely no possible way for the metal dust to get into the oil, then a completely sealed, submerged computer might be a solution. Except for the part where swapping computer components out is a job likely to be given to Welder McMetaldust.

    In June, HP announced its Apollo 8000 system at the HP Discover conference.

    Contemplate this for a moment. HP’s “dry disconnect” technology offers the efficiency of liquid cooling, and also the ability to hot-swap nodes from a chassis.

    Project Apollo is aimed at the high-performance computing market. It is designed to get more computers into a smaller space while using less power than the competition.

  3. Tomi Engdahl says:

    Aquarium Computer
    http://eleccelerator.com/aquarium-computer/

    The process of building a PC is pretty boring; it’s just an exercise in picking out compatible parts for the right price. I decided to make it slightly more interesting by submerging the entire computer in a fish tank full of mineral oil.

    This isn’t an entirely new idea (it’s even patented, so nobody can sell a kit); many people have already done this. Mineral oil is non-conductive, so the electronics will work perfectly fine while submerged.

    I have to be really careful with choosing materials. I’ve noted that some people who build mineral oil submerged computers experience no problems with any materials, while other people report that PVC will swell or harden. The PVC swelling is the cause of capacitors popping off circuit boards. More detailed research indicates that capacitors on older motherboards have a rubber seal that can fail, letting the hot electrolytic fluid inside bulge out; newer motherboards with “solid capacitors” should not fail. Flexible vinyl tubing (also PVC) can become hard and should not be operated at high temperatures. Rubber and neoprene must definitely be avoided as they will fail in mineral oil. PLA plastic (used by my 3D printer) can’t be used because PLA will warp even in hot water. ABS plastic (also used by my 3D printer) should be fine. Mineral oil is also used to remove many adhesives, so I obviously need to be careful with that as well.
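
    As a quick checklist, the material notes above boil down to roughly the following. This just restates the commenter’s anecdotal findings; it is not a tested compatibility reference:

    ```python
    # Material notes for mineral-oil immersion builds, condensed from the
    # comment above. Anecdotal, not a verified compatibility table.
    mineral_oil_notes = {
        "rigid PVC":               "reported to swell or harden over time",
        "vinyl tubing (PVC)":      "can harden; keep away from high temperatures",
        "older electrolytic caps": "rubber seals may fail; 'solid capacitor' boards hold up",
        "rubber / neoprene":       "avoid, fails in mineral oil",
        "PLA (3D printed)":        "avoid, warps even in hot water",
        "ABS (3D printed)":        "should be fine",
        "adhesives":               "mineral oil loosens many of them",
    }

    for material, note in mineral_oil_notes.items():
        print(f"{material:25s} {note}")
    ```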

  4. Tomi Engdahl says:

    Designing with liquid-immersion cooling systems
    http://www.csemag.com/single-article/designing-with-liquid-immersion-cooling-systems/245158e4ef137225b9ee9ccf2e54cd08.html?OCVALIDATE&ocid=101781

    Liquid cooling is an option in some data centers. Consider these best practices when looking at immersion cooling for your next data center project.

    In simple thermodynamic terms, heat transfer is the exchange of thermal energy from a system at a high temperature to one at lower temperature. In a data center, the information technology equipment (ITE) is the system at the higher temperature. The objective is to maintain the ITE at an acceptable temperature by transferring thermal energy in the most effective and efficient way, usually by expending the least amount of mechanical work.

    During steady-state operation, the thermal energy generated equals the rate at which it is transferred to the cooling medium flowing through its internal components. The flow rate requirement and the temperature envelope of the cooling medium is driven by the peak rate of thermal energy generated and the acceptable temperature internal to the ITE.

    For data centers, air-cooling systems have been the de facto standard. From the perspective of ITE, air cooling refers to the scenario where air must be supplied to the ITE for cooling. As the airflow requirement increases due to an increase in load, there is a corresponding increase in fan energy at two levels: the air distribution level (i.e., mechanical infrastructure such as air handling units, computer room air handlers, etc.) and the equipment level, because ITE has integral fans for air circulation.

    Strategies including aisle containment, cabinet chimneys, and in-row cooling units help improve effectiveness and satisfactorily cool the equipment. However, the fact remains that air has inferior thermal properties and its abilities are getting stretched to the limit as cabinet loads continue to increase with time. For loads typically exceeding 15 kW/cabinet, alternative cooling strategies, such as liquid cooling, have become worthy of consideration.

    Liquid cooling refers to a scenario where liquid (or coolant) must be supplied to the ITE. An IT cabinet is considered to be liquid-cooled if liquid, such as water, dielectric fluid, mineral oil, or refrigerant, is circulated to and from the cabinet or cabinet-mounted equipment for cooling. Several configurations are possible, depending on the boundary being considered (i.e., external or internal to the cabinet). For the same heat-transfer rate, the flow rate requirement for a liquid and the energy consumed by the pump are typically much lower than the flow rate requirement for air and the energy consumed by the fan system. This is primarily because the specific volume of a liquid is significantly lower than that of air.
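
    A rough back-of-the-envelope comparison illustrates the point. The sketch below assumes generic textbook property values (air at about 1.2 kg/m³ and 1.0 kJ/kg·K, mineral oil at about 850 kg/m³ and 1.9 kJ/kg·K); these are not figures from the article:

    ```python
    # Steady-state energy balance: Q = rho * V_dot * cp * dT,
    # so the required volumetric flow is V_dot = Q / (rho * cp * dT).
    # Property values are generic approximations, not from the article.

    def required_flow_m3_per_s(q_watts, rho_kg_m3, cp_j_per_kg_k, delta_t_k):
        """Volumetric flow needed to absorb q_watts at a given temperature rise."""
        return q_watts / (rho_kg_m3 * cp_j_per_kg_k * delta_t_k)

    q = 10_000.0   # a 10 kW cabinet
    dt = 10.0      # 10 K rise in the cooling medium

    air_flow = required_flow_m3_per_s(q, 1.2, 1005.0, dt)
    oil_flow = required_flow_m3_per_s(q, 850.0, 1900.0, dt)

    print(f"air: {air_flow:.2f} m^3/s (~{air_flow * 3600:.0f} m^3/h)")
    print(f"oil: {oil_flow * 1000:.2f} L/s")
    print(f"air needs ~{air_flow / oil_flow:.0f}x the volume flow of oil")
    ```

    For the same 10 kW and the same 10 K rise, the air path needs on the order of a thousand times the volumetric flow of the oil path, which is why the pump energy ends up so much lower than the fan energy.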

    For extreme load densities typically in excess of 50 to 75 kW/cabinet, the liquid should preferably be in direct contact with ITE internal components to transfer thermal energy effectively and maintain an acceptable internal temperature. This type of deployment is called liquid-immersion cooling and it is at the extreme end of the liquid cooling spectrum.

    The commercially available solutions can essentially be categorized into two configurations:

    1. Open/semi-open immersion. In this type of system, the ITE is immersed in a bath of liquid, such as dielectric fluid or mineral oil. The heat-transfer mechanism is vaporization, natural convection, forced convection, or a combination of vaporization and convection (see Figure 1).
    2. Sealed immersion. In this type of system, the ITE is sealed in liquid-tight enclosures and liquid, such as refrigerant, dielectric fluid, or mineral oil, is pumped through the enclosure. The heat-transfer mechanism is vaporization or forced convection, and the enclosure is typically under positive pressure

    For both types of systems, thermal energy can be transferred to the ambient by means of fluid coolers (dry or evaporative) or a condenser. It can also be transferred to facility water (chilled water, low-temperature hot water, or condenser water) by means of a heat exchanger.

    The liquid properties impact major facets of the design and should be reviewed in detail.

    The right solution?

    When dealing with extremely dense cabinets, immersion cooling is worthy of consideration. It is suitable for deployments ranging from a few kilowatts to several megawatts. Due to improved heat-transfer performance as compared with an air-cooling system, liquid-supply temperatures higher than 100° F are feasible. Higher liquid temperatures increase the hours of economization, offer the potential for heat recovery, and in certain climates can eliminate the need for chillers completely. The elimination of internal ITE fans reduces energy consumption and noise. In addition, pump energy for circulating liquid is typically lower than fan energy.
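
    To make the “hours of economization” point concrete, here is a tiny illustrative sketch: it counts how many hours a synthetic, made-up ambient temperature profile stays far enough below the liquid supply temperature for a dry cooler to carry the whole load without chillers. The 10°F approach temperature is an assumed figure, not one from the article:

    ```python
    # Free-cooling ("economizer") hours: hours where the ambient temperature is
    # low enough that a dry cooler can return liquid at the required supply
    # temperature. The hourly temperatures are synthetic; real studies use
    # actual weather data for the site.
    import random

    random.seed(0)
    hourly_ambient_f = [random.uniform(20.0, 110.0) for _ in range(8760)]  # fake year

    supply_temp_f = 100.0   # liquid supply temperature feasible with immersion
    approach_f = 10.0       # assumed dry-cooler approach temperature

    economizer_hours = sum(t <= supply_temp_f - approach_f for t in hourly_ambient_f)
    print(f"{economizer_hours} of 8760 hours need no mechanical cooling")
    ```

    With a liquid supply temperature of 100°F, most climates spend the large majority of the year below that threshold, which is where the chiller savings come from.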

    Despite the mechanical advantages, there are reasons for caution when deploying liquid-immersion cooling in data centers. The impact on infrastructure, such as structural, electrical, fire protection, and structured cabling, should be evaluated.

  5. Tomi Engdahl says:

    IEEE says zero hot air in Fujitsu liquid immersion cooling for data centers
    http://www.cablinginstall.com/articles/pt/2017/05/ieee-says-zero-hot-air-in-fujitsu-liquid-immersion-cooling-for-data-centers.html?cmpid=enl_cim_cimdatacenternewsletter_2017-05-23

    Given the prodigious heat generated by the trillions of transistors switching on and off 24 hours a day in data centers, air conditioning has become a major operating expense. Consequently, engineers have come up with several imaginative ways to ameliorate such costs, which can amount to a third or more of data center operations.
    One favored method is to set up hot and cold aisles of moving air through a center to achieve maximum cooling efficiency. Meanwhile, Facebook has chosen to set up a data center in Lulea, northern Sweden on the fringe of the Arctic Circle to take advantage of the natural cold conditions there; and Microsoft engineers have seriously proposed putting server farms under water.

    Fujitsu, on the other hand, is preparing to launch a less exotic solution: a liquid immersion cooling system it says will usher in a “next generation of ultra-dense data centers.”

    Fujitsu Liquid Immersion Not All Hot Air When It Comes to Cooling Data Centers
    http://spectrum.ieee.org/tech-talk/computing/hardware/fujitsu-liquid-immersion-not-all-hot-air-when-it-comes-to-cooling-data-centers

  6. Tomi Engdahl says:

    Researchers are developing ‘naked’ data centers: Report
    http://www.cablinginstall.com/articles/pt/2017/11/researchers-are-developing-naked-data-centers-report.html?cmpid=enl_cim_cim_data_center_newsletter_2017-11-14

    A new outdoor server farm concept that uses vats of liquid-cooled computers instead of buildings could be literally located in farmland. Servers would be stored in vats of cooling, non-conductive oil instead of elaborate, outfitted structures, say engineers who are working on a radical, building-free, data center concept.

    Researchers developing building-free data centers
    https://www.networkworld.com/article/3236472/data-center/researchers-developing-building-free-data-centers.html

    French company Horizon Computing is one of the developers behind the project and provides support. It proposes using stacks of 10-gallon barrels filled with Shell DIALA dielectric mineral oil or natural equivalent. Dielectric oil doesn’t have any water in it, so it won’t conduct electricity, but it cools just like water. The computers function as normal and aren’t subject to rust either.

    The idea is that common servers are fully submerged in the barrels where they are chilled by the immersion. Expensive humidity control and air conditioning thus become irrelevant, as do buildings.

    Horizon’s proposed outdoor cooling boxes have numerous other benefits, it explains on its website: The micro-ATX motherboard-containing, case-like pod can operate in an “extreme environment,” such as outdoors, and in “positive temperatures.”

  7. Tomi Engdahl says:

    Supply powers immersed computers
    https://www.edn.com/electronics-products/other/4459039/Supply-powers-immersed-computers

    The PT578 AC/DC switching power supply from Powerbox delivers 500 W of power for marine, offshore, and demanding industrial applications. Meeting the marine industry’s need to simplify logistics and reduce energy consumption, the single-output PT578 integrates programmed digital protection, built-in redundancy and paralleling circuitry, and active power factor correction.

    Power supply electronics are designed for immersed computing systems that require the power unit to operate safely within neutral fluid containers.

    A conformal coating protects the power supply against humidity and corrosion. The PT578 meets international marine requirements and complies with vibration specifications in the DNV-GL Table 7.

    The PT578 operates with input voltages of 90 VAC to 265 VAC and with DC bus voltages of 125 V to 375 V. Input AC frequency is 47 Hz to 63 Hz; 440 Hz for naval airborne with reduced PFC. The unit offers a choice of two adjustable output voltages of 24 VDC (23 V to 29 V) or 48 VDC (47 V to 56 V).

    PT578 power supplies cost $200 each in OEM quantities.

    https://www.prbx.com/product/pt578-series/

  8. Tomi Engdahl says:

    Submerge your power supply, and other options
    https://www.edn.com/design/power-management/4459203/Submerge-your-power-supply–and-other-options-?utm_content=buffer59da4&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer

    The requirements for marine power supplies are somewhat similar to those for immersible power supplies. The technology platform developed for the PT578 is suitable for immersed computing systems requiring the power unit to operate safely within neutral fluid containers, but the PT578 itself is designed only for convection cooling; it is protected against humidity and corrosion with a conformal coating and can withstand high shock and vibration levels.

    Enter immersion computing

    I consider this the best solution for cooling, especially for a data center. The immersed computing application was conceived in 2005 for data centers, as the ever increasing heat generated by high-speed processors became unmanageable via conventional means. The embedded computing industry, which addresses similarly demanding applications, has since adopted the technology developed for immersion cooling of data centers.

    Immersing power supplies in a neutral fluid seems easy and unproblematic (as demonstrated at CES 2017, see below for details), but in reality there are many challenges to address in order to secure long-term reliability (e.g. electrolytic effects due to AC switching currents and interactions between aluminum and other metals).

    Electronic components can now be safely immersed in this non-conductive fluid, which carries component heat away into the liquid. This reduces the need for heat sinks and fans as well as thermal interface material.

    At CES 2017, Gigabyte and 3M showed off their “underwater computer” demo with a functioning computer submerged in 3M Novec cooling liquid.

    Rack-mounted servers from any OEM, such as Supermicro, Nvidia, Dell, and others, are installed in special racks filled with a dielectric mineral oil blend called ElectroSafe, an electrical insulator with 1,200× more heat capacity by volume than air. The heat is transferred to the coolant, which is then pumped through a filter and a heat exchanger that passes the heat to a warm water loop feeding an evaporative cooling tower; the cooled oil then recirculates back to the servers.
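
    The “1,200× more heat capacity by volume than air” figure is easy to sanity-check with generic property values. The numbers below are textbook approximations for a mineral oil and for room-temperature air, not ElectroSafe’s published data:

    ```python
    # Volumetric heat capacity = density * specific heat, in J/(m^3*K).
    # Generic approximations, not vendor data for ElectroSafe.
    rho_air, cp_air = 1.2, 1005.0      # kg/m^3, J/(kg*K)
    rho_oil, cp_oil = 850.0, 1900.0    # typical mineral oil values

    ratio = (rho_oil * cp_oil) / (rho_air * cp_air)
    print(f"oil/air volumetric heat capacity ratio: {ratio:.0f}x")  # ~1300x
    ```

    That lands in the same ballpark as the quoted 1,200× marketing figure.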

    One of two techniques of immersion cooling is typically chosen: single phase or two phase.

    Liquid convection cooling is probably the next best solution after submersible cooling, but far better than air-cooled systems. Here is a pretty neat possibility to enhance standard convection cooling.

    LSDC density is measured in kW per server rack; present systems are in the range of about 30kW, and although 100kW may be achievable in the near future, whether this will be cost effective needs to be debated.

  9. Tomi Engdahl says:

    Oil-Cooled Raspberry Pi
    https://www.hackster.io/314reactor/oil-cooled-raspberry-pi-449a74

    An experiment to see if the Pi can be oil-cooled and how effective it is in the summer heat.

  10. Tomi Engdahl says:

    Oil & Electronics? the best way to cool electronics? (Experiment)
    https://www.youtube.com/watch?v=78WgdXpAVxw

    In this video I will show you why oil and electronics can sometimes represent a useful combination. Along the way I will perform a couple of experiments with the oil in order to determine its resistance as well as its cooling capacity. Let’s get started!

  11. Tomi Engdahl says:

    Microsoft is dunking servers into boiling liquid to keep them cool
    https://www.pcgamer.com/microsoft-is-dunking-servers-into-boiling-liquid-to-keep-them-cool/

    A unique two-phase immersion cooling strategy keeps these servers running at full speed.

  12. Tomi Engdahl says:

    Microsoft Dunks Servers Into Boiling Fluid to Cool Them Off
    https://www.extremetech.com/computing/321578-microsoft-dunks-servers-into-boiling-fluid-to-cool-them-off?utm_campaign=trueAnthem%3A+Manual&utm_medium=trueAnthem&utm_source=facebook

    Microsoft has been exploring innovative ways to cool its data center servers for some years now. The company previously made waves for its offshore data center cooling using seawater via its Project Natick. Now, it’s showing off a two-phase liquid cooling solution it says enables even higher server densities.

    The new system uses a non-conductive cooling fluid. Microsoft doesn’t precisely identify it, but it sounds similar to 3M’s Novec 1230, with a very low boiling point around 122F (Novec 1230 boils at 120.6F). Boiling off the coolant creates a vapor cloud, which rises and contacts a cooled condenser at the top of the tank lid. The liquid then rains back down into the closed-loop server chassis, resupplying the systems with freshly cooled coolant. Heat is also transferred from the server tank to a dry cooler outside the enclosure and dissipated there as well. Immersion cooling works because direct contact with a non-conducting fluid offers far better thermal dissipation than a conventional air cooler.
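
    In a two-phase bath like this, heat leaves the electronics mostly as latent heat of vaporization rather than as a temperature rise in the liquid. Here is a rough sketch of the resulting coolant turnover, assuming a latent heat of about 88 kJ/kg (roughly the published value for Novec 1230; treat it as an approximation, since Microsoft does not name its fluid):

    ```python
    # Two-phase immersion: the heat load is absorbed by boiling the coolant,
    # so the evaporation rate is heat load divided by the latent heat.
    # h_fg ~ 88 kJ/kg is an approximate figure for a Novec 1230 class fluid.
    H_FG_J_PER_KG = 88_000.0

    def boil_off_kg_per_hour(heat_load_watts):
        """Coolant mass boiled off per hour to absorb the given heat load."""
        return heat_load_watts / H_FG_J_PER_KG * 3600.0

    for load_kw in (1, 10, 50):
        print(f"{load_kw:3d} kW tank -> ~{boil_off_kg_per_hour(load_kw * 1000):.0f} kg/h boiled off")
    ```

    Because the vapor condenses on the lid and rains straight back into the tank, that mass simply cycles between liquid and vapor; nothing is consumed in normal operation.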

  14. Tomi Engdahl says:

    The world’s first liquid-cooled data center container
    https://etn.fi/index.php/13-news/12560-maailman-ensimmaeinen-nestejaeaehdytetty-datakeskuskontti

    Schneider Electric has unveiled a data center packed into a container, which it touts as the world’s first liquid-cooled, movable data center. The pre-installed modular data center container features a cooling solution integrated by Avnet and developed by Iceotope, in which the servers are immersed in coolant.

  15. Tomi Engdahl says:

    Schneider to sell Chilldyne direct-to-chip liquid cooling systems
    Adds another cooling option to its arsenal
    https://www.datacenterdynamics.com/en/news/schneider-to-sell-chilldyne-direct-to-chip-liquid-cooling-systems/

  16. Tomi Engdahl says:

    LiquidStack CEO on why you shouldn’t ignore immersion cooling
    Depending on use case, the efficiency gains can be significant
    https://www.theregister.com/2023/04/14/liquidstack_immersion_cooling/

  17. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    AWS details updates to its data centers, including liquid cooling for AI servers, more simplified electrical and mechanical designs for server racks, and more

    AWS bets on liquid cooling for its AI servers
    https://techcrunch.com/2024/12/02/aws-bets-on-liquid-cooling-for-its-ai-servers/
