Last updated: April 26, 2026
Quick Answer: Direct-to-chip and immersion cooling are liquid-based thermal management technologies that remove heat directly from high-power AI accelerators, achieving Power Usage Effectiveness (PUE) as low as 1.02–1.15 compared to 1.4–1.6 for traditional air cooling. These systems handle 40 to 200+ kW per rack, making them the only viable path for deploying dense AI workloads at the edge. As of 2026, most new hyperscale builds specify liquid cooling as a baseline requirement.
Air cooling fails modern AI accelerators because the heat density per rack has grown faster than fan technology can compensate. A single NVIDIA H100 GPU can draw over 700 W; a fully loaded rack of AI accelerators can exceed 100 kW, while traditional air-cooled racks top out around 10–15 kW.
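The density gap is easy to make concrete with a back-of-the-envelope calculation. The sketch below assumes an illustrative GPU count and a ~30% overhead factor for CPUs, memory, NICs, and power-conversion losses; those numbers are assumptions, not measured figures.

```python
def rack_heat_load_kw(gpus_per_rack: int, watts_per_gpu: float,
                      overhead_factor: float = 1.3) -> float:
    """Estimate total rack heat load in kW.

    overhead_factor covers everything beyond the accelerators
    themselves (CPUs, memory, NICs, PSU losses) -- assumed ~30% here.
    """
    return gpus_per_rack * watts_per_gpu * overhead_factor / 1000.0


# Upper end of what a typical air-cooled rack can dissipate (from the text).
AIR_COOLING_LIMIT_KW = 15.0

# Hypothetical dense rack: 8 servers x 8 H100-class GPUs at ~700 W each.
load = rack_heat_load_kw(64, 700)
print(f"Estimated rack load: {load:.1f} kW "
      f"({load / AIR_COOLING_LIMIT_KW:.1f}x the air-cooling limit)")
```

Even this mid-range configuration lands several times beyond what air can remove, before accounting for next-generation accelerators that draw more per chip.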
This gap is exactly why direct-to-chip and immersion cooling, the thermal management solutions enabling edge AI infrastructure, have shifted from niche options to mainstream requirements. The physics are straightforward: water conducts heat roughly 25 times more efficiently than air, and the dielectric fluids used in immersion systems are even more effective at absorbing heat from submerged components.
The edge makes this harder, not easier. Edge AI nodes sit in telecom facilities, factory floors, and retail back rooms, where space is tight and there’s no room for raised floors, precision air conditioning units, or large cold aisles. Liquid cooling solves both the density problem and the footprint problem at once.
“Most new hyperscale data center builds announced in 2025 and 2026 now specify DLC-ready infrastructure as a base requirement.” [4]

Direct-to-chip (DTC) cooling attaches a metal cold plate directly to the surface of a processor or GPU, circulating chilled liquid through the plate to pull heat away at the source. The liquid then carries that heat to a Coolant Distribution Unit (CDU), which transfers it to a facility water loop or external heat exchanger.
| Component | Function |
|---|---|
| Cold plate | Mounts on chip; skived fin structure maximizes surface area for heat transfer |
| Coolant Distribution Unit (CDU) | Manages coolant flow, pressure, and temperature across multiple racks |
| Manifold/quick-connect fittings | Route coolant to each cold plate within a rack |
| Inverter-driven pumps | Adjust flow rate dynamically to match thermal load |
| Sensors and controls | Monitor inlet/outlet temperatures and flag anomalies |
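The pump behavior in the table, inverter-driven speed tracking thermal load, can be sketched as a simple proportional controller. The setpoint, gain, and speed floor below are illustrative assumptions; a production CDU would use a tuned PID loop with safety interlocks.

```python
def pump_speed_pct(outlet_temp_c: float,
                   setpoint_c: float = 45.0,
                   gain_pct_per_c: float = 8.0,
                   min_pct: float = 20.0,
                   max_pct: float = 100.0) -> float:
    """Proportional pump-speed control for a CDU coolant loop.

    Speed rises as coolant outlet temperature exceeds the setpoint;
    a minimum speed keeps coolant circulating even at idle load.
    """
    error = outlet_temp_c - setpoint_c
    speed = min_pct + gain_pct_per_c * error
    # Clamp to the inverter's operating range.
    return max(min_pct, min(max_pct, speed))


# At setpoint the pump idles at its floor; 5 C over drives it up;
# a large excursion saturates at full speed.
print(pump_speed_pct(45.0))  # 20.0
print(pump_speed_pct(50.0))  # 60.0
print(pump_speed_pct(60.0))  # 100.0
```

The energy advantage of inverter-driven pumps comes from exactly this: running at 20% speed during low load instead of a fixed 100%, since pump power scales steeply with speed.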
LG Electronics, for example, is presenting a 1.4 MW CDU at Data Center World 2026 with compact design, inverter-driven pumps, and integrated sensing for stable, energy-efficient operation [1][2]. The cold plates in LG’s lineup use skived fin structures that increase internal surface area, improving heat transfer without increasing the plate’s external footprint.
Choose DTC cooling if: the facility already has water infrastructure, the budget is constrained, or the deployment mixes AI and standard compute in the same rack. DTC can be retrofitted onto existing rack designs more easily than immersion.
Common mistake: Undersizing the CDU for future rack density growth. Always spec the CDU at 20–30% above current peak load to accommodate next-generation accelerators.
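Sizing against that mistake is a one-line calculation. The 20–30% headroom range comes from the guidance above; the example load is a hypothetical figure.

```python
def cdu_min_capacity_kw(current_peak_kw: float,
                        headroom: float = 0.25) -> float:
    """Minimum CDU capacity with 20-30% headroom (default 25%).

    Round the result up to the next standard CDU size when ordering.
    """
    if not 0.20 <= headroom <= 0.30:
        raise ValueError("headroom should be in the 0.20-0.30 range")
    return current_peak_kw * (1.0 + headroom)


# Hypothetical row of racks peaking at 800 kW today:
print(cdu_min_capacity_kw(800))  # 1000.0
```

At 25% headroom, an 800 kW peak calls for at least a 1 MW CDU, which is one reason megawatt-class units like the 1.4 MW example below are becoming standard building blocks.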
Immersion cooling submerges entire server boards in a thermally conductive dielectric fluid inside sealed tanks, eliminating the need for fans entirely and allowing heat to transfer directly from every component surface into the fluid. This is fundamentally different from DTC, which only cools the primary chip and still relies on airflow for other components.
There are two main variants: single-phase immersion, where the fluid stays liquid and carries heat away by circulation to a heat exchanger, and two-phase immersion, where the fluid boils at hot component surfaces and condenses on a cooling coil, moving more heat per unit of fluid.
LG is expanding its immersion portfolio through a collaboration with Green Revolution Cooling (GRC), a U.S.-based immersion cooling specialist, offering tank systems that submerge IT equipment directly in dielectric liquid [1][2]. LG has also jointly developed immersion cooling fluids with SK Enmove, addressing the critical fluid chemistry component that determines long-term component compatibility and thermal performance [1].
Edge case to watch: Two-phase systems require careful management of fluid vapor pressure and containment. In edge environments with variable ambient temperatures, phase-change dynamics can become unpredictable without proper enclosure design.

The right choice depends on rack density targets, capital budget, and whether the facility can support tank infrastructure. Neither technology is universally superior; each fits a different operational profile.
| Factor | Direct-to-Chip (DTC) | Single-Phase Immersion | Two-Phase Immersion |
|---|---|---|---|
| Rack density | 40–100+ kW | 100–200 kW | 200–300+ kW |
| PUE | 1.10–1.15 | 1.04–1.06 | 1.02–1.04 |
| Capital cost premium | Moderate | High (+$1–2M/MW vs. DTC) [4] | Highest |
| Retrofit complexity | Low–moderate | High | High |
| Hardware lifespan impact | Moderate improvement | Up to 40% longer vs. air [7] | Up to 40% longer vs. air [7] |
| Fan elimination | Partial | Full | Full |
| Fluid management | Minimal | Moderate | Complex |
Choose DTC if: rack density is under 100 kW, budget is constrained, or a retrofit into an existing facility is required.
Choose single-phase immersion if: density targets exceed 100 kW per rack, long-term hardware longevity is a priority, and capital budget allows for tank infrastructure.
Choose two-phase immersion if: deploying cutting-edge AI training clusters at maximum density and operational teams have the expertise to manage phase-change fluid systems.
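The three "choose if" rules above can be encoded as a small decision helper. The thresholds mirror this section's guidance; treat it as a sketch of the selection logic, not a sizing tool.

```python
def recommend_cooling(rack_density_kw: float,
                      budget_constrained: bool,
                      retrofit: bool,
                      phase_change_expertise: bool) -> str:
    """Map this article's selection criteria to a cooling technology."""
    # DTC: under 100 kW/rack, constrained budget, or retrofit required.
    if rack_density_kw < 100 or budget_constrained or retrofit:
        return "direct-to-chip"
    # Two-phase: maximum density plus the team to manage phase change.
    if rack_density_kw >= 200 and phase_change_expertise:
        return "two-phase immersion"
    # Otherwise single-phase immersion covers the 100-200 kW band.
    return "single-phase immersion"


print(recommend_cooling(60, False, False, False))    # direct-to-chip
print(recommend_cooling(150, False, False, False))   # single-phase immersion
print(recommend_cooling(250, False, False, True))    # two-phase immersion
```

Note the ordering: budget and retrofit constraints override density in this sketch, matching the article's framing that DTC is the pragmatic default when either applies.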
Edge AI nodes face a unique combination of high compute density and constrained physical space, making liquid cooling not just beneficial but often the only workable option. A factory automation system running real-time inference on a GPU cluster can’t rely on a raised-floor data center with precision air conditioning.
Key reasons liquid cooling is essential at the edge:

- Rack densities of 40–200+ kW are far beyond the 10–15 kW that air-cooled racks can handle
- Edge sites lack the space for raised floors, precision air conditioning units, and cold aisles
- Full immersion eliminates fans entirely, which matters in noise-sensitive locations like retail back rooms
- A smaller cooling footprint leaves more of the constrained floor space for compute
A 2026 surge in liquid cooling adoption is specifically noted for CDU-based systems and direct-to-chip configurations that enable efficient coolant distribution at scale in distributed deployments [3].
The capital cost of liquid cooling is higher than air cooling upfront, but total cost of ownership often favors liquid cooling over a 5-year horizon when energy savings and hardware longevity are factored in. The gap between DTC and immersion is significant and should drive the technology selection process.
Direct-to-chip systems:

- Moderate capital cost premium over air cooling; existing racks and facility water loops can often be reused
- Partial fan elimination only, so some fan energy and maintenance cost remains
- Low-to-moderate retrofit complexity

Immersion systems:

- High capital premium, roughly $1–2M per MW above DTC for single-phase [4]
- Full fan elimination and up to 40% longer hardware lifespan versus air cooling [7]
- Tank infrastructure and ongoing fluid management add operational cost
Common mistake: Evaluating only capital cost without modeling energy savings over the deployment lifetime. At $0.08–$0.12/kWh for industrial power, a 30% reduction in cooling energy for a 500 kW edge cluster saves roughly $100,000–$150,000 annually (estimate based on 8,760 operating hours and stated PUE improvement range).
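That savings estimate can be reproduced directly from the PUE figures cited elsewhere in this article. The sketch below assumes air cooling at PUE 1.4 (the low end of the stated 1.4–1.6 range) and DTC at 1.12 (mid-range of 1.10–1.15); constant IT load over 8,760 hours is a simplification.

```python
def annual_cooling_savings_usd(it_load_kw: float,
                               pue_before: float,
                               pue_after: float,
                               price_per_kwh: float,
                               hours: float = 8760.0) -> float:
    """Annual savings from a PUE improvement at constant IT load.

    Total facility energy is IT load x PUE, so the saved energy is
    the IT load times the PUE delta times operating hours.
    """
    delta_kwh = it_load_kw * (pue_before - pue_after) * hours
    return delta_kwh * price_per_kwh


# 500 kW edge cluster, air (PUE 1.4) -> DTC (PUE 1.12),
# at industrial power prices of $0.08-$0.12/kWh:
low = annual_cooling_savings_usd(500, 1.4, 1.12, 0.08)
high = annual_cooling_savings_usd(500, 1.4, 1.12, 0.12)
print(f"${low:,.0f} - ${high:,.0f} per year")  # roughly $98,000 - $147,000
```

The result lands squarely in the article's $100,000–$150,000 range, and the same function makes it easy to rerun the model with local power prices and actual duty cycles.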
2026 marks a significant maturation point for liquid cooling, with major OEMs moving from pilot programs to full product lines. The global data center liquid cooling market reached $16.16 billion in 2026 [8], and vendor activity reflects that scale.
LG Electronics is presenting a comprehensive DTC and immersion cooling portfolio at Data Center World 2026, including:

- A 1.4 MW CDU with compact design, inverter-driven pumps, and integrated sensing [1][2]
- Cold plates with skived fin structures that maximize heat-transfer surface area
- Immersion tank systems offered through the collaboration with Green Revolution Cooling (GRC) [1][2]
- Immersion cooling fluids jointly developed with SK Enmove [1]
This end-to-end approach, covering hardware, tanks, and fluid chemistry, signals that the market is consolidating around integrated solution providers rather than component specialists.
Industry-wide: Most new hyperscale data center builds announced in 2025 and 2026 specify DLC-ready infrastructure as a baseline requirement [4], which is accelerating supply chain development for cold plates, CDUs, and dielectric fluids.

Liquid cooling introduces new failure modes and operational requirements that air-cooled facilities don’t face. Teams deploying these systems need to plan for fluid management, leak detection, and staff training.
Edge case: In cold climates, coolant return temperatures can fall below the dew point of the air inside the facility, causing condensation on piping and inside CDUs. Proper controls and insulation are required for edge deployments in northern facilities.
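A quick condensation check uses the Magnus approximation for dew point. The coefficients are standard for this formula; the example room conditions and coolant temperatures are illustrative assumptions.

```python
import math


def dew_point_c(air_temp_c: float, rel_humidity: float) -> float:
    """Dew point via the Magnus approximation (valid roughly 0-60 C).

    rel_humidity is a fraction in (0, 1].
    """
    a, b = 17.62, 243.12
    gamma = (a * air_temp_c) / (b + air_temp_c) + math.log(rel_humidity)
    return (b * gamma) / (a - gamma)


def condensation_risk(coolant_return_c: float,
                      room_temp_c: float,
                      rel_humidity: float) -> bool:
    """True if cold coolant surfaces sit below the room's dew point."""
    return coolant_return_c < dew_point_c(room_temp_c, rel_humidity)


# A 25 C room at 60% RH has a dew point near 16.7 C, so a 14 C
# facility-water return would sweat without insulation; a 20 C
# return would not.
print(condensation_risk(14.0, 25.0, 0.60))  # True
print(condensation_risk(20.0, 25.0, 0.60))  # False
```

In practice the same check belongs in the CDU control logic: raise the coolant setpoint or alarm when return temperature approaches the measured dew point.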
Q: What is the minimum rack density that justifies direct-to-chip cooling? A: Direct-to-chip cooling becomes cost-justified when rack density exceeds roughly 20–30 kW per rack, where air cooling requires increasingly expensive precision cooling infrastructure. At 40+ kW, DTC is generally the more economical long-term choice.
Q: Can immersion cooling be used for standard IT equipment, not just AI accelerators? A: Yes, but the economics rarely justify it for standard servers below 10–15 kW per rack. Immersion cooling is most cost-effective for high-density AI, HPC, and GPU workloads where the density and energy savings justify the capital investment.
Q: What dielectric fluid is used in immersion cooling? A: Common options include engineered fluids from 3M (Novec), Engineered Fluids, and specialty lubricant providers like SK Enmove (which collaborates with LG). Fluid selection affects thermal performance, component compatibility, and long-term operating costs [1].
Q: Does immersion cooling void server warranties? A: It depends on the vendor. Some OEMs now offer immersion-ready server configurations with compatible materials. Others still restrict warranty coverage for immersion-deployed hardware. Always verify warranty terms before deployment.
Q: What PUE can a well-designed edge AI facility achieve with liquid cooling? A: A DTC-cooled edge facility can realistically achieve PUE of 1.10–1.15. Single-phase immersion can reach 1.04–1.06. Two-phase immersion can reach 1.02–1.04 in optimized deployments [4][7].
Q: How does liquid cooling affect server noise levels? A: DTC cooling reduces but doesn’t eliminate fan noise, since fans still cool other components. Full immersion cooling eliminates fans entirely, making it nearly silent, which is a significant advantage for edge deployments in noise-sensitive environments.
Q: Is direct-to-chip cooling compatible with existing data center water infrastructure? A: Generally yes, with modifications. Most CDUs connect to a facility chilled water or cooling tower loop. Water temperature requirements vary; many DTC systems accept facility water at 18–25°C, which is achievable with standard cooling infrastructure.
Q: What is a Coolant Distribution Unit (CDU) and why does it matter? A: A CDU is the central hub that manages coolant flow, pressure, and temperature for a liquid cooling system. It connects the facility water loop to the server-side coolant loop. A well-designed CDU, like LG’s 1.4 MW unit with inverter-driven pumps, is critical for energy efficiency and stable operation across variable AI workloads [1][2].
Q: How long does it take to deploy a direct-to-chip cooling system? A: For a greenfield edge AI cluster, DTC deployment typically takes 4β12 weeks depending on facility readiness, CDU sizing, and cold plate customization. Retrofits into existing racks take longer due to design and compatibility work.
Q: Are there environmental benefits beyond energy efficiency? A: Yes. Lower PUE means less total energy consumption and a smaller carbon footprint. Immersion cooling also eliminates the need for refrigerant-based precision air conditioning, removing a significant source of greenhouse gas risk from equipment leaks.
Direct-to-chip and immersion cooling are no longer emerging technologies for edge AI infrastructure. They are production-proven systems that the industry is adopting at scale, with the global liquid cooling market at $16.16 billion in 2026 [8] and hyperscalers treating DLC-readiness as a baseline requirement [4].
For teams planning or upgrading edge AI deployments, here are concrete next steps:

- Audit current and projected rack density; air cooling stops being economical above roughly 20–30 kW per rack
- Spec CDUs at 20–30% above current peak load to absorb next-generation accelerators
- Model total cost of ownership over a 5-year horizon, including energy savings and hardware longevity, not just capital cost
- Verify vendor warranty terms for liquid-cooled or immersion-deployed hardware before committing
The thermal challenge of edge AI is not going away. Every new generation of AI chips runs hotter and denser than the last. Getting the cooling infrastructure right now is the foundation for every AI workload that follows.
[1] LG Electronics Showcases AI Data Center Cooling Solutions at Data Center World 2026 – https://www.lg.com/global/newsroom/news/eco-solution/lg-electronics-showcases-ai-data-center-cooling-solutions-at-data-center-world-2026/
[2] LG Electronics Brings End-to-End AI Data Center Cooling to Data Center World 2026 – https://www.blackridgeresearch.com/news-releases/lg-electronics-brings-end-to-end-ai-data-center-cooling-to-data-center-world-2026
[3] Data Center Trends: Cooling Strategies to Watch in 2026 – https://airsysnorthamerica.com/data-center-trends-cooling-strategies-to-watch-in-2026/
[4] Data Center Cooling Technology 2026 – https://build.inc/insights/data-center-cooling-technology-2026
[7] Immersion-First Data Centers 2026 Architecture – https://techbytes.app/posts/immersion-first-data-centers-2026-architecture/
[8] Data Center Liquid Cooling Market Report 2026: $16.16 Bn Opportunities, Trends, Competitive Landscape, Strategies and Forecasts 2020–2025, 2025–2030F, 2035F – https://www.globenewswire.com/news-release/2026/02/04/3232076/0/en/data-center-liquid-cooling-market-report-2026-16-16-bn-opportunities-trends-competitive-landscape-strategies-and-forecasts-2020-2025-2025-2030f-2035f.html