Passive Thermal Computing Infrastructure

The data center
has no building.
The river is the cooling system.

Submera submerges GPU compute hardware directly into natural water bodies using passive thermodynamics to reject heat with zero mechanical cooling. No pumps. No fans. No chillers. Just physics.

~1.0
PUE target
57°F
validated cooling delta
Zero
moving cooling parts
~4 ft²
land footprint
$26B
cooling market (2025)
The Problem

AI compute is hitting a thermal wall.

Modern GPU racks generate up to 120 kW per rack, six times what air cooling can handle. The industry spends billions of dollars, millions of gallons of freshwater, and acres of real estate just managing heat. Cooling has become the bottleneck of the entire AI infrastructure stack.

120 kW

Per rack, GB200 NVL72

Air cooling physically maxes out at 15–25 kW per rack. The newest GPU systems generate 5–8× that load. Air cooling is not a future option; it is already obsolete at the frontier.

1.58×

Industry average PUE

For every watt going to compute, 0.58 watts are wasted on cooling overhead. At hyperscale, that represents billions of dollars in electricity consumed annually without doing any useful work.
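The PUE arithmetic above can be sketched directly. The fleet size and electricity price below are illustrative assumptions, not figures from this document; only the 1.58 PUE is from the text.

```python
def cooling_overhead_w(it_load_w: float, pue: float) -> float:
    """Watts of cooling/facility overhead for a given IT load.

    PUE = total facility power / IT power, so overhead = IT * (PUE - 1).
    """
    return it_load_w * (pue - 1.0)

# Industry-average PUE of 1.58: every watt of compute drags 0.58 W of overhead.
overhead = cooling_overhead_w(1.0, 1.58)

# Hypothetical 100 MW hyperscale fleet at an assumed $0.08/kWh (illustrative).
overhead_mw = cooling_overhead_w(100, 1.58)          # 58 MW of pure overhead
annual_cost = overhead_mw * 1000 * 24 * 365 * 0.08   # kWh * $/kWh
print(f"{overhead_mw:.0f} MW overhead, ~${annual_cost/1e6:.0f}M/yr")
```

At these assumed numbers a single 100 MW fleet burns roughly $41M per year on overhead alone, which is how the "billions at hyperscale" claim compounds across operators.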

40M gal

Water per year, 10MW facility

Evaporative cooling towers consume tens of millions of gallons of freshwater annually per data center. Regulators and communities in water-stressed regions are pushing back hard.

$26B

Cooling market (2025)

The data center cooling market is growing at a 22.3% CAGR and is expected to reach $128B by 2033. The opportunity is not to optimize existing cooling but to reinvent it entirely.

The Solution

Passive thermal siphon. Zero moving parts.

Submera places sealed aluminum enclosures containing GPU hardware directly into a natural water body. Heat moves from hardware to dielectric fluid to aluminum wall to river water entirely by conduction and natural convection. No mechanical system is required at any step.

01

6061-T6 aluminum enclosure

Precision-fabricated, hermetically sealed aluminum housing. Aluminum conducts heat at 167 W/m·K, among the highest of any structural metal. The enclosure wall is the heat exchanger. No additional components needed.
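A quick Fourier's-law sketch shows why the wall alone can be the heat exchanger. The wall area, thickness, and temperature drop below are illustrative assumptions; only the 167 W/m·K conductivity comes from the text.

```python
def wall_heat_flow_w(k: float, area_m2: float, thickness_m: float, delta_t_k: float) -> float:
    """Q = k * A * dT / t  (one-dimensional steady-state conduction)."""
    return k * area_m2 * delta_t_k / thickness_m

K_6061_T6 = 167.0  # W/m·K, thermal conductivity of 6061-T6 aluminum (from the text)

# Hypothetical 1 m² of wall, 10 mm thick, with only a 5 K drop across it:
q = wall_heat_flow_w(K_6061_T6, area_m2=1.0, thickness_m=0.010, delta_t_k=5.0)
print(f"{q/1000:.1f} kW")
```

Even with a temperature difference of just 5 K, a single square meter of 10 mm aluminum passes roughly 83 kW, far beyond what one enclosure dissipates, so conduction through the wall is never the bottleneck.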

02

EDM250 dielectric fluid

Hardware is bathed in EDM250 dielectric fluid, which is electrically inert, non-corrosive, and safe for direct contact with all server components. Natural convection circulates the fluid inside the enclosure, carrying heat from GPUs to the walls continuously without any pump.

03

Natural water body as heat sink

A river, lake, reservoir, or ocean at 40–65°F acts as an effectively infinite heat sink. No matter how many watts the GPUs generate, the water continuously absorbs and carries that energy away. Cooling capacity scales with water flow, not with mechanical infrastructure.
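The "effectively infinite" claim comes down to the sensible-heat capacity of the passing water. The flow cross-section below is an illustrative assumption; the 2–4 mph range is from the text.

```python
RHO_WATER = 1000.0   # kg/m³
CP_WATER = 4186.0    # J/(kg·K), specific heat of water

def absorbed_power_w(cross_section_m2: float, velocity_m_s: float, temp_rise_k: float) -> float:
    """Q = m_dot * cp * dT, with mass flow m_dot = rho * A * v."""
    m_dot = RHO_WATER * cross_section_m2 * velocity_m_s
    return m_dot * CP_WATER * temp_rise_k

# A hypothetical 1 m² slice of river at ~1 m/s (about 2.2 mph, the low end of
# the 2–4 mph target range), warmed by only 0.1 K:
q = absorbed_power_w(1.0, 1.0, 0.1)
print(f"{q/1000:.0f} kW")
```

A barely measurable 0.1 K rise across one square meter of flow already absorbs over 400 kW, which is why cooling capacity scales with water flow rather than mechanical infrastructure.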

04

Armored umbilical to shore

Power and fiber run through a single armored conduit from the submerged frame to shore infrastructure. The deployed node is operationally identical to a rack-mounted GPU server: remotely monitored, fully accessible, and invisible to the software stack above it.

What Submera eliminates

Pumps & CDUs
Server fans
Chillers
Facility HVAC
Cooling towers
Water treatment systems
PFAS / refrigerant exposure
The building itself
1.00
Theoretical PUE, zero cooling overhead
How It Works

From riverbank to live compute node.

A single Submera deployment site requires one post in the ground, one arm over the water, and one umbilical to shore. That is the entire physical infrastructure required to bring 20 GPU compute units online.

Step 01

Site survey

A stretch of river or lake shoreline is evaluated for depth (8–15 ft target), flow rate (2–4 mph), riverbed stability, and year-round water temperature. Submera targets sites where ambient water temperature stays below 65°F, which covers the vast majority of U.S. inland waterways.
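The survey criteria above can be sketched as a simple qualification check. The thresholds come straight from the text; the function itself is an illustrative sketch, not a Submera tool.

```python
def site_qualifies(depth_ft: float, flow_mph: float, max_annual_water_temp_f: float) -> bool:
    """True if a shoreline site meets the stated survey targets:
    8-15 ft depth, 2-4 mph flow, year-round water temperature below 65°F."""
    return (8 <= depth_ft <= 15
            and 2 <= flow_mph <= 4
            and max_annual_water_temp_f < 65)

print(site_qualifies(depth_ft=12, flow_mph=3, max_annual_water_temp_f=62))  # True
print(site_qualifies(depth_ft=12, flow_mph=3, max_annual_water_temp_f=70))  # False
```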

Step 02

Post installation

A single 4×4 steel post is driven 20 ft into the riverbank with a concrete footing and extends 15 ft above ground. This is the only permanent land-side structure. Approximately 4 square feet of land footprint. No building permit, no facility construction, no HVAC infrastructure.

Step 03

Cantilever arm

A triangulated obtuse-angle steel arm extends 15 ft over the water from the post. A diagonal brace from the post top to the arm tip provides structural rigidity under full load. The arm connects to a float collar that slides freely up and down the main post as water levels change.

Step 04

H-frame deployment

An H-shaped steel frame hangs from the arm tip with Rail A and Rail B running parallel, each carrying 10 Submera units suspended 1–2 ft below the river surface. All 20 units are positioned directly over the water, fully submerged, continuously cooled.

Step 05

Float system

Pairs of 55-gallon air-filled drums at the tip and midpoint of each rail keep the entire frame at a fixed depth relative to the water surface. As the river rises or falls seasonally by 8 ft or more, the float system compensates automatically with zero mechanical input or manual adjustment.
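An Archimedes sketch of the float budget. The drum count (a pair at the tip and midpoint of each of the two rails, eight total) and full submersion are illustrative assumptions; the 55-gallon drum size is from the text.

```python
GAL_TO_M3 = 0.003785  # US gallons to cubic meters
RHO_WATER = 1000.0    # kg/m³
G = 9.81              # m/s²

def drum_buoyancy_n(volume_gal: float) -> float:
    """Maximum buoyant force of one fully submerged air-filled drum (F = rho*g*V)."""
    return RHO_WATER * G * volume_gal * GAL_TO_M3

per_drum = drum_buoyancy_n(55)        # ~2040 N, supports ~208 kg per drum
total_kg = 8 * per_drum / G           # assumed 8 drums across both rails
print(f"{per_drum:.0f} N per drum, ~{total_kg:.0f} kg total support")
```

Under these assumptions the floats carry on the order of 1.6 tonnes, and because buoyancy is a function of displaced volume alone, depth-tracking requires no mechanical input as the river rises or falls.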

Step 06

Online in under an hour

An armored umbilical carries power and fiber from shore to the submerged frame. Temperatures begin falling within minutes of submersion. Validated testing showed a 50–57°F temperature reduction and full thermal equilibrium in under one hour at 300W sustained load.
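A lumped-capacitance sketch suggests why submersion equilibrates within the hour while an air-cooled baseline keeps climbing: the time constant is tau = m·cp / (h·A). All the values below (fluid mass, wetted area, convection coefficients) are illustrative assumptions, not measured Submera data.

```python
def time_constant_s(mass_kg: float, cp_j_kg_k: float, h_w_m2_k: float, area_m2: float) -> float:
    """Lumped-capacitance thermal time constant: tau = m*cp / (h*A)."""
    return mass_kg * cp_j_kg_k / (h_w_m2_k * area_m2)

MASS, CP, AREA = 50.0, 2000.0, 1.5   # hypothetical dielectric fill and wall area

tau_water = time_constant_s(MASS, CP, h_w_m2_k=500.0, area_m2=AREA)  # flowing water
tau_air = time_constant_s(MASS, CP, h_w_m2_k=10.0, area_m2=AREA)     # still air

# A system is essentially settled after ~5 time constants.
print(f"water: ~{5*tau_water/60:.0f} min, air: ~{5*tau_air/3600:.1f} h")
```

With water-side convection roughly 50× stronger than still air, settling time drops from many hours to minutes, consistent with the under-one-hour equilibrium reported in the validation test.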

Deployment Visualization

The structure, live simulation.

Explore the deployment geometry and float mechanism interactively. The water level control shows how the buoy system keeps all 20 units at consistent submersion depth regardless of river conditions. One post on land. Twenty compute units in the river.

In development

Submera is actively developing a floating platform deployment method, a modular pontoon-based structure designed for open lake and river locations where fixed shoreline anchoring is not feasible. This approach will expand our deployable surface area significantly, enabling high-density compute clusters positioned directly over deep water with no land footprint whatsoever.

Where It Deploys

Any water body. Any waterfront.

Submera is not tied to a specific geography. Anywhere there is a natural or managed water body with sufficient depth and flow, Submera can deploy. The U.S. alone has over 250,000 rivers and streams and thousands of lakes, representing an enormous untapped deployment surface.

Rivers & streams

Flowing water provides a continuously refreshed heat sink with no recirculation required. Moderate flow rates of 2–4 mph are ideal, enough to exchange heat at the enclosure surface without requiring heavy anchoring. The Blue River corridor in Kansas City is Submera's initial target deployment zone.

flowing heat exchange · continuous refresh · 2–4 mph ideal

Lakes & reservoirs

Still water bodies with sufficient depth provide stable thermal stratification. Submera's proof-of-concept validation was conducted at Hamilton Lake Reservoir in January 2025, achieving 57°F temperature reduction at 300W sustained load with zero mechanical cooling assistance.

thermal stratification · stable environment · validated Jan 2025

Industrial waterways

Managed channels, cooling ponds, and industrial waterways adjacent to power infrastructure offer ideal co-location opportunities. Proximity to grid power combined with continuous water flow makes industrial waterways highly attractive for high-density GPU deployments.

grid-adjacent · managed depth · high density potential

Coastal & international

Ocean-adjacent deployments and international markets in power-constrained regions, particularly across Africa and Southeast Asia, represent significant long-term opportunity. Water-cooled compute at near-zero PUE is especially compelling where grid power is expensive or unreliable.

global applicability · emerging markets · power-constrained fit
250,000+

Rivers and streams in the United States alone, plus thousands of lakes, reservoirs, and managed waterways. Submera's deployment surface is not constrained by real estate availability. It is constrained only by proximity to water.

Validation Data

Proof of concept. Real numbers.

Submera's V1 system, a Dell PowerEdge R610 in a custom-fabricated sealed aluminum enclosure, was submerged at Hamilton Lake Reservoir in January 2025. The results validated the core passive thermal siphon hypothesis with measured data across all sensor positions.

50–57°F

Temperature reduction at every sensor position (bulk fluid, enclosure wall, and lid) compared to the air-cooled sealed baseline at the same 300W sustained load.

Hamilton Lake, Jan 2025
< 1 hr

Time to reach full thermal equilibrium after submersion. The air-cooled baseline never reached equilibrium; temperatures were still rising when the test was terminated.

300W sustained load
1.00

Effective PUE during submerged operation. Zero mechanical cooling of any kind was used at any point in the test. All heat rejection was passive and thermodynamic.

Zero mechanical assist
Measurement           | Test 1: Sealed / Air | Test 2: Lake Submerged | Delta
Ambient temp          | 58°F air             | 40°F water             | n/a
Bulk fluid temp       | 110°F (rising ↑)     | 60°F (stable ✓)        | −50°F
Enclosure wall temp   | 95°F (rising ↑)      | 46°F (stable ✓)        | −49°F
Lid temp              | 107°F (rising ↑)     | 50°F (stable ✓)        | −57°F
Thermal equilibrium   | NOT reached          | < 1 hour ✓             | n/a
Mechanical cooling    | None                 | None                   | Same
Competitive Landscape

Every competitor still needs active components.

The immersion cooling market is real and growing: Trane Technologies, an ~$85B enterprise-value company, acquired LiquidStack in February 2026, validating the category. But every existing solution still requires pumps, CDUs, and mechanical infrastructure. Submera is the only architecture that eliminates them entirely.

Category         | Submer     | LiquidStack | GRC        | Direct-to-Chip | Submera
Pumps / CDUs     | Yes        | Yes (CDU)   | Yes (CDU)  | Yes (CDU)      | NONE
Fans required    | No         | No          | No         | YES (partial)  | NONE
Facility cooling | CDU + loop | CDU + tower | CDU + loop | CDU + CRAC     | NONE
PUE              | 1.03–1.05  | 1.02–1.03   | 1.02–1.05  | 1.03–1.10      | 1.005–1.04
CapEx / kW       | $3K–$8K    | $5K–$12K    | $3K–$6K    | $1.5K–$4K      | ~$1K–$2K
PFAS exposure    | None       | HIGH        | None       | None           | ZERO
Water use        | Varies     | Varies      | Varies     | Varies         | ZERO
Moving parts     | Multiple   | Multiple    | Multiple   | Multiple       | ZERO
Market signal: Trane Technologies (~$85B enterprise value) acquired LiquidStack in February 2026, directly validating the immersion cooling category. Schneider Electric acquired Motivair for ~$850M in October 2024. Samsung acquired FläktGroup for $1.5B in May 2025 to enter the space. The largest industrial companies in the world are buying their way into this market. Submera is building the passive alternative that none of them have.
Market Opportunity

A $26B market growing at 22% CAGR.

The data center cooling market is one of the fastest-growing infrastructure segments in the world, driven directly by AI compute density growth. Submera is positioned at the intersection of immersion cooling adoption and passive architecture, a category with no current direct competitors.

TAM | Global DC cooling          | 22.3% CAGR     | $26.3B → $128B (2025 → 2033)
SAM | Liquid cooling (all types) | 20.1% CAGR     | $6.65B → $29.5B (2025 → 2033)
SOM | Submera reachable (5-yr)   | Years 1–5 build | $4.75M → $62.5M revenue projection

Market validation signals

Trane Technologies (~$85B enterprise value) acquired LiquidStack, February 2026

Schneider Electric acquired Motivair for ~$850M, October 2024

Samsung acquired FläktGroup for $1.5B to enter DC cooling, May 2025

AWS launched proprietary liquid cooling for GPU racks, July 2025

Top 4 hyperscalers: $290B+ combined AI/DC CapEx in 2025

AI rack density now 40–120 kW; air cooling physically insufficient at the frontier

Revenue Model

Three paths to revenue. One technology.

Submera's business model is sequenced for capital efficiency: the same hardware that validates the technology generates the first revenue, funds the next build, and establishes the track record for product sales and IP licensing at scale.

01
Compute-as-a-Service
Near-term / validation revenue

Deploy Submera units in the Blue River corridor and rent GPU compute time to enterprise AI teams, researchers, and inference providers. Because our cooling costs nothing to run, our operating expenses are a fraction of those of traditional data centers, allowing us to offer competitive pricing while maintaining strong margins.

  • Rent GPU compute time to enterprise AI teams, researchers, and inference providers
  • Lower operating costs mean we can undercut market pricing while keeping stronger margins
  • Electricity is the only real ongoing cost; cooling is free
02
Product Sales
Primary growth driver

Sell Submera enclosures and deployment kits to data center operators, edge compute providers, defense contractors, and HPC research facilities. Any operator located near water eliminates their entire cooling CapEx and OpEx by deploying Submera.

  • 5–10 units/yr (early) → 100+/yr (at scale)
  • Target: DC operators, edge, defense, research
03
IP Licensing
Long-term / high margin

License the passive thermal siphon architecture to OEM manufacturers and hyperscalers. Non-provisional patent filing by November 2026 establishes the IP foundation for licensing conversations with Dell, HPE, and the hyperscaler ecosystem.

  • Potential licensees: Dell, HPE, hyperscalers, cooling OEMs
  • Per-unit royalties + annual licensing fees
  • Capital-light revenue at scale
Build V2 → Rent compute → Generate revenue → Scale to 4 units → Begin product sales → IP licensing → Exit / acquisition

Capturing just 1–2% of immersion cooling as it grows to $4–7B by 2030 represents $40–140M in revenue opportunity at Submera's cost structure.
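The capture arithmetic above, spelled out. The market sizes and share percentages come from the text; the calculation is simply share × market.

```python
def capture_m(market_b: float, share_pct: float) -> float:
    """Revenue opportunity in $M for a market size in $B and a share in %."""
    return market_b * 1000 * share_pct / 100

low = capture_m(4.0, 1.0)    # 1% of a $4B market
high = capture_m(7.0, 2.0)   # 2% of a $7B market
print(f"${low:.0f}M to ${high:.0f}M")  # the $40M to $140M range cited above
```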

Milestones & Roadmap

18–24 months from funded to commercial.

Submera's execution plan is sequenced to reach GPU-class thermal validation, first compute revenue, and patent filing within 18 months of funding close, with product sales conversations beginning in month 16.

Phase 1

Hardware & Build

Months 1–3

Procure 4× H100 PCIe GPUs and CNC tooling. Fabricate V2 enclosure with precision thermal interface. Seal and fill with dielectric fluid.

Phase 2

Test & Validate

Months 4–9

Deploy at Blue River corridor. Run thermal validation at 1.5–2.5 kW GPU-class load. Collect 90 days of production temperature data.

Phase 3

IP / Patent

Months 7–12

File non-provisional patent with measured validation data. November 2026 is the provisional patent conversion deadline. CNC in-house capability protects fabrication IP.

Phase 4

Commercial Revenue

Months 9–15

V2 transitions from validation to production. First compute rental revenue. Scale to 4 units. Begin product sales pipeline conversations via Schneider network.

Phase 5

Product Sales

Months 15–24

6 months of production data collected. Product sales pitch ready. IP licensing conversations initiated. First external customer deployments targeted.

M1

Funding close → hardware procurement begins immediately

M3

V2 enclosure assembled, sealed, and fluid-filled

M4

Water body deployment, thermal testing begins

M9

Thermal validation complete at 1.5–2.5 kW GPU load

M10

First compute revenue, V2 transitions to production node

M12

Non-provisional patent filed (Nov 2026 deadline)

M16

6 months production data, product sales pitch ready

M24

First product sales / IP licensing conversations active

Get In Touch

Ready to learn more?

Submera is actively working to reshape how the world thinks about data centers, moving compute out of massive, power-hungry buildings and into the natural environment around us. Efficient by design. Out of sight by nature. If that mission resonates with you, we'd love to connect.

Send an email Read our Substack