Did you even read the second part of my comment before getting mad?
yeah, i did. you wrote:
So it should be easy enough to build them in locations that have easy access to cheap energy and large amounts of water
if you think it should be easy enough, what is your explanation for why datacenters are continuing to be built in locations where they’re competing with agriculture, other industries, and/or residential demand for scarce water resources (as you can read about in the links in my previous comment)?
Something that people need to understand is that AI companies (let’s talk about them instead of “AIs”, which have no agency) are in a race to use less energy and less water per request for a very simple and selfish reason: it costs money.
Datacenter operators put their energy efficiency forward not out of environmental concern, or even for greenwashing, but because it signals “We have lower costs. We are efficient.”
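For reference, the efficiency figures operators advertise are usually the industry-standard PUE and WUE metrics. A minimal sketch of both (the facility numbers below are hypothetical, purely for illustration):

```python
# Toy illustration of the two standard datacenter efficiency metrics:
#   PUE = total facility energy / IT equipment energy   (ideal: 1.0)
#   WUE = water consumed / IT equipment energy          (liters per kWh)
# All numbers below are hypothetical.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: overhead multiplier on IT energy."""
    return total_facility_kwh / it_kwh

def wue(liters_consumed: float, it_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water per kWh of IT load."""
    return liters_consumed / it_kwh

# A facility drawing 1,200 kWh total for 1,000 kWh of IT load,
# evaporating 1,800 L of water in its cooling towers:
print(pue(1200, 1000))  # 1.2
print(wue(1800, 1000))  # 1.8 (L/kWh)
```

A lower PUE is what “We are efficient” boils down to in marketing copy; WUE is the number that matters for the water argument in this thread.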
GPT-3, which most publications use as a baseline for their calculations, dates from 2020. By today’s standards it is an inefficient model. OpenAI brags about its models getting bigger and bigger, but many suspect that what it actually sells is a distilled version running on much smaller models. Many tricks for improving the inference performance of models have been invented since 2020.
I’d be very wary of publications that extrapolate the per-request energy use of AI models upward. That’s clearly not the trend.
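A back-of-the-envelope way to see why per-request energy trends down: the same accelerator serving a smaller distilled or quantized model handles many more requests per second, so joules per request drop proportionally. The power and throughput figures here are made up, only the shape of the argument matters:

```python
# Energy per request = sustained power draw / request throughput.
# All figures below are hypothetical, purely to illustrate the trend.

def joules_per_request(gpu_power_w: float, requests_per_s: float) -> float:
    """Joules consumed per served request at steady state."""
    return gpu_power_w / requests_per_s

# Same hypothetical 700 W accelerator; a distilled model with 10x the
# throughput uses 10x less energy per request:
baseline  = joules_per_request(700, 2)    # hypothetical 2020-era model
distilled = joules_per_request(700, 20)   # hypothetical smaller model
print(baseline, distilled)  # 350.0 35.0
```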
Something that people need to understand is that AI companies (let’s talk about them instead of “AIs”, which have no agency) are in a race to use less energy and less water per request for a very simple and selfish reason: it costs money.
I agree here; left to their own devices, for-profit companies generally care about money above all. Which is why they keep building datacenters (including those that are primarily for “AI”) where they do: almost entirely in places where they are competing with others for scarce water, because the alternatives are even more expensive.
This appears to be the only currently operating underwater datacenter project (though the most recent news about it I can find is from 2023), and it is in a certain country where it is somewhat easier for long-term environmental concerns to supersede capitalism’s profit motive. It would be great if they can make it an economically viable model that becomes commonplace, but until they do… datacenters today are still extremely thirsty.
I think you miss the reason why Microsoft started that, and why I think Chinese business owners are funding this: it is profitable.
When you own a datacenter, cooling is costly: it takes water and power. So you limit cooling as much as you can, but it is a balancing act: electronics like cold environments, and the more you cool, the fewer hardware failures you get. Submerging datacenters allowed for fewer component failures and lower cooling expenses.
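That balancing act can be sketched as a toy cost model: cooling harder costs more power and water, while running hotter raises failure-replacement costs, so total cost has a sweet spot in between. Every constant here is invented for illustration; only the shape of the trade-off matters.

```python
# Toy model of the cooling trade-off described above.
# All constants are hypothetical.

def total_cost(temp_c: int) -> float:
    cooling_cost = 50.0 * (40 - temp_c)      # colder set point = pricier cooling
    failure_cost = 3.0 * (temp_c - 15) ** 2  # hotter = more hardware failures
    return cooling_cost + failure_cost

# Scan candidate operating temperatures for the cheapest set point.
best = min(range(15, 41), key=total_cost)
print(best, total_cost(best))  # 23 1042.0
```

Submersion effectively lowers the cooling-cost curve (the ocean does the heat rejection), which shifts the optimum toward colder, more reliable operation.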
The reason these are not widespread yet is an unpopular one that usually gets me downvotes when I mention it: datacenter cooling is currently not a huge problem, either economically or ecologically, in most places, and there are plenty of places with a water surplus that don’t mind this use. It is an artificial problem that AI haters present as a blocker, but that is not reasoning, it is rationalization: they don’t like AI, therefore the resources it uses must be problematic.
oh, wow, yeah, evaporated water does fall again as rain 🤔
why didn’t anyone think of this before? so many wars could have been avoided!
finally, a hydrology understander has logged on to explain why water usage isn’t even a problem. thank you for clearing this up!
seriously though
https://www.datacenters.com/news/california-data-centers-under-pressure-to-reduce-water-usage
https://apnews.com/article/technology-business-environment-and-nature-oregon-united-states-2385c62f1a87030d344261ef9c76ccda
https://www.oregonlive.com/silicon-forest/2022/12/googles-water-use-is-soaring-in-the-dalles-records-show-with-two-more-data-centers-to-come.html
https://www.datacenterfrontier.com/special-reports/article/11428474/tackling-data-center-water-usage-challenges-amid-historic-droughts-wildfires
https://www.npr.org/2022/08/30/1119938708/data-centers-backbone-of-the-digital-economy-face-water-scarcity-and-climate-ris
https://www.theregister.com/2025/01/04/how_datacenters_use_water/
if you think it should be easy enough, what is your explanation for why datacenters are continuing to be built in locations where they’re competing with agriculture, other industries, and/or residential demand for scarce water resources (as you can read about in the links in my previous comment)?
Because none of the current datacenters were built for LLM inference. Latency matters in their other use cases.
when/where are the “built for LLM inference” (😂) datacenters being built where water consumption is not an issue?
Given that the aluminum import tariffs just freed up a ton of electrical power in British Columbia and Quebec, I’d start there.
Underwater experiments in 2020
Real-world deployment in 2021
That’s a neat idea, and maybe it will be widespread one day.
However, that particular experimental project from Microsoft was conceived in 2013, deployed in 2018, and concluded in 2020. Microsoft is not currently operating, or planning to operate, any more underwater datacenters: https://www.datacenterdynamics.com/en/news/microsoft-confirms-project-natick-underwater-data-center-is-no-more/
Among the things they’re doing instead (specifically for AI) is restarting a decommissioned nuclear plant at Three Mile Island in Pennsylvania, a state with (like most states) a long history of conflict over water scarcity and privatization.