- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
- [email protected]
Fuck AI
I’m with you, but this is satire to raise awareness about the resource usage of AI
There is a community here on Lemmy quite literally named “Fuck AI”. Don’t know if you have heard of it.
Either way, here is the link: [email protected]
Yes and it would be great if we could keep them separate.
I came here to say the same thing.
And my Axe!
The “AI” I run locally on my own GPU takes 0 mL of water per request. This paper assumes that “AI” and GPT (more specifically GPT-3) are interchangeable terms. It is biased, it erases the FOSS world, and it is weirdly aligned with the corporate hype around the subject. If even opponents of OpenAI accept their propaganda, we are in bad shape.
“AI” does not need water and does not need to emit CO2. It needs electricity, and we know how to produce electricity without emitting CO2 or consuming water. OpenAI and big datacenters currently do not, but they are the problem; the tech itself is not.
Please don’t let companies use the cheap trick of using “AI” as a puppet to present their own interests and their own way of doing things as the only way to do “AI”.
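The local-GPU point is easy to sanity-check yourself: per-request electricity is just power draw times generation time. A minimal sketch, where the 250 W draw and 10 s generation time are made-up placeholder numbers, not measurements of any real model:

```python
# Energy for one local inference = GPU power draw * time spent generating.
# Both inputs below are hypothetical figures for illustration only.

def energy_per_request_wh(power_w: float, seconds: float) -> float:
    """Watt-hours consumed: watts * (seconds / 3600)."""
    return power_w * seconds / 3600.0

# Assumed: a GPU drawing ~250 W for a ~10 s generation.
print(round(energy_per_request_wh(250.0, 10.0), 3))  # 0.694 Wh
```

Under those assumed numbers, a single local request is well under a watt-hour; whether that electricity emits CO2 depends entirely on how it was generated, not on the model.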
Fuck I hate it when anything uses “water consumed” as a bad thing.
Writing a 100-word email consumes about 500ml of water (17 oz).
500ml of water is used in a cooling system. This water is not lost in any way, it’s just warmed up and evaporated. Which then falls as rain again.
If you draw more water from a particular area than that area can support it’s a bad thing. Otherwise, this is just a stupid argument made to appeal to emotions rather than logic.
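For anyone who wants to check figures like the 500 mL claim themselves: datacenter water use per request is roughly energy per request times the facility’s water usage effectiveness (WUE, liters evaporated per kWh). A minimal sketch, where both input numbers are illustrative assumptions rather than measurements:

```python
# Back-of-envelope: milliliters of cooling water evaporated per request.
# Both inputs are illustrative assumptions, not measured values.

def water_per_request_ml(energy_wh: float, wue_l_per_kwh: float) -> float:
    """Water evaporated (mL) = energy (kWh) * WUE (L/kWh) * 1000 mL/L."""
    return (energy_wh / 1000.0) * wue_l_per_kwh * 1000.0

# Assumed: ~3 Wh per long GPT-style response, WUE of ~1.8 L/kWh.
print(water_per_request_ml(3.0, 1.8))  # 5.4 mL per request
```

Under those assumptions a request evaporates a few milliliters of on-site cooling water, which is why the headline number matters far less than where that water is drawn from.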
Where I live, our main water source is a lake, and we only even bother closing the weir (a type of dam) that holds the water in come the middle of March, and we open it completely again in October. We just let billions of liters flow directly out to the ocean the rest of the year.
Especially for AI, you don’t need to locate datacenters near population centers at all; latency doesn’t matter for that use case. So it should be easy enough to build them in locations that have easy access to cheap energy and large amounts of water; pretty much anywhere with a hydroelectric dam is a good choice.
Fuck I hate it when anything uses “water consumed” as a bad thing.
[…]
This water is not lost in any way, it’s just warmed up and evaporated. Which then falls as rain again.
oh, wow, yeah, evaporated water does fall again as rain 🤔
why didn’t anyone think of this before? so many wars could have been avoided!
finally, a hydrology understander has logged on to explain why water usage isn’t even a problem. thank you for clearing this up!
Did you even read the second part of my comment before getting mad?
Did you even read the second part of my comment before getting mad?
yeah, i did. you wrote:
So it should be easy enough to build them in locations that have easy access to cheap energy and large amounts of water
if you think it should be easy enough, what is your explanation for why datacenters are continuing to be built in locations where they’re competing with agriculture, other industries, and/or residential demand for scarce water resources (as you can read about in the links in my previous comment)?
Because none of the current data centers were built for LLM inference. Latency matters in their other use cases.
when/where are the “built for LLM inference” (😂) datacenters being built where water consumption is not an issue?
Underwater experiments in 2020
Real-world deployment in 2021
Something that people need to understand is that AI companies (let’s talk about them instead of “AIs” that have no agency) are on a race to use less energy and less water per request for a very simple and selfish reason: it costs money.
Datacenters put forward their energy efficiency not because of environmental concern or even for greenwashing, but because it means “We have lower costs. We are efficient”.
GPT-3, which most publications use as a baseline for their calculations, is from 2020. It is a bad, inefficient model. OpenAI brags about its models becoming bigger and bigger, but many suspect that what it actually sells is a distilled version running on much smaller models. Many tricks for improving the inference performance of models have been invented since 2020.
I’d be very wary of publications that extrapolate energy use of AI models (per request) going up. That’s clearly not the trend.
Something that people need to understand is that AI companies (let’s talk about them instead of “AIs” that have no agency) are on a race to use less energy and less water per request for a very simple and selfish reason: it costs money.
I agree here; left to their own devices, money is generally what matters to for-profit companies. Which is why they mostly continue to build datacenters (including those that are primarily for “AI”) where they do: almost entirely in places where they are competing with others for scarce water, because the alternatives are even more expensive.
That’s a neat idea, and maybe will be widespread one day.
However that particular experimental project from Microsoft was conceived of in 2013, deployed in 2018, and concluded in 2020. Microsoft is not currently operating or planning to operate any more underwater datacenters: https://www.datacenterdynamics.com/en/news/microsoft-confirms-project-natick-underwater-data-center-is-no-more/
Among things they’re doing instead (specifically for AI) is restarting a decommissioned nuclear plant at Three Mile Island in Pennsylvania, a state with (like most states) a long history of conflict related to water scarcity and privatization.
This appears to be the only currently-operating (though the most recent news about it I can find is from 2023) underwater datacenter project, and it is in a certain country where it is somewhat easier for long-term environmental concerns to supersede capitalism’s profit motive. It would be great if they can make it an economically viable model which becomes commonplace, but until they do… datacenters today are still extremely thirsty.
Given the aluminum import tariffs just freed up a ton of electrical power in British Columbia and Quebec, I’d start there.