• BlameThePeacock@lemmy.ca · 11 days ago

    Fuck I hate it when anything uses “water consumed” as a bad thing.

    Writing a 100-word email consumes about 500ml of water (17 oz).

    500ml of water is used in a cooling system. This water is not lost in any way, it’s just warmed up and evaporated. Which then falls as rain again.
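
    For what it’s worth, figures like that come from simple arithmetic: energy per request times water evaporated per kWh, on-site (cooling) plus off-site (power generation). A minimal sketch, where every constant is an assumed illustrative value rather than a measurement:

```python
# Back-of-envelope estimate of water evaporated per LLM request.
# Every constant here is an illustrative assumption, not a measurement.

ENERGY_PER_EMAIL_KWH = 0.14   # assumed energy to generate a ~100-word email
ONSITE_WUE_L_PER_KWH = 1.8    # assumed litres evaporated by datacenter cooling per kWh
OFFSITE_WUE_L_PER_KWH = 1.7   # assumed litres evaporated generating that electricity

def water_per_request_ml(energy_kwh: float) -> float:
    """Millilitres of water evaporated per request under the assumptions above."""
    return energy_kwh * (ONSITE_WUE_L_PER_KWH + OFFSITE_WUE_L_PER_KWH) * 1000

print(round(water_per_request_ml(ENERGY_PER_EMAIL_KWH)), "ml")  # ~490 ml
```

    Change any of the assumed constants and the headline number moves proportionally, which is why published estimates vary so widely.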

    If you draw more water from a particular area than that area can support it’s a bad thing. Otherwise, this is just a stupid argument made to appeal to emotions rather than logic.

    Where I live, our main water source is a lake, and we only even bother closing the weir (a type of dam) that holds the water in around the middle of March, opening it completely again in October. We just let billions of liters flow directly out to the ocean the rest of the year.

    Especially for AI, you don’t need to locate datacenters near population centers at all; latency doesn’t affect that use case. So it should be easy enough to build them in locations that have easy access to cheap energy and large amounts of water. Pretty much anywhere that has a hydroelectric dam is a good choice.

    • Arthur Besse@lemmy.ml (OP) · edited · 10 days ago

      Fuck I hate it when anything uses “water consumed” as a bad thing.

      […]

      This water is not lost in any way, it’s just warmed up and evaporated. Which then falls as rain again.

      oh, wow, yeah, evaporated water does fall again as rain 🤔

      why didn’t anyone think of this before? so many wars could have been avoided!

      finally, a hydrology understander has logged on to explain why water usage isn’t even a problem. thank you for clearing this up!

      seriously though

      https://www.datacenters.com/news/california-data-centers-under-pressure-to-reduce-water-usage

      https://apnews.com/article/technology-business-environment-and-nature-oregon-united-states-2385c62f1a87030d344261ef9c76ccda

      https://www.oregonlive.com/silicon-forest/2022/12/googles-water-use-is-soaring-in-the-dalles-records-show-with-two-more-data-centers-to-come.html

      https://www.datacenterfrontier.com/special-reports/article/11428474/tackling-data-center-water-usage-challenges-amid-historic-droughts-wildfires

      https://www.npr.org/2022/08/30/1119938708/data-centers-backbone-of-the-digital-economy-face-water-scarcity-and-climate-ris

      https://www.theregister.com/2025/01/04/how_datacenters_use_water/

        • Arthur Besse@lemmy.ml (OP) · edited · 9 days ago

          Did you even read the second part of my comment before getting mad?

          yeah, i did. you wrote:

          So it should be easy enough to build them in locations that have easy access to cheap energy and large amounts of water

          if you think it should be easy enough, what is your explanation for why datacenters are continuing to be built in locations where they’re competing with agriculture, other industries, and/or residential demand for scarce water resources (as you can read about in the links in my previous comment)?

          • BlameThePeacock@lemmy.ca · 9 days ago

            Because none of the current data centers were built for LLM inference. Latency matters in other use cases.

            • Arthur Besse@lemmy.ml (OP) · 9 days ago

              when/where are the “built for LLM inference” (😂) datacenters being built where water consumption is not an issue?

              • keepthepace@slrpnk.net · 9 days ago

                Underwater experiments in 2020

                Real-world deployment in 2021

                Something that people need to understand is that AI companies (let’s talk about them instead of “AIs”, which have no agency) are in a race to use less energy and less water per request for a very simple and selfish reason: it costs money.

                Datacenters put forward their energy efficiency not because of environmental concern or even for greenwashing, but because it means “We have lower costs. We are efficient”.

                GPT-3, which most publications use as a baseline for their calculations, is from 2020. It is a bad and inefficient model. OpenAI brags about their models becoming bigger and bigger, but many suspect that what they sell is actually a distilled version running on much smaller models. Many tricks for improving the inference performance of models have been invented since 2020.

                I’d be very wary of publications that extrapolate the per-request energy use of AI models upward. That’s clearly not the trend.

                • Arthur Besse@lemmy.ml (OP) · 9 days ago

                  Something that people need to understand is that AI companies (let’s talk about them instead of “AIs” that have no agency) are on a race to use less energy and less water per request for a very simple and selfish reason: it costs money.

                  I agree here; left to their own devices, money is generally what matters to for-profit companies. That is why they mostly continue to build datacenters (including those that are primarily for “AI”) where they do, which is almost entirely in places where they are competing with others for scarce water: because the alternatives are even more expensive.

                  Underwater experiments in 2020

                  That’s a neat idea, and maybe will be widespread one day.

                  However that particular experimental project from Microsoft was conceived of in 2013, deployed in 2018, and concluded in 2020. Microsoft is not currently operating or planning to operate any more underwater datacenters: https://www.datacenterdynamics.com/en/news/microsoft-confirms-project-natick-underwater-data-center-is-no-more/

                  Among things they’re doing instead (specifically for AI) is restarting a decommissioned nuclear plant at Three Mile Island in Pennsylvania, a state with (like most states) a long history of conflict related to water scarcity and privatization.

                  Real world deployement in 2021

                  This appears to be the only currently-operating (though the most recent news about it I can find is from 2023) underwater datacenter project, and it is in a certain country where it is somewhat easier for long-term environmental concerns to supersede capitalism’s profit motive. It would be great if they can make it an economically viable model which becomes commonplace, but until they do… datacenters today are still extremely thirsty.

                  • keepthepace@slrpnk.net · 8 days ago

                    This appears to be the only currently-operating (though the most recent news about it I can find is from 2023) underwater datacenter project, and it is in a certain country where it is somewhat easier for long-term environmental concerns to supersede capitalism’s profit motive. It would be great if they can make it an economically viable model which becomes commonplace, but until they do… datacenters today are still extremely thirsty.

                    I think you miss the reason why Microsoft started that, and why I think Chinese business owners are funding this: it is profitable.

                    When you own a datacenter, cooling is costly: it takes water and power, so you limit it as much as you can. But it is a balancing act, because electronics like cold environments: the more you cool, the fewer hardware failures you have. Submerging datacenters allowed for fewer component failures and lower cooling expense.
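
                    That balancing act can be sketched as a toy cost curve. Every number below is a made-up assumption, there purely to show the shape of the trade-off, not a real datacenter figure:

```python
# Toy model of the cooling trade-off: spend more on cooling and hardware
# fails less; spend less and failure costs climb. All curves and constants
# are made-up illustrative assumptions, not real datacenter figures.

def total_cost(temp_c: float) -> float:
    cooling = 50.0 * max(0.0, 35.0 - temp_c)          # assumed cost of chilling below 35 °C
    failures = 200.0 * 2 ** ((temp_c - 20.0) / 10.0)  # assumed failure cost doubling per 10 °C
    return cooling + failures

# Sweep operating temperatures to find the cheapest balance point.
best = min(range(10, 41), key=total_cost)
print(best, round(total_cost(best)))
```

                    Anything that shifts one of the two curves down, like free seawater cooling, moves the optimum and lowers the total cost, which is the whole pitch for submersion.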

                    The reason these are not widespread yet is an unpopular one that usually gets me downvotes when I mention it: datacenter cooling is currently not a huge problem, either economically or ecologically, in most places, and there are tons of places with a water surplus that don’t mind this use. This is an artificial problem that AI haters try to present as a blocker, but it is not reasoning, it is rationalization: they don’t like AI, therefore the resources it uses must be problematic.

              • BlameThePeacock@lemmy.ca · 9 days ago

                Given that the aluminum import tariffs just freed up a ton of electrical power in British Columbia and Quebec, I’d start there.