• BlameThePeacock@lemmy.ca
    9 days ago

    Because none of the current data centers were built for LLM inference. Latency matters in other use cases.

    • Arthur Besse@lemmy.mlOP
      9 days ago

      when/where are the “built for LLM inference” (😂) datacenters being built where water consumption is not an issue?

      • BlameThePeacock@lemmy.ca
        9 days ago

        Given the aluminum import tariffs just freed up a ton of electrical power in British Columbia and Quebec, I’d start there.

      • keepthepace@slrpnk.net
        9 days ago

        Underwater experiments in 2020

        Real-world deployment in 2021

        Something that people need to understand is that AI companies (let’s talk about them instead of “AIs”, which have no agency) are in a race to use less energy and less water per request for a very simple and selfish reason: it costs money.

        Datacenters advertise their energy efficiency not out of environmental concern, or even for greenwashing, but because it signals “we have lower costs; we are efficient”.

        GPT-3, which most publications use as a baseline for their calculations, is from 2020. It is a bad, inefficient model by today’s standards. OpenAI brags about its models becoming bigger and bigger, but many suspect that what it actually serves is a distilled version running on much smaller models. Many tricks have been invented since 2020 to improve the inference performance of models.

        I’d be very wary of publications that extrapolate the per-request energy use of AI models upward. That’s clearly not the trend.

        • Arthur Besse@lemmy.mlOP
          9 days ago

          Something that people need to understand is that AI companies (let’s talk about them instead of “AIs”, which have no agency) are in a race to use less energy and less water per request for a very simple and selfish reason: it costs money.

          I agree here; left to their own devices, money is generally what matters to for-profit companies. Which is why they mostly continue to build datacenters (including those that are primarily for “AI”) where they do, which is almost entirely in places where they are competing with others for scarce water: because the alternatives are even more expensive.

          Underwater experiments in 2020

          That’s a neat idea, and maybe it will be widespread one day.

          However that particular experimental project from Microsoft was conceived of in 2013, deployed in 2018, and concluded in 2020. Microsoft is not currently operating or planning to operate any more underwater datacenters: https://www.datacenterdynamics.com/en/news/microsoft-confirms-project-natick-underwater-data-center-is-no-more/

          Among the things they’re doing instead (specifically for AI) is restarting a decommissioned nuclear plant at Three Mile Island in Pennsylvania, a state with (like most states) a long history of conflict related to water scarcity and privatization.

          Real-world deployment in 2021

          This appears to be the only currently-operating (though the most recent news about it I can find is from 2023) underwater datacenter project, and it is in a certain country where it is somewhat easier for long-term environmental concerns to supersede capitalism’s profit motive. It would be great if they can make it an economically viable model which becomes commonplace, but until they do… datacenters today are still extremely thirsty.

          • keepthepace@slrpnk.net
            8 days ago

            This appears to be the only currently-operating (though the most recent news about it I can find is from 2023) underwater datacenter project, and it is in a certain country where it is somewhat easier for long-term environmental concerns to supersede capitalism’s profit motive. It would be great if they can make it an economically viable model which becomes commonplace, but until they do… datacenters today are still extremely thirsty.

            I think you’re missing the reason why Microsoft started that, and why I think Chinese business owners are funding this: it is profitable.

            When you own a datacenter, cooling is costly: it consumes water and power. So you limit cooling as much as you can, but it is a balancing act, because electronics like cold environments: the more you cool, the fewer hardware failures you have. Submerging datacenters allowed for fewer component failures and lower cooling expense.
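            The balancing act can be put in toy-model terms (every number below is a made-up illustration, not data from any real facility): total operating cost is cooling cost plus expected failure cost, and the minimum sits somewhere between “as cold as possible” and “no cooling at all”.

            ```python
            # Toy sketch of the cooling trade-off. ALL figures are hypothetical.
            # Assumptions: cooling cost falls linearly as the room runs warmer,
            # while failure cost grows exponentially with temperature (a rough
            # Arrhenius-style rule of thumb: failure rates climb fast with heat).

            def annual_cost(temp_c: float,
                            cooling_cost_per_deg: float = 10_000,  # $/yr per degree C cooled below ambient (made up)
                            failure_cost_base: float = 20_000,     # $/yr failure cost at 18 C (made up)
                            ambient_c: float = 35.0) -> float:
                cooling = cooling_cost_per_deg * (ambient_c - temp_c)        # colder room -> bigger cooling bill
                failures = failure_cost_base * 2 ** ((temp_c - 18.0) / 5.0)  # warmer room -> more failures
                return cooling + failures

            # Sweep operating temperatures: neither extreme minimizes total cost.
            best = min(range(18, 36), key=annual_cost)
            ```

            The point of the sketch is only the shape of the curve: because the two cost terms pull in opposite directions, an operator picks an interior operating point, and anything that shifts one curve down (like free seawater cooling) moves that point and the total bill.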

            The reason these are not widespread yet is an unpopular one that usually gets me downvotes when I mention it: datacenter cooling is currently not a huge problem, either economically or ecologically, in most places, and there are plenty of places with a water surplus that don’t mind this use. This is an artificial problem that AI haters present as a blocker, but that is not reasoning, it is rationalization: they don’t like AI, therefore the resources it uses must be problematic.