cross-posted from: https://lemmy.dbzer0.com/post/32023985

Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes 1 × 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
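The "7 full charges" figure is roughly consistent with the 140 Wh number if you assume an iPhone Pro Max battery of about 17 Wh (~4,400 mAh at 3.87 V nominal) and ~85% wall-to-battery charging efficiency. Those battery and efficiency values are assumptions for this sanity check, not from the post:

```python
# Sanity check of the headline numbers. Battery capacity and charging
# efficiency below are assumed, not taken from the original post.

BATTERY_WH = 4.4 * 3.87        # ~17.0 Wh stored per full charge (4,400 mAh @ 3.87 V)
CHARGE_EFFICIENCY = 0.85       # assumed fraction of wall energy reaching the battery

wall_energy_per_charge = BATTERY_WH / CHARGE_EFFICIENCY   # Wh drawn from the wall
charges_from_140wh = 140 / wall_energy_per_charge

print(round(wall_energy_per_charge, 1))   # ≈ 20.0 Wh per charge
print(round(charges_from_140wh))          # ≈ 7 full charges
```

With these assumptions each full charge draws about 20 Wh from the wall, so 140 Wh works out to roughly 7 charges.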

  • curbstickle@lemmy.dbzer0.com · edit-2 · 4 days ago

    Making it a stupid design, yes.

    Edit: putting your massive heat-generating data center (beyond what most DCs will do) in support of AI in Texas is stupid.

    Closed-loop systems absolutely have other options in design, which I've mentioned in another comment chain.

    As terrible as they are as companies, Meta, Apple, and others have made much more appropriate decisions, like locating their big-load DCs in cold climates and partnering with the locale to make use of the heat being generated, removing the need for power to perform those heating tasks separately. That makes them not only efficient designs; compared to putting a DC in Texas like a dipshit (or LA, or NV, or anywhere else with a hot climate), it makes the whole thing better for the environment.

    Yes, it's a stupid design.