• Nangijala · +32 / −3 · edited · 4 days ago

    Maybe not an individual prompt, but with how many prompts are being made for stupid stuff every day, it will stack up to quite a lot of CO2 in the long run.

    Not denying that training AI demands way more energy, but that doesn’t really matter, as manufacturing, training, and millions of people using AI all amount to the same bleak picture long term.

    Considering how the discussion about environmental protection has only just started to be taken seriously and here they come and dump this newest bomb on humanity, it is absolutely devastating that AI has been allowed to run rampant everywhere.

    According to this article, 500,000 AI prompts amount to the same CO2 output as a round-trip flight from London to New York.

    I don’t know how many times a day 500,000 AI prompts are reached, but I’m sure it is more than twice or even thrice. As time moves on it will be much more than that. It will probably outdo the number of actual flights between London and New York in a day. Every day. It will probably also catch up to whatever energy it cost to train the AI in the first place and surpass it.

    Because you know. People need their memes and fake movies and AI therapist chats and meal suggestions and history lessons and a couple of iterations on that book report they can’t be fucked to write. One person can easily end up prompting hundreds of times in a day without even thinking about it. And if everybody starts using AI to think for them at work and at home, it’ll end up being many, many, many flights back and forth between London and New York every day.
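    For scale, the flight comparison above can be sanity-checked with rough arithmetic. The per-passenger flight figure below is a commonly cited ballpark, not a number from the article, so treat this as a sketch:

```python
# Rough scale check for the 500,000-prompts-per-flight claim above.
# Assumption: a round-trip London-New York flight emits roughly
# 1,000 kg of CO2 per passenger (common ballpark, not from the article).
FLIGHT_CO2_KG = 1000
PROMPTS_PER_FLIGHT = 500_000

co2_per_prompt_g = FLIGHT_CO2_KG * 1000 / PROMPTS_PER_FLIGHT  # grams/prompt
print(f"~{co2_per_prompt_g:.0f} g CO2 per prompt under these assumptions")

# A hypothetical 1 billion prompts/day would then be equivalent to:
daily_prompts = 1_000_000_000
flights_equiv = daily_prompts / PROMPTS_PER_FLIGHT
print(f"~{flights_equiv:.0f} round-trip flights per day")
```

    Whether that total is large or small compared to actual daily air traffic is exactly the disagreement in the replies below.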

    • A Wild Mimic appears!@lemmy.dbzer0.com · +19 / −2 · edited · 4 days ago

      I had a discussion regarding generated CO2 a while ago here, and with the numbers my discussion partner gave me, the calculation said that the yearly usage of ChatGPT is approx. 0.0017% of our CO2 reduction during the COVID lockdowns - chatbots are not what is killing the climate. What IS killing the climate has not changed since the green movement started: cars, planes, construction (mainly concrete production), and meat.

      The exact energy costs are not published, but 3 Wh/request for ChatGPT-4 is the upper limit from what we know (and that’s in line with the approx. power consumption of my graphics card when running an LLM). Since Google uses it for every search, they will probably have optimized for their use case, and some sources cite 0.3 Wh/request for chatbots - it depends on what model you use. The training is a one-time cost, and for ChatGPT-4 it raises the maximum cost per request to 4 Wh. That’s nothing. The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households. This is for one of the most downloaded apps on iPhone and Android - setting this in comparison with the massive usage makes clear that saving here is not effective for anyone interested in reducing climate impact, or you have to start scolding everyone who runs their microwave 10 seconds too long.
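      The per-request figures above convert into everyday units with a quick sketch (the 1 kW microwave rating is an assumption for illustration, not from the comment):

```python
# Per-request energy figures as quoted in the comment above:
# 3 Wh/request inference upper bound for ChatGPT-4, plus roughly
# 1 Wh once the one-time training cost is amortized (4 Wh max total).
inference_wh = 3.0
training_wh_amortized = 1.0
total_wh = inference_wh + training_wh_amortized

# Convert to seconds of an assumed 1,000 W microwave:
# energy [J] = Wh * 3600; time [s] = energy / power
microwave_seconds = total_wh * 3600 / 1000
print(f"{total_wh} Wh ~= {microwave_seconds:.1f} s of a 1 kW microwave")
```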

      Even compared to other online activities that use data centers, ChatGPT’s power usage is small change. If you use ChatGPT instead of watching Netflix, you actually save energy!

      Water is about the same, although the positioning of data centers in the US sucks. The used water doesn’t disappear though - it’s mostly returned to the rivers or evaporated. The total water usage in the US is 58,000,000,000,000 gallons (about 220 trillion liters) per year. A ChatGPT request uses between 10 and 25 ml of water for cooling. A hamburger uses about 600 gallons of water. 2 trillion liters are lost due to aging infrastructure. If you want to reduce water usage, go vegan or fix water pipes.
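      Plugging in the water numbers quoted above (taking the 25 ml upper end per request; the hamburger figure is the one given in the comment):

```python
# Water comparison using the figures quoted in the comment above.
LITERS_PER_GALLON = 3.785                    # US liquid gallon
hamburger_liters = 600 * LITERS_PER_GALLON   # ~2,271 liters per hamburger
request_liters = 0.025                       # 25 ml/request, upper estimate

requests_per_burger = hamburger_liters / request_liters
print(f"One hamburger's water ~= {requests_per_burger:,.0f} ChatGPT requests")
```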

      Read up here!

    • ayyy@sh.itjust.works · +7 / −9 · 4 days ago

      I have a hard time believing that article’s guesstimate since Google (who actually runs these data centers and doesn’t have to guess) just published a report stating that the median prompt uses about a quarter of a watt-hour, or the equivalent of running a microwave oven for one second. You’re absolutely right that flights use an unconscionable amount of energy. Perhaps your advocacy time would be much better spent fighting against that.
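      The microwave equivalence in that report is easy to check (the 900 W rating is an assumption; typical household microwaves run roughly 700-1,100 W):

```python
# A quarter watt-hour expressed as microwave-seconds.
wh = 0.25
joules = wh * 3600           # 1 Wh = 3,600 J
microwave_watts = 900        # assumed typical microwave power rating
seconds = joules / microwave_watts
print(f"{wh} Wh = {joules:.0f} J ~= {seconds:.1f} s at {microwave_watts} W")
```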

      • Nangijala · +16 / −2 · 4 days ago

        And Google would never lie about how much energy a prompt costs, right?

        Especially not since they have a vested interest in having people use their AI products, right?

        • ayyy@sh.itjust.works · +11 / −4 · 4 days ago

          That’s not really Google’s style when it comes to data center whitepapers. They did, however, omit all information about training energy use.

        • Bronzebeard@lemmy.zip · +2 / −1 · 3 days ago

          … They’re kind of governed by law about what things they’re allowed to tell their stockholders.

          And before you try to say otherwise, yes, laws that protect the ownership class are still being enforced.