• @[email protected]
      link
      fedilink
      109 months ago

      I mean literally this. AI is a strong tool when used properly, but we should obviously not lose sight of long-term goals in favor of short-term opportunities.

      • BirdEnjoyer · 6 points · 9 months ago

        Yep.

        This was never the fault of AI.
        It’s always the corporations: by nature, they’re designed to go with whatever gets the most profit, at any cost.
        Just look at what they do in the meat industry, both to living creatures and to the living employees who have to deal with it.

        I don’t think people appreciate just how dangerously willing these current AI companies are to set fire to the upcoming few decades of society- all for a little bit of glory right now.

        The energy problem is definitely one thing I didn’t realize was quite so dire, but we’re on the cusp of total loss of control over your own likeness, and these companies really couldn’t care less.

    • @[email protected]
      link
      fedilink
      -19 months ago

      AI has replaced so much of what I used to Google search for, but without the blog fluff (though it adds its own flavour of fluff).

      All of it is low stakes, so I’m not worried about the accuracy as long as it keeps me moving on a task.

      • insomniac_lemon · 1 point · 9 months ago (edited)

        I tried this recently in hopes of finding an animation pilot, but it was too willing to give me completely wrong answers (the most popular things, or even kids’ shows) or it’d just make a name up. Admittedly, I was using 13b.Q4 models and they are not the newest ones.

        I ended up finding what I was looking for by pure coincidence: I did a generic search for Adult Swim pilots (I had already combed the Wikipedia page and their site), and one of the higher results was a reddit thread where someone was looking for the same show I was and had made the same mistake I did (mistaking a Cartoon Hangover short for an Adult Swim pilot).

        After that, I tried finding an even older and dumber animation that I had gotten on the PSN during the PS3 era; those terms tripped the AI up because it would only give me videogames.

        (Certain things are probably better to ask about. I’m not sure the computation is worth it, but then again search is pretty garbage these days unless it’s an obvious query that won’t be mixed up with other, newer or more popular terms.)

  • @[email protected]
    link
    fedilink
    English
    19
    edit-2
    9 months ago

    de Vries, who now works for the Netherlands’ central bank, estimated that if Google were to integrate generative A.I. into every search, its electricity use would rise to something like twenty-nine billion kilowatt-hours per year. This is more than is consumed by many countries, including Kenya, Guatemala, and Croatia.

    Why on earth would they do that? Just cache the common questions.

    It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

    Ok, so the actual real-world estimate is somewhere on the order of half a million kilowatt-hours per day, for the entire globe. Even if we assume that’s just the US, there are 125M households, so that’s 4 watt-hours per household per day. An LED lightbulb consumes 8 watts. Turn one of those off for a half-hour and you’ve balanced out one household’s worth of ChatGPT energy use.

    This feels very much like the “turn off your lights to do your part for climate change” distraction from industry and air travel. They’ve mixed and matched units in their comparisons to make it seem like this is a massive amount of electricity, but it’s basically irrelevant. Even the big AI-in-every-search number only works out to 0.6 kWh/day (again, if all search were done only by Americans), which isn’t great, but is still on the order of not spending hours watching a big-screen TV or playing on a gaming computer, and compares to the 29 kWh already spent.

    Math, because this result is so irrelevant it feels like I’ve done something wrong:

    • 500,000 kWh/day / 125,000,000 US households = 0.004 kWh/household/day
    • 29,000,000,000 kWh/yr / 365 days/yr / 125,000,000 households = 0.6 kWh/household/day, compared to the 29 kWh base
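That arithmetic can be sanity-checked in a few lines (a sketch; the 500,000 kWh/day, 29 billion kWh/yr, and 125M-household figures are the ones quoted above):

```python
# Sanity check of the per-household estimates quoted above.
US_HOUSEHOLDS = 125_000_000

# ChatGPT: ~500,000 kWh/day globally, attributed entirely to US households.
per_household_kwh = 500_000 / US_HOUSEHOLDS            # 0.004 kWh = 4 Wh
print(f"ChatGPT: {per_household_kwh * 1000:.0f} Wh/household/day")

# AI-in-every-search: 29 billion kWh/year.
per_household_search_kwh = 29e9 / 365 / US_HOUSEHOLDS  # ~0.64 kWh
print(f"AI search: {per_household_search_kwh:.2f} kWh/household/day")

# An 8 W LED switched off for half an hour saves the same 4 Wh.
print(f"LED saving: {8 * 0.5:.0f} Wh")
```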
    • @[email protected]
      link
      fedilink
      English
      29 months ago

      Just cache the common questions.

      There are only two hard things in Computer Science: cache invalidation and naming things.

        • @[email protected]
          link
          fedilink
          English
          39 months ago

          Reminds me of the two hard things in distributed systems:

          • 2: Exactly-once delivery
          • 1: Guaranteed order
          • 2: Exactly-once delivery
      • @[email protected]
        link
        fedilink
        English
        39 months ago

        It’s a good thing that Google has a massive pre-existing business about caching and updating search responses then. The naming things side of their business could probably use some more work though.

    • @[email protected]
      link
      fedilink
      19 months ago

      Just cache the common questions.

      AI models work in a feedback loop. The fact that you’re asking the question becomes part of the response next time. They could cache it, but the model is worse off for it.

      Also, they are Google/Microsoft/OpenAI. They will do it because they can and nobody is stopping them.

      • @[email protected]
        link
        fedilink
        English
        19 months ago

        This is AI for search, not AI as a chatbot. And in the search context many requests are functionally similar and can have the same response. You can extract a theme to create contextual breadcrumbs that will be effectively the same as other people doing similar things. People looking for Thai food in Los Angeles will generally follow similar patterns and need similar responses, even if it comes in the form of several successive searches framed as sentences with different word ordering and choices.

        And none of this is updating the model (at least not in a real-time sense that would require re-running a cached search), it’s all short-term context fed in as additional inputs.
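A minimal sketch of that caching idea, with all names hypothetical (a real search engine would normalize intents with embeddings, not stop-word stripping): functionally similar searches collapse to one cache key, so only the first pays for a model call.

```python
import re

# Hypothetical normalizer: collapse wording variations of the same intent
# into a single cache key. Illustrative only.
STOPWORDS = {"the", "in", "a", "best", "good", "me", "find", "for"}

def cache_key(query: str) -> str:
    words = re.findall(r"[a-z]+", query.lower())
    return " ".join(sorted(w for w in words if w not in STOPWORDS))

cache: dict[str, str] = {}

def expensive_llm_call(query: str) -> str:
    return f"results for: {query}"

def answer(query: str) -> str:
    key = cache_key(query)
    if key not in cache:
        cache[key] = expensive_llm_call(query)  # only on a cache miss
    return cache[key]

# Differently-worded searches for the same thing share one cached response.
print(answer("best thai food in los angeles"))
print(answer("find me good Thai food, Los Angeles"))  # cache hit, no model call
```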

    • BlanketsWithSmallpox · 1 point · 9 months ago

      I’m glad someone was on the same track as me. I posted numbers as well if you want to take a peek at mine below.

  • @[email protected]
    link
    fedilink
    129 months ago

    The issue is how the electricity is generated, not that it is needed in the first place. It’s such a great distraction from the real issue that it has got to be big oil spinning the story this way.

    Let’s all hate on AI and crypto because they are ruining the entire environment and if we just stopped them, all would be fine with the planet again /s.

  • BlanketsWithSmallpox · 8 points · 9 months ago (edited)

    Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

    So ~500 MWh a day across the globe? This is all just data center use? Not even 1/10th the power draw of the newest and largest data center… out of ~11,000 total data centers.

    Existing markets are already struggling to meet demand, the report says. In Northern Virginia, the largest data center market in the world at 3,400MW, availability is running at just 0.2 percent.

    https://www.datacenterdynamics.com/en/news/us-data-center-power-consumption/

    So a drop in the bucket for a crazy useful tool using mostly existing infrastructure…

    The finding that global data centers likely consumed around 205 terawatt-hours (TWh) in 2018, or 1 percent of global electricity use, lies in stark contrast to earlier extrapolation-based estimates that showed rapidly-rising data center energy use over the past decade (Figure 2).

    https://energyinnovation.org/2020/03/17/how-much-energy-do-data-centers-really-use/

    The typical cost of building a solar power plant is between $0.89 and $1.01 per watt. A 1MW (megawatt) solar farm can cost you between $890,000 and $1.01 million… According to GTM Research, 1 MW solar farms require 6–8 acres to accommodate all the necessary infrastructure and space between panel rows.

    https://coldwellsolar.com/commercial-solar-blog/how-much-investment-do-you-need-for-a-solar-farm/

    $300 million and ~2 square miles (7 for reference) to power the entire world’s AI use feels like a non-issue to me. A billionaire could literally fund the entire world’s daily consumption and not dent their holdings…
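One way the arithmetic behind a figure like that might go (a sketch: the ~20% solar capacity factor is my assumption; the per-watt cost and acres-per-MW come from the quote above):

```python
# Powering ~500 MWh/day of AI inference with solar, per the quoted costs.
daily_demand_mwh = 500
avg_power_mw = daily_demand_mwh / 24           # ~20.8 MW continuous
capacity_factor = 0.20                         # assumed; sunlight is intermittent
nameplate_mw = avg_power_mw / capacity_factor  # ~104 MW of panels needed

cost_low = nameplate_mw * 0.89e6   # $0.89/W, quoted above
cost_high = nameplate_mw * 1.01e6  # $1.01/W
acres = nameplate_mw * 8           # 6-8 acres per MW, upper bound
sq_miles = acres / 640

print(f"{nameplate_mw:.0f} MW nameplate, "
      f"${cost_low / 1e6:.0f}M-${cost_high / 1e6:.0f}M, ~{sq_miles:.1f} sq mi")
```

That lands around $100M and ~1.3 square miles, comfortably under the $300 million quoted above, so the thread’s figure already has generous headroom built in.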

    Computers use power… More news at 11.

  • @[email protected]
    link
    fedilink
    89 months ago

    So I did a little math.

    This site says a single ChatGPT query consumes 0.00396 kWh.

    Assume an average LED light bulb is 10 watts, or 0.01 kWh per hour. So if I did the math right, no guarantees there, a single ChatGPT query is roughly equivalent to leaving a light bulb on for about 24 minutes.

    So if you assume the average light bulb in your house is on a little more than 3 hours a day, if you make 10 ChatGPT queries per day it’s the equivalent of adding a new light bulb to your house.

    Which is definitely not nothing. But isn’t the end of the world either.
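Redoing that arithmetic explicitly (figures as quoted: 0.00396 kWh per query, a 10 W bulb):

```python
kwh_per_query = 0.00396  # quoted per-query estimate
bulb_kw = 0.010          # 10 W LED bulb, in kW

minutes_per_query = kwh_per_query / bulb_kw * 60
print(f"one query ≈ bulb on for {minutes_per_query:.0f} minutes")  # ~24

# Ten queries a day vs. a bulb that runs a few hours a day:
daily_kwh = 10 * kwh_per_query          # 0.0396 kWh
bulb_hours_equiv = daily_kwh / bulb_kw  # ~4 bulb-hours
print(f"10 queries/day ≈ {bulb_hours_equiv:.1f} bulb-hours/day")
```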

    • @AliasAKA · 5 points · 9 months ago (edited)

      It’s also the required energy to train the model. Inference is usually more efficient (not always, but almost always significantly so), because there’s no error backpropagation or other training-specific calculations.

      Models probably take on the order of 1,000 megawatt-hours of energy to train (GPT-3 took 284 MWh by OpenAI’s calculation). That’s not including the web scraping, data cleaning, and other associated costs (such as cooling the server farms, which is non-trivial).

      A coal plant takes roughly 364-500 kg of coal to generate 1 MWh. So for GPT-3 you’d be looking at 103,376 kg (~230 thousand pounds, or about 115 US tons) of coal at minimum just to train it. That’s before anybody has used it, and we’re not counting the other associated energy costs at this point. For comparison, a typical home may use 6 MWh per year, so just training GPT-3 could’ve powered 47 homes for an entire year.
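Checking those numbers, taking the figures as megawatt-hours (which is what makes the coal arithmetic work out):

```python
training_mwh = 284     # OpenAI's estimate for training GPT-3
coal_kg_per_mwh = 364  # lower bound quoted above

coal_kg = training_mwh * coal_kg_per_mwh
print(f"{coal_kg:,} kg of coal")                  # 103,376 kg
print(f"{coal_kg * 2.20462 / 2000:.0f} US tons")  # ~114 (the comment rounds to 115)

home_mwh_per_year = 6
print(f"{training_mwh / home_mwh_per_year:.0f} home-years of electricity")  # ~47
```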

      Edit: also, it’s not nearly as bad as crypto mining. And as another person says it’s totally moot if we have clean sources of energy to fill the need and the grid can handle it. Unfortunately we have neither right now.

      • @[email protected]
        link
        fedilink
        29 months ago

        If you amortize training costs over all inference uses, I don’t think 1,000 MWh is too crazy. For a model like GPT-3 there are likely millions of inference calls to split that cost between.

        • @AliasAKA · 1 point · 9 months ago

          Sure, and I think these may even be useful and warrant the cost. But it’s just to say that this still isn’t simply running a couple of light bulbs or something. This is a major draw on the grid (though it likely still pales in comparison to crypto farms).

          Note that most people would be better off using a model trained for a specific task. For example, training an image-recognition model uses vastly less energy because the model is vastly smaller, yet it’s still excellent at image recognition.

          • @[email protected]
            link
            fedilink
            English
            19 months ago

            The article claims 200M ChatGPT requests per day. Assuming they train a new version yearly, that’s 73B requests per training run. Spreading 1,000 MWh across 73B requests yields an amortized cost of about 0.01 watt-hours per request. It’s nothing.

            47 more households’ worth of electricity just isn’t a major draw on anything. We add ~500,000 households a year from natural growth.
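Spelling out that amortization (reading the 1000 figure as megawatt-hours, and using the thread’s 200M requests/day and yearly retraining assumption):

```python
training_mwh = 1_000  # rough training-energy estimate from the thread
requests_per_day = 200_000_000
days_between_retrains = 365

total_requests = requests_per_day * days_between_retrains   # 73 billion
wh_per_request = training_mwh * 1_000_000 / total_requests  # MWh -> Wh
print(f"{wh_per_request:.3f} Wh of training energy per request")  # ~0.014
```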

    • @[email protected]
      link
      fedilink
      29 months ago

      I have a feeling it’s not going to be the ordinary individual user that’s going to drive the usage to problematic levels.

      If a company can make money off of it, consuming a ridiculous amount of energy to do it is just another cost on the P & L.

      (Assuming of course that the company using it either pays the electric bill, or pays a marked-up fee to some AI/cloud provider)

  • @Potatos_are_not_friends · 5 points · 9 months ago

    I mean, energy usage has continued to skyrocket since 1850.

    Why is that article so surprised by that?

  • @[email protected]
    link
    fedilink
    09 months ago

    This is concerning. Why don’t they just stop the never-ending updates and stick with the latest things we have for a moment? Isn’t all the tech stuff we have sufficient for the world to keep going?

  • @[email protected]
    link
    fedilink
    09 months ago

    The bigger companies focus on huge, ever-increasing model sizes instead. Lots of advances are being made with smaller, more affordable models that can run on consumer devices, but the big companies don’t focus on that because it can’t generate as much profit.