• @FooBarrington
    1 year ago

    While you are right that baseload is better served by nuclear, you are wrong that it matters at all for AI model training. This is one of the best use cases for solar energy: you train while you have lots of energy, and you pause training while you don’t. Baseload is important for things that absolutely need to get done (e.g. powering machines in hospitals), or for things that have a high startup cost (e.g. furnaces). AI model training is the opposite of both, so baseload isn’t relevant here at all.
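    To make that concrete, here’s a minimal sketch of what “pause while you don’t” can look like in practice - the price threshold and the get_spot_price() feed are purely hypothetical stand-ins, not anyone’s real setup:

    ```python
    # Hypothetical sketch: run training steps only while electricity is cheap,
    # otherwise idle and re-check. Checkpointing and dataloader cycling are omitted.
    import time

    PRICE_CUTOFF = 0.10  # assumed $/kWh threshold, purely illustrative

    def get_spot_price() -> float:
        """Placeholder for a real electricity-price or solar-output feed."""
        raise NotImplementedError

    def train(model, optimizer, data_iter, max_steps=1_000_000):
        for step in range(max_steps):
            while get_spot_price() > PRICE_CUTOFF:
                time.sleep(60)            # energy too expensive right now: wait
            batch = next(data_iter)
            loss = model(batch).mean()    # stand-in for the real loss computation
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
    ```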

    • @[email protected]
      1 year ago

      It’s not life-critical but it is financially-critical to the company. You aren’t going to build a project on the scale of a data center that is capable of running 24/7 and not run it as much as possible.

      That equipment is expensive, and has a relatively short useful lifespan even if not running.

      This is why tire factories and refineries run three shifts; this isn’t a phenomenon unique to data centers.

      • @FooBarrington
        1 year ago

        It’s not life-critical but it is financially-critical to the company. You aren’t going to build a project on the scale of a data center that is capable of running 24/7 and not run it as much as possible.

        Sorry, but that’s wrong. You’ll run it as much as is profitable. If electricity costs go up, there is a point at which you’ll stop running it, because it becomes too expensive. Even more so considering that AI models don’t have a set goal to reach - you train them for as long as you want and can, and training a little bit extra has diminishing returns after a while.

        That equipment is expensive, and has a relatively short useful lifespan even if not running.

        Not really - the limiting factor in AI training is mostly the supply of cards. The cards already in use will stay in use until they fail; they won’t be replaced the second newer cards are released.

        This is why tire factories and refineries run three shifts; this isn’t a phenomenon unique to data centers.

        This is comparing apples and oranges, since tire factories:

        • have long-term planning and production goals to reach

        • have employees whose shifts must be scheduled

        • have raw-material input costs that are higher than their electricity costs

        Of course you want the highest utilisation you can economically reach, but a better comparison would be crypto mining - it also relies on expensive equipment with a relatively short useful lifespan even when idle, and yet miners stop mining when electricity is too expensive.
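        To put the “run it as much as is profitable” point in rough numbers (every figure below is made up for illustration), the decision is just a break-even check:

        ```python
        # Back-of-the-envelope break-even check with made-up numbers: run the
        # hardware only while electricity costs less than the value it produces.
        power_draw_kw = 8 * 0.7   # assumed: 8 GPUs at ~700 W each, cooling ignored
        value_per_hour = 4.00     # assumed: $ of output value per hour of running
        break_even_price = value_per_hour / power_draw_kw   # $/kWh

        spot_price = 0.95         # assumed: current $/kWh during a price spike
        print(f"break-even at ${break_even_price:.2f}/kWh, run now: {spot_price < break_even_price}")
        # -> break-even at $0.71/kWh, run now: False
        ```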

    • @guacupado
      1 year ago

      “And you pause training while you don’t.” lmao I don’t know why people keep giving advice in spaces they’ve never worked in.

      • @FooBarrington
        1 year ago

        What are you trying to imply? That training Transformer models has to be a continuous process? You know it’s pretty easy to stop and resume training, right?
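        For the record, here’s a minimal PyTorch-style sketch of stop-and-resume - the paths and the model/optimizer objects are illustrative, not any specific training setup:

        ```python
        # Save everything needed to resume later, then load it and pick the loop
        # back up at the saved step. state_dict saving/loading is standard PyTorch.
        import torch

        def save_checkpoint(model, optimizer, step, path="ckpt.pt"):
            torch.save({"model": model.state_dict(),
                        "optimizer": optimizer.state_dict(),
                        "step": step}, path)

        def load_checkpoint(model, optimizer, path="ckpt.pt"):
            ckpt = torch.load(path)
            model.load_state_dict(ckpt["model"])
            optimizer.load_state_dict(ckpt["optimizer"])
            return ckpt["step"]  # continue training from this step
        ```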

        I don’t know why people keep commenting in spaces they’ve never worked in.

        • @guacupado
          1 year ago

          No data center is shutting off a leg, hall, row, or rack because “We have enough data, guys.” Maybe in your university server room where CS majors are interning. These things run 24/7/365, with UU tracking specifically to keep them up.

          • @FooBarrington
            1 year ago

            What are you talking about? Who said anything close to “we have enough data, guys”?

            Are you ok? You came in with a very snippy and completely wrong comment, and you’re continuing with something completely random.