Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec

In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa.

  • @scarabic
    English
    1 year ago

    Don’t these models require rather a lot of storage?

    • arthurpizza
      English
      1 year ago

      Storage is getting cheaper every day, and the models are getting smaller while holding the same amount of data.

      • @scarabic
        English
        1 year ago

        I’m just curious - do you know what kind of storage is required?

    • LEX
      English
      1 year ago

      13B quantized models, generally the most popular for home computers with dedicated GPUs, are between 6 and 10 gigs each. 7B models are between 3 and 6 gigs. So, no, not really?

      It’s relative, so I guess if you’re comparing that to an Atari 2600 cartridge then, yeah, it’s hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
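      Those ranges line up with simple napkin math: on-disk size is roughly parameter count times bits per weight, divided by 8 bits per byte. A quick sketch (the helper function and bit widths below are illustrative assumptions, not measurements of any particular model file):

      ```python
      # Napkin math for quantized model file sizes (illustrative assumptions only).

      def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
          """Approximate on-disk size in GB: parameters * bits per weight / 8 bits per byte.
          Real files run a bit larger due to metadata and layers kept at higher precision."""
          return params_billions * 1e9 * bits_per_weight / 8 / 1e9

      if __name__ == "__main__":
          print(f"13B @ 4-bit: ~{model_size_gb(13, 4):.1f} GB")  # ~6.5 GB
          print(f"13B @ 6-bit: ~{model_size_gb(13, 6):.1f} GB")  # ~9.8 GB
          print(f" 7B @ 4-bit: ~{model_size_gb(7, 4):.1f} GB")   # ~3.5 GB
          print(f" 7B @ 6-bit: ~{model_size_gb(7, 6):.1f} GB")   # ~5.2 GB
      ```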

      • @scarabic
        English
        1 year ago

        Yeah that’s not a lot. I mean… the average consumer probably has 10GB free on their boot volume.

        It is a lot to download, if we’re talking about ordinary consumers. Not unheard of though - some games on Steam are 50GB+.

        So okay, storage is not prohibitive.