Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa

    • arthurpizza@lemmy.world · 1 year ago

      Storage is getting cheaper every day, and the models are getting smaller while training on the same amount of data.

    • LEX@lemm.ee · 1 year ago (edited)

      13B quantized models, generally the most popular for home computers with dedicated GPUs, are between 6 and 10 GB each. 7B models are between 3 and 6. So, no, not really?

      It is relative, so I guess if you’re comparing that to an Atari 2600 cartridge then, yeah, it’s hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
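      A quick back-of-envelope check of those figures: on-disk size is roughly parameter count × bits per weight ÷ 8 bytes. (A sketch, not a spec; the function name is mine, and real quantized files add some overhead for embeddings and metadata, which is why actual downloads run a bit larger.)

      ```python
      # Rough on-disk size of a quantized model:
      # parameters * bits-per-weight / 8 bytes, expressed in GB.
      # Assumes uniform quantization; real files (e.g. GGUF) carry
      # extra metadata and some higher-precision layers.

      def approx_size_gb(params_billion: float, bits_per_weight: float) -> float:
          """Back-of-envelope file size in GB for a quantized model."""
          bytes_total = params_billion * 1e9 * bits_per_weight / 8
          return bytes_total / 1e9

      # A 13B model at 4-bit quantization:
      print(approx_size_gb(13, 4))  # 6.5 -> lower end of the 6-10 GB range above
      # A 7B model at 4-bit quantization:
      print(approx_size_gb(7, 4))   # 3.5 -> matches the 3-6 GB range above
      ```

      Higher-bit quantizations (5-, 6-, 8-bit) scale the size up proportionally, which is where the upper ends of those ranges come from.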

      • scarabic@lemmy.world · 1 year ago

        Yeah, that’s not a lot. I mean… the average consumer probably has 10 GB free on their boot volume.

        It is a lot to download, if we’re talking about ordinary consumers. Not unheard of, though: some games on Steam are 50 GB+.

        So okay, storage is not prohibitive.