It is ‘nearly unavoidable’ that AI will cause a financial crash within a decade, SEC head says

  • @MrFlamey · 23 points · 1 year ago

    Is this before or after it destroys the economy for everyone but the super rich, by replacing workers and making them compete for fewer and fewer scraps? Sorry, there will be lots more new jobs created by AI, probably, like AI wrangler, AI safety consultant, and the like. Probably.

    • @dustyData · 12 points · 1 year ago (edited)

      I always have a laughing fit whenever I see “Prompt engineer” used unironically in a job posting on LinkedIn.

      • @[email protected] · 7 points · 1 year ago

        I also think the term is granted way too much prestige, but a bit over a decade ago people also laughed at the notion of “Social Media Manager” being a real, high-paying job.

        Who knows where this stuff will go

        • @dustyData · 4 points · 1 year ago

          Even today, the term “social media manager” is still routinely conflated with graphic designer, sales representative, customer information manager, publisher, copywriter, photographer, and creative writer.

        • @MeekerThanBeaker · 6 points · 1 year ago

          Honestly, it’s a legit position. Maybe not something that will last a long time, but to do it well you need to know what prompts to give.

          The average person might put “cat on a speed boat,” whereas someone doing it as a job would need to know what “bokeh” is, what kind of camera lens to simulate with what f-stop, the rule of thirds, negative space, etc.

          The problem is whether anyone is willing to pay for that extra knowledge, or whether they’ll just have their nephew pump something out on an iPhone and call it good enough.
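          The gap between those two prompts can be sketched in a few lines. `build_prompt` and its photographic vocabulary here are purely illustrative assumptions, not any real image-generation API:

          ```python
          # Hypothetical sketch of the difference described above: a naive prompt
          # versus one composed by someone who knows photography terminology.
          # build_prompt is an invented helper for illustration only.

          def build_prompt(subject, *, lens=None, f_stop=None, style_terms=()):
              """Compose an image-generation prompt from a subject plus photographic modifiers."""
              parts = [subject]
              if lens:
                  parts.append(f"{lens} lens")
              if f_stop:
                  parts.append(f"f/{f_stop}")
              parts.extend(style_terms)
              return ", ".join(parts)

          # What "the average person" types:
          naive = build_prompt("cat on a speed boat")

          # What the hypothetical prompt engineer types:
          detailed = build_prompt(
              "cat on a speed boat",
              lens="85mm",
              f_stop=1.8,
              style_terms=[
                  "shallow depth of field with soft bokeh",
                  "rule-of-thirds composition",
                  "negative space on the left",
              ],
          )

          print(naive)
          print(detailed)
          ```

          Whether that vocabulary is worth a salary is exactly the question the comment raises.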

      • @[email protected] · 3 points · 1 year ago

        I have… feelings about LLMs being the big thing in AI/ML right now… because there’s really not much new about them. Maybe the transformer architecture, kind of, but ultimately LLMs are massive self-supervised neural nets trained on obscene amounts of data. Other models then take that pretrained “foundation model” and just tune their own parameters on top of it. Which is why “prompt engineer” is becoming a thing.

        Corpos are playing by the book here, trying to extinguish any competition before it begins by having people rely on their “foundation” models instead of innovating their own solutions.

        How many tutorials can you find for implementing LLM NLP tasks that don’t include “import this model from X company”? I’d wager it’s maybe a third of them.
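        The “tune their own parameters on top of a frozen foundation” pattern can be sketched in miniature. Everything here is a toy stand-in: `frozen_foundation` is a deterministic fake embedding, not a real pretrained model, and the “head” is a tiny hand-rolled logistic regression:

        ```python
        # Toy sketch of the fine-tuning pattern described above: the "foundation"
        # never changes; only the small head's parameters are trained.
        import math
        import random
        import zlib

        def frozen_foundation(text):
            """Stand-in for a pretrained model: a fixed, deterministic 4-dim 'embedding'."""
            rng = random.Random(zlib.crc32(text.encode()))  # stable seed per input
            return [rng.uniform(-1.0, 1.0) for _ in range(4)]

        def train_head(data, steps=3000, lr=0.5):
            """Fit a tiny logistic-regression head on frozen embeddings by gradient descent."""
            w, b = [0.0] * 4, 0.0
            for _ in range(steps):
                for x, y in data:
                    z = sum(wi * xi for wi, xi in zip(w, x)) + b
                    p = 1.0 / (1.0 + math.exp(-z))
                    grad = p - y  # derivative of log-loss with respect to z
                    w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
                    b -= lr * grad
            return w, b

        def predict(w, b, x):
            return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

        texts = [("the plot was great", 1), ("terrible acting", 0),
                 ("loved every minute", 1), ("a boring mess", 0)]
        data = [(frozen_foundation(t), y) for t, y in texts]
        w, b = train_head(data)  # the foundation's "weights" are never touched
        ```

        The real-world version of this swaps in a downloaded corporate model for `frozen_foundation`, which is exactly the dependency the comment is complaining about.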

        • @dustyData · 3 points · 1 year ago

          Part of what makes localized model engines and custom ML chips interesting is precisely their ability to enable small, custom, local models. Right now LLMs require so much computational power and so much data to train and operate that even the most expensive options lose money with every prompt query.

          So the reason every tutorial starts with “download this model” is that there’s a good chance you don’t have the hundreds of supercomputer-cluster chips, or the enormous volume of scraped and curated data, needed to train a natural language processing model yourself. There’s a reason there are only big players in this game.
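          Some back-of-envelope arithmetic makes the point concrete. The figures below (a 7B-parameter model, a 2-trillion-token training run, ~1e14 sustained FLOP/s per GPU, and the common ~6 × params × tokens rule of thumb for transformer training compute) are rough ballpark assumptions for illustration, not measurements:

          ```python
          # Rough arithmetic on why "download this model" is the default.
          # All numbers are illustrative ballpark assumptions.

          def weight_memory_gb(n_params, bytes_per_param=2):
              """Memory just to hold the weights (fp16 = 2 bytes per parameter)."""
              return n_params * bytes_per_param / 1e9

          def training_flops(n_params, n_tokens):
              """Common ~6 * params * tokens rule of thumb for transformer training."""
              return 6 * n_params * n_tokens

          # Merely loading a 7B-parameter model in fp16:
          print(weight_memory_gb(7e9))  # 14.0 GB, beyond most consumer GPUs

          # Training it on 2 trillion tokens, on one GPU sustaining ~1e14 FLOP/s:
          flops = training_flops(7e9, 2e12)
          gpu_years = flops / 1e14 / (365 * 24 * 3600)
          print(round(gpu_years))  # on the order of decades of single-GPU time
          ```

          However rough the assumed numbers are, the conclusion survives: training from scratch takes a cluster, and a cluster takes corporate money.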

          • @[email protected] · 2 points · 1 year ago (edited)

            Facts.

            Even if you could design your own model… how do you acquire a dataset even a fraction of the size of the ones those corporate pretrained models use?

            Then how do you train the model in a reasonable time, other than relying on cloud computing, which leads back to the same problem: only corps can properly play this game right now.

            For my master’s thesis I designed a relatively small deep CNN and collected/labeled the data for it, and training it on 60,000 images took over a dozen hours on a 1080 Ti (this was 5 years ago at this point, so that part may be misremembered).