• @UnderpantsWeevil
    3 · 24 days ago

    AI is making people able to do more work.

    It’s not. AI is creating more work, more noise in the system, and more costs for people who can’t afford to mitigate the spam it generates.

    The real value-add of AI is the same as shrinkflation: you dupe your clients into paying more for less by insisting work is getting done that isn’t.

    This holds up only so long as the clients never get wise to the con. But as the quality of output declines, it starts to show in the service being delivered.

    Spotify is already struggling to deliver services to its existing user base. It’s losing advertisers. And now it will have fewer people to keep the ship afloat.

    • @WarlordSdocy
      7 · 24 days ago

      I think that’s a really broad statement. Sure, there are some industries where AI hurts more than it helps in terms of quality. And of course there are examples of companies trying to push it too far and getting burned for it, like Spotify. But in many others, a competent person using AI (someone who could do all the work without AI assistance, just slower) will be much more efficient and get things done faster, since they can outsource certain parts of their job to AI. That increased efficiency is then used to cut jobs and create more profit for the companies.

      • @UnderpantsWeevil
        1 · 23 days ago

        Sure, there are some industries where AI hurts more than it helps in terms of quality.

        In all seriousness, where has the LLM tech improved business workflow? Because if you know something I don’t, I’d be curious to hear it.

        But in many others, a competent person using AI (someone who could do all the work without AI assistance, just slower) will be much more efficient and get things done faster, since they can outsource certain parts of their job to AI.

        What I have seen modern AI deliver in practice is functionally no different from what a good Google query would have yielded five years ago. A great deal of the value-add of AI in my own life has come as a stand-in for deteriorating internet search and archive services. And because it’s black-boxed behind a chat interface, I can’t even tell whether the information is reliable. Not in the way I could when I was routed to a StackExchange page with a multi-post conversation about coding techniques, special circumstances, or additional references.

        AI look-up is displacing the more traditional linked-post explanations I’ve relied on to verify what I was reading. There’s no citation, no alternative opinion, and often no clarity as to whether the response even cleanly matches the query. Google’s Bard, for instance, keeps wanting to shoehorn MySQL and Postgres answers into questions about MSSQL coding patterns. OpenAI’s models routinely hallucinate answers by blending in responses written for different versions of a given coding suite.
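
        As a rough sketch of the kind of dialect mismatch I mean (the table and column names here are made up), the pagination idiom a MySQL-flavored answer hands you is a hard syntax error on MSSQL:

            -- Typical MySQL/Postgres pagination an LLM will happily suggest.
            -- MSSQL rejects this outright; LIMIT is not T-SQL.
            SELECT id, title
            FROM orders
            ORDER BY created_at DESC
            LIMIT 10 OFFSET 20;

            -- The MSSQL (T-SQL) equivalent, supported since SQL Server 2012,
            -- and it requires the ORDER BY clause:
            SELECT id, title
            FROM orders
            ORDER BY created_at DESC
            OFFSET 20 ROWS FETCH NEXT 10 ROWS ONLY;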

        Rather than giving a smoother front end to a well-organized Wikipedia-like back end of historical information, what we’ve received is a machine that sounds definitive regardless of the quality of the answer.