• @[email protected]
    144 months ago

    I mean, they aren’t wrong. From an efficiency standpoint, current AI is like using a 350hp car engine to turn a child’s rock tumbler or spin-art thingy. Sure, it produces some interesting outputs, but at the cost of way too much energy for what is being done. That is the current scenario of using generalized compute, or even high-end GPUs, for AI.
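The energy argument above can be made concrete with a back-of-envelope calculation. The numbers below (power draw and throughput for a generic GPU and a hypothetical model-specific ASIC) are purely illustrative assumptions, not benchmarks; the point is only the shape of the comparison, joules per inference:

```python
# Toy back-of-envelope: energy per inference on a general-purpose GPU vs a
# model-specific ASIC. All figures are made-up illustrative assumptions.

def joules_per_inference(power_watts: float, inferences_per_second: float) -> float:
    """Energy cost of one inference at a steady-state power draw."""
    return power_watts / inferences_per_second

# Assumed figures, chosen only to show the arithmetic:
gpu = joules_per_inference(power_watts=350.0, inferences_per_second=100.0)
asic = joules_per_inference(power_watts=40.0, inferences_per_second=200.0)

print(f"GPU:  {gpu:.2f} J/inference")    # 3.50 J
print(f"ASIC: {asic:.2f} J/inference")   # 0.20 J
print(f"ASIC uses {gpu / asic:.1f}x less energy per inference")
```

Even with generous assumptions for the GPU, hardware specialized to one workload typically wins this ratio by a wide margin, which is the whole case for ASICs below.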

    Best I can tell, the “way forward” is further development of ASICs specific to the model being run. This should increase efficiency, decrease the ecological impact (less electricity usage), and free up silicon and components, possibly decreasing the price and increasing the availability of things like consumer graphics cards again (but I won’t hold my breath for that part).