• @brucethemoose

    Grok is a laughing stock in the LLM world.

    • It’s worse than Gemini and Claude, so… why use it over their APIs?

    • It’s not open source or even open weights. Elon is straight up lying when he claims it is.

    • It’s more expensive and generally more censored than the great open-weights models. It’s even straight-up worse than the remarkable DeepSeek V3.

    • It has no niche. It’s not long context, it’s slopped from training on other models, it’s not fast, it has seemingly no architectural advantages (not that they publish anything about it), and it’s not particularly good at prose, code completion, answers grounded in facts (hence the subject of the OP), or anything else.

    • Elon has one of the most inefficiently used H100 hoards on the planet, while other labs are making good models without being billionaires or scorching the Earth.

    • Elon is an asshole, and I don’t want anything to do with him.

    • Even for “business use,” simply associating with X makes you radioactive.

    The only ostensible reasons I can think of to use Grok are ignorance, or licking the boots of Musk/Trump. It’s not good, it’s not ‘based,’ it’s just a shitty, expensive product hyped to the moon as “better some day.” Surprise, surprise…

    Anyway, the point is that Elon doesn’t care if Grok makes him look bad. Reality is irrelevant; he can hype the snot out of it and hordes will come running to bow in deference.