• @brucethemoose
    5 hours ago

    Intel is not as bad in LLM land as you’d think. Llama.cpp support gets better every day.

    Nvidia may be first-class, but that doesn’t matter if the model you want doesn’t fit in VRAM. I’d trade my 3090 for a 48GB Arc card without blinking, even if the setup is an absolute pain.
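
    The VRAM math backs this up. A rough sketch (the bits-per-weight and overhead figures here are my own illustrative assumptions, not exact llama.cpp numbers): weights dominate memory, so a ~70B model at a typical ~4.5 bits/weight quant lands well over a 3090’s 24GB but comfortably under 48GB.

    ```python
    def approx_vram_gb(params_billions, bits_per_weight, overhead_gb=2.0):
        """Back-of-envelope VRAM estimate: quantized weights plus a small
        fixed allowance for KV cache and buffers (illustrative numbers)."""
        weights_gb = params_billions * bits_per_weight / 8
        return weights_gb + overhead_gb

    # ~70B model at ~4.5 bits/weight (roughly a Q4_K_M-class quant):
    need = approx_vram_gb(70, 4.5)  # ≈ 41 GB
    fits_24gb = need <= 24          # no  -> spills off a 3090
    fits_48gb = need <= 48          # yes -> fits a hypothetical 48GB card
    ```

    Under these assumptions the model simply can’t live on a 24GB card, however fast that card is, which is the whole point of the trade.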