As per the title really. The whole AI revolution has largely passed me by, but the idea of self-hosting something on a small box like this appeals. I don’t have an Nvidia GPU in my PC and never will, and as far as I can tell that pretty much rules out doing anything AI there.

I guess I can run it as a headless machine and connect over SSH or whatever web interface the AI models provide? I’m assuming running Proxmox on it will not work that well.

My main idea for AI is identifying photos with certain properties to aid in tagging tens of thousands of photos spanning over 20 years.
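That kind of photo tagging maps well onto zero-shot image classification. As a hedged sketch (the model name is real, but the candidate tags, threshold, and `photos/` folder are assumptions for illustration), here is how it might look with CLIP via the Hugging Face `transformers` library:

```python
# Sketch: zero-shot photo tagging with CLIP via Hugging Face `transformers`.
# The candidate tags and the 0.3 threshold are illustrative assumptions --
# adapt them to your own collection.

CANDIDATE_TAGS = ["a beach", "a birthday party", "a dog", "snow", "a sunset"]

def pick_tags(scores, labels, threshold=0.3):
    """Keep every label whose probability clears the threshold."""
    return [label for label, s in zip(labels, scores) if s >= threshold]

def tag_photo(path):
    # Heavy imports kept local so the pure helper above works without them.
    from PIL import Image
    import torch
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    image = Image.open(path)
    inputs = processor(text=CANDIDATE_TAGS, images=image,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(**inputs).logits_per_image  # image-text similarity
    probs = logits.softmax(dim=1)[0].tolist()
    return pick_tags(probs, CANDIDATE_TAGS)

# Usage (downloads the model weights on first run):
#   from pathlib import Path
#   for photo in Path("photos").glob("*.jpg"):
#       print(photo.name, tag_photo(str(photo)))
```

CLIP-style models run fine on CPU for batch jobs like this; it just takes longer per image.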

  • @just_another_person
    link
    English
    3 · 13 hours ago

    You can use the majority of “AI” things with non-Nvidia hardware, so don’t feel boxed in by that. Some projects just skew towards the Nvidia tool chain, but there are many ways to run it on AMD if you feel the need.

    The “Super” board is just an Orin Nano with the power profiles unlocked. There is literally no difference except the software, and if you bootstrap an Orin Nano with the latest Nvidia packages, they perform the same, at about 67 TOPS.

    For your project, you could run that on pretty much any kind of CPU or GPU. I wouldn’t pay $250 for the Super devkit when you can get a cheaper GPU to do this, and a CPU would work too, just a bit slower.
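On the “works on AMD too” point: ROCm builds of PyTorch expose AMD GPUs through the same `cuda` device name, so one availability check covers Nvidia, AMD, and a CPU fallback. A minimal sketch:

```python
# Minimal device-selection sketch for PyTorch. On ROCm builds of
# PyTorch, torch.cuda.is_available() is also true for AMD GPUs,
# since ROCm reuses the "cuda" device name.
import torch

def best_device() -> str:
    if torch.cuda.is_available():   # Nvidia, or AMD via a ROCm build
        return "cuda"
    mps = getattr(torch.backends, "mps", None)  # Apple Silicon, if present
    if mps is not None and mps.is_available():
        return "mps"
    return "cpu"

device = best_device()
# model.to(device) then works unchanged across all three cases.
```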

  • @[email protected]
    link
    fedilink
    English
    1 · 13 hours ago

    I work on the 64GB version. It arrives with Ubuntu Linux, and you’ll have to use it because of DeepStream and other packages… You can run a GUI on it if you like… By default it runs GNOME, if I remember correctly. It has the Triton inference server, but it will run ollama as well… It is very fast, powerful, and a bit expensive. You might try to find a cheaper version; they announced one recently.
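Running the box headless and talking to ollama over the network is straightforward, since ollama serves an HTTP API on port 11434 by default. A sketch, assuming a hypothetical hostname `jetson.local` and a model you have already pulled:

```python
# Sketch: query a remote Ollama server over its HTTP API.
# "jetson.local" and the model name are assumptions; 11434 is
# Ollama's default listening port.
import json
import urllib.request

OLLAMA_URL = "http://jetson.local:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    # stream=False asks for one complete JSON response instead of chunks.
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()

def ask(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (needs a reachable Ollama server with the model pulled):
#   print(ask("llama3.2", "Summarise what a Jetson Orin Nano is."))
```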