@TheBigBrother to Selfhosted (English) • 24 months ago
What's the bang-for-the-buck go-to setup for AI image generation and LLM models?
@[email protected] • 4 points • 24 months ago
KoboldCpp or LocalAI will probably be the easiest out-of-the-box option that offers both image generation and LLMs. I personally use vLLM and HuggingChat, mostly because of vLLM's efficiency and speed.
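For anyone curious, spinning up vLLM is roughly this (model name is just an example; pick whatever fits your VRAM):

```shell
# Install vLLM (needs a CUDA-capable GPU and a recent Python)
pip install vllm

# Start an OpenAI-compatible API server on http://localhost:8000
# (example model; swap in any HF model your card can hold)
python -m vllm.entrypoints.openai.api_server \
    --model mistralai/Mistral-7B-Instruct-v0.2
```

Any OpenAI-style client can then point at `http://localhost:8000/v1` to chat with it.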
DarkThoughts • 3 points • 4 months ago
It's probably dead, but Easy Diffusion is, in my opinion, the easiest option for image generation. KoboldCpp can be a bit quirky here and there, but it was the first thing that worked for me for local text generation with GPU support.
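To get that GPU support in KoboldCpp, the launch looks something like this (model path is hypothetical; flags may differ per build, check `--help`):

```shell
# Run KoboldCpp with a local GGUF model, offloading to the GPU via CUDA
# (example path; --usecublas enables CUDA, --contextsize sets the context window)
./koboldcpp --model ./models/your-model.gguf --usecublas --contextsize 4096
```

It then serves a web UI you can open in a browser (defaults to port 5001).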