Ola

Elsewhere, I’ve been building a behaviour-shaping harness for local LLMs. In the process, I thought, “well, why not share what the voices inside my head are saying?”

With that energy in mind, may I present Clanker Adjacent (name chosen because apparently I sound like a clanker - thanks lemmy! https://lemmy.world/post/43503268/22321124)

I’m going for a long-form, conversational tone on LLM nerd-core topics; or at least the ones that float my boat. If that’s something that interests you, cool. If not, cool.

PS: I promise the next post will be “Show me your 80085”.

PPS: Not a drive-by. I lurk here and get the shit kicked out of me over on /c/technology

  • SuspciousCarrot78OP
    Done

    I’ll give you the noob-safe walkthrough, assuming you’re starting from zero.

    1. Install Docker Desktop (or Docker Engine + Compose plugin).
    2. Clone the repo: git clone https://codeberg.org/BobbyLLM/llama-conductor.git
    3. Enter the folder and copy the env template: cp docker.env.example .env (on Windows, copy the file manually)
    4. Start core stack: docker compose up -d
    5. If you also want Open WebUI: docker compose --profile webui up -d

    Included files:

    • docker-compose.yml
    • docker.env.example
    • docker/router_config.docker.yaml
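
    For orientation, a compose file for a stack like this typically looks something like the sketch below. This is illustrative only: the service names, build context, and ports are my assumptions, not the repo’s actual file; the real thing is the docker-compose.yml you just cloned.

    ```yaml
    # Illustrative sketch -- service names, ports, and paths are assumptions,
    # not copied from the repo's docker-compose.yml.
    services:
      router:
        build: .
        env_file: .env              # the values you copied from docker.env.example
        volumes:
          - ./docker/router_config.docker.yaml:/app/router_config.yaml:ro
        ports:
          - "8000:8000"
      webui:
        image: ghcr.io/open-webui/open-webui:main
        profiles: ["webui"]         # only starts with: docker compose --profile webui up -d
        ports:
          - "3000:8080"
    ```

    The `profiles` key is what makes step 5 work: services tagged with a profile are skipped by a plain `docker compose up -d` and only start when you pass `--profile webui`.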

    Noob-safe note for older hardware:

    • Use smaller models first (I’ve given you the exact ones I use as examples).
    • You can point multiple roles to one model initially.
    • Add bigger or specialized models later, once the stack is stable.
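
    To make “point multiple roles to one model” concrete, here’s a purely illustrative sketch. The key names and model name are assumptions on my part; the actual schema is in docker/router_config.docker.yaml.

    ```yaml
    # Purely illustrative -- key names and model name are assumptions;
    # check docker/router_config.docker.yaml for the actual schema.
    models:
      small-general: qwen2.5-3b-instruct
    roles:
      chat:      small-general   # all three roles share one small model to start...
      summarize: small-general
      code:      small-general   # ...swap in a bigger coder model here later
    ```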

    Docs:

    • README has Docker Compose quickstart
    • FAQ has Docker + Docker Compose section with command examples