• amzd
    4 points · 11 months ago

    If you have a GPU in your PC, it’s almost always faster to just run your own LLM locally, and you won’t have this issue. Search for ollama.
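    A minimal sketch of what that looks like with the ollama CLI; the model name `llama3` is just an example, any model from the Ollama library works:

    ```shell
    # Download a model's weights to your machine (one-time):
    ollama pull llama3

    # Ask a one-off question entirely locally — nothing leaves your PC:
    ollama run llama3 "Explain what a passphrase is."

    # ollama also exposes a local HTTP API on port 11434 for scripts:
    curl http://localhost:11434/api/generate \
      -d '{"model": "llama3", "prompt": "Hello"}'
    ```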

  • @IHateReddit
    4 points · 11 months ago

    Why would you even share your passwords with ChatGPT?