I have experience running servers, but I would like to know if it’s possible to do this; I just need a private, GPT-3.5-like LLM running.

  • @[email protected]
    link
    fedilink
    English
    3
    5 months ago

    LLMs work by always predicting the most likely next token, and LLM detection works by checking how often the most likely token was actually chosen. You can tell the LLM to choose less likely tokens more often (turn up the temperature parameter), but then you will only get gibberish out. So no, there is not.
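    To make the temperature point concrete, here is a minimal sketch of temperature-scaled sampling with toy logits and a hand-rolled softmax (this is the general technique, not any particular model’s API; the logit values are made up):

    ```python
    import math
    import random

    def softmax_with_temperature(logits, temperature=1.0):
        """Convert raw logits into a probability distribution.

        temperature < 1 sharpens the distribution (the top token wins
        almost always); temperature > 1 flattens it, so unlikely tokens
        get picked more often -- which is why output turns to gibberish.
        """
        scaled = [l / temperature for l in logits]
        m = max(scaled)  # subtract the max for numerical stability
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        return [e / total for e in exps]

    def sample_token(logits, temperature=1.0):
        """Draw one token index from the temperature-scaled distribution."""
        probs = softmax_with_temperature(logits, temperature)
        r = random.random()
        cum = 0.0
        for i, p in enumerate(probs):
            cum += p
            if r < cum:
                return i
        return len(probs) - 1  # guard against floating-point rounding

    # Toy example: three candidate tokens, the first strongly preferred.
    logits = [5.0, 1.0, 0.5]
    low = softmax_with_temperature(logits, temperature=0.5)
    high = softmax_with_temperature(logits, temperature=5.0)
    ```

    At temperature 0.5 the top token dominates almost completely; at temperature 5.0 its probability drops toward the others, so a detector that counts how often the most likely token was chosen sees a much less “LLM-like” sequence, at the cost of coherence.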

    • @TheBigBrotherOP
      link
      English
      -13
      edit-2
      5 months ago

      I think hosting my own LLM wouldn’t work. As someone said, at some point the big models were already trained on all the internet’s content, so there is no point in feeding them more stuff like ebooks. I have to find a way to make the AI write dumber, or make it analyze the way an author writes and then emulate that author.