• @c10l
    41 months ago

    Even when you’re not sending data that you consider sensitive, that data is still helping train their models (and you’re paying for it!).

    Also, what’s not sensitive to one person might be extremely sensitive to another.

    Also, something you run locally can, by definition, be used with no Internet connection (like writing code on a plane or in a train tunnel).

    For me as a consultant, it means I can generally use an assistant without worrying about the LLM provider’s privacy policies or client policies on AI and third parties in general.

    For me as an individual, it means I can query the model freely without worrying that every word I send will be used to build a profile of who I am that can later be exploited by ad companies and other adversaries.