• RupeThereItIs
    link
    fedilink
    1 year ago

    It’s a MASSIVE security risk. What you tell ChatGPT is not private; if you knowingly or unknowingly tell ChatGPT secret information, you have no control over where that information may go. Especially for a company like Apple that lives & breathes on surprise product releases.

    • MentalEdge
      link
      fedilink
      1
      edit-2
      1 year ago

      This is true, but if you understand that queries don’t necessarily need to also become training data, what you tell it could absolutely be kept secret, provided the necessary agreements and changes were made. Nothing about an LLM means you can’t make it forget things you’ve told it. What you can’t make it forget, without re-training it from the ground up with that piece of information omitted, is what was in the training data.

      But queries do not suffer this limitation.

      • @Toasteh
        link
        1 year ago

        It’s not the LLM that would remember the data, it’s OpenAI who could find all the Apple developer chats and comb through their logs.