I think Microsoft’s planned Recall feature, where they screenshot everything you do so it can be analysed by AI, isn’t as bad as everyone makes it sound. It’s only bad because Windows is closed source and nobody can verify whether what they say is true.

But if Microsoft aren’t lying and none of the data ever leaves your PC (which is supported by the fact that you need a pretty beefy machine to use it), then it’s one of the more privacy-friendly things they’ve done recently. And I think they were fully aware that they could only sell “a thing that records everything you do” if they could convince people that it doesn’t share that data. Guess they failed.

If it were open source I might even think about using it myself, if the hardware, and consequently power, requirements weren’t so absurdly high.

  • @catalog3115
    15 points · edited · 5 months ago

    Oh! You have misunderstood the whole concept of privacy. I have a thought experiment for you:

    Let’s assume Microsoft is not lying 🤥. The data (the screenshot) remains on the device and is passed to some AI model, e.g. image-to-text. That model generates text on-device. But nowhere does Microsoft guarantee that the generated text, the output of those AI models, won’t be sent to Microsoft. They only say the screenshots and the AI models remain on-device; the output/metadata can still be sent to Microsoft.

    That is the issue. Previously there were many apps Microsoft couldn’t pry into because they were encrypted and so on. Now they don’t need to break any encryption; they just need the metadata, which is easy to transfer and use.
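A toy sketch of the distinction the commenter is drawing (all function and field names here are hypothetical, not Microsoft’s actual API): the heavy artifact stays local, but the lightweight derived output is trivially small and could be uploaded without violating a “screenshots never leave your device” promise.

```python
# Hypothetical illustration: an on-device model processes the screenshot
# locally, but the derived text/metadata is small enough to send anywhere.

def on_device_ocr(screenshot: bytes) -> str:
    """Stand-in for a local image-to-text model; runs entirely on-device."""
    return "Encrypted-messenger window, conversation about topic X"

def recall_pipeline(screenshot: bytes) -> dict:
    # The multi-megabyte screenshot never leaves this function...
    text = on_device_ocr(screenshot)
    # ...but the output is a few hundred bytes of plain metadata.
    return {"app": "Messenger", "summary": text, "timestamp": 1716912000}

meta = recall_pipeline(b"\x89PNG...fake image bytes...")
# The "screenshots stay on device" guarantee says nothing about `meta`.
```

The point of the sketch: encryption protected the screenshot-sized payload, but once a local model has distilled it into searchable text, only that distilled output needs to travel.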

    • @[email protected]
      6 points · 5 months ago

      And to take this one step further: say you can trust Microsoft 1000%, that they are on your team and do everything they can to protect you and your privacy. They are still an American company, and as such are subject to American laws.

      So when elections happen somewhere in the future and some asshole gets elected, they can order Microsoft to use their systems and their data to figure out a lot about people. They can, for example, figure out who has been using a certain kind of software, who has been consuming a certain kind of content, who has been playing certain kinds of games, and so on. That data can then be used to target specific people for punishment, for example a one-way, all-expenses-paid trip to a new series of gulags up in Alaska.

      You need to be able to protect your privacy 100%, not just for your protection today but also for the future.

      • Jeena
        0 points · 5 months ago

        Why wouldn’t this asshole be able to order Microsoft to start implementing this software after getting elected, if Microsoft hadn’t already built it before the election?