I have never liked Apple and lately even less. F… US monopolies

  • @[email protected]
    link
    fedilink
    16
    edit-2
    3 days ago

    It’s a cool idea: certain approaches to encryption still allow math to be performed on the data while it stays encrypted. Here’s one example: say you encrypt data X, producing ciphertext Z. Then you could multiply Z by four, and when you decrypt the result you get X multiplied by four. So you can run computations on the encrypted data without ever decrypting it.
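
    To make that concrete, here’s a toy sketch using textbook RSA, which happens to be multiplicatively homomorphic. The parameters are tiny and completely insecure, and Apple’s actual scheme (BFV) is different; this is purely to illustrate the “multiply the ciphertext, multiply the plaintext” idea.

```python
# Toy multiplicatively homomorphic encryption via textbook RSA.
# WARNING: tiny, insecure parameters -- illustration only.
n, e, d = 3233, 17, 2753  # n = 61 * 53, a classic textbook key pair

def enc(m):
    return pow(m, e, n)  # encrypt: m^e mod n

def dec(c):
    return pow(c, d, n)  # decrypt: c^d mod n

x = enc(10)       # encrypt the secret data
four = enc(4)     # encrypt the constant 4
# Multiply the two ciphertexts -- no decryption happens here.
result = dec(x * four % n)
print(result)     # prints 40: the product, revealed only at decryption
```

    Whoever holds only `x` and `four` can compute the encrypted product but learns nothing about 10, 4, or 40 without the private key `d`.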

    It would be quite complex, but I suppose you could run a machine learning model this way to tag images without ever seeing the image or knowing the resulting tag. Only the decryption key can be used to read the results (and that key is on the user’s iPhone, I suppose).

    However… I don’t know how much compute cost this adds to an already expensive computation. The encryption used might not be the strongest out there. But the idea is pretty cool!

    • @reddig33
      link
      8
      3 days ago

      I don’t really understand the purpose of the feature — GPS tags are already embedded in the photo by the phone, so it knows the location of each picture. The phone also analyzes faces of people you’ve identified so you can search for people you know. What else does this new feature add?

      • @[email protected]
        link
        fedilink
        12
        3 days ago

        It lets you type “eiffel tower” into search and get those pictures. Rather than all the other unspeakable things you did in Paris that night.

        • @reddig33
          link
          0
          3 days ago

          Current implementation seems like overkill. Why not just:

          • Search “Eiffel tower”
          • Send the search term to an Apple server that already exists (Apple Maps)
          • Server returns GPS coordinates for that term
          • Photos app displays photos in order of nearest to those coordinates
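
          A rough sketch of that proposed pipeline, assuming the geocoding step returns a single coordinate pair (all names and sample data here are hypothetical):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a))

# Coordinates a geocoding server might return for "Eiffel tower".
eiffel = (48.8584, 2.2945)

# The user's library, with GPS tags already embedded by the phone.
photos = [
    {"name": "louvre.jpg", "gps": (48.8606, 2.3376)},
    {"name": "selfie.jpg", "gps": (48.8582, 2.2950)},
    {"name": "home.jpg",   "gps": (40.7128, -74.0060)},
]

# Display photos nearest to the searched landmark first.
photos.sort(key=lambda p: haversine_km(*p["gps"], *eiffel))
print([p["name"] for p in photos])  # nearest first: selfie, louvre, home
```

          As the replies below point out, proximity alone can’t tell whether the landmark is actually visible in the photo.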
          • @[email protected]
            link
            fedilink
            4
            edit-2
            3 days ago

            Because you took two selfies in a restaurant near there, made a huge stunning collage of a duck below the tower, and took a couple of photos from further away to get the whole tower in view.

            I’m running this tech at home, because we had the same use case. Except for me it’s running on a NAS, not Apple’s servers. The location solution doesn’t work quite as well when you’re an avid photographer.

            • @[email protected]
              link
              fedilink
              1
              3 days ago

              If you read the article, you would know that the hard work is done locally on your iPhone, not on Apple’s servers.

              • @[email protected]
                link
                fedilink
                1
                3 days ago

                If you read the article thoroughly, you’d know that a smaller model runs locally to get a guess that a landmark might be in a spot in the image. The actual identification and tagging is done in the cloud, and the tag is then sent back.

          • AwkwardLookMonkeyPuppet
            link
            English
            2
            3 days ago

            Because then they don’t have an excuse to move all your data to Apple servers and scan it for later use.

    • @[email protected]
      link
      fedilink
      4
      3 days ago

      I don’t know how much compute cost this adds to an already expensive computation.

      At that scale, and because they do pay for servers, I bet they did the math and are constantly optimizing the process, as they own the entire stack. They might have somebody who worked on the M4 architecture give them hints on how to do so. Just speculating here, but arguably they are in a good position to make this quite efficient, even if, in the end, whether it’s actually worth the ecological cost is arguable.

      • queermunist she/her
        link
        fedilink
        6
        3 days ago

        I bet they did the math

        Did they? Because it seems like everyone else is in a hype bubble and doesn’t give a shit about how much this costs or how much money it makes.

        • @[email protected]
          link
          fedilink
          2
          edit-2
          3 days ago

          Looks like they did: “Brakerski-Fan-Vercauteren (BFV) HE scheme, which supports homomorphic operations that are well suited for computation (such as dot products or cosine similarity) on embedding vectors that are common to ML workflows”. Namely, they use a scheme that is both secure and efficient specifically for the kind of compute they do here. https://machinelearning.apple.com/research/homomorphic-encryption
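
          For reference, the operation BFV is being asked to evaluate is, in the clear, just a dot product or cosine similarity between embedding vectors, something like the sketch below. (Illustrative only; the real pipeline works on quantized integer vectors, since BFV operates over integers, and evaluates this on ciphertexts.)

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))       # the core homomorphic op
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical sample embeddings: one for the photo's landmark crop,
# one for a known landmark in the server's database.
query = [0.1, 0.9, 0.4]
landmark = [0.1, 0.8, 0.5]
print(cosine_similarity(query, landmark))  # near 1.0 means a likely match
```

          The appeal of BFV here is that the dot product is just additions and multiplications, which are exactly the operations the scheme supports on ciphertexts.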

        • @[email protected]
          link
          fedilink
          English
          1
          3 days ago

          At least it’s not going to be the overhyped LLM doing the analysis, it seems, considering the input is photo data.

      • @[email protected]
        link
        fedilink
        1
        edit-2
        3 days ago

        Their chips are pretty good at not drawing much power. But then you also get into the balance of power cost, computing power, and physical space.

        Google and Microsoft are already building their own power generation systems for even faster AI slop. That would make power a lot cheaper, and super efficient chips might not be the best answer.

        I don’t know which way Apple will go, except further up their own behind. But either way, these are some really cool approaches to implementing this technology, and I hope they keep it up!

        • @[email protected]
          link
          fedilink
          0
          3 days ago

          Yep, reading their blog post to understand it a bit better. I don’t like that it’s enabled by default, especially with iCloud off (which should be a signal that the user does NOT want data leaving their device), but considering what others are doing, this seems like the best trade-off.

      • @[email protected]
        link
        fedilink
        1
        3 days ago

        1. The end user can access the resulting tags; Apple cannot. However, iPhones do automatically report if they see something Apple does not like (in the USA).
        2. Whatever the lack of incentives may be, this is what is happening. I just explained it a bit more simply than the article did.