• Zoolander

    Not even touch points, per se. The AVP uses eye tracking. Just look at what you want and pinch your fingers together. I think you can pinch and hold too.

    • conciselyverbose

      I understand that. My point is that “look and pinch” maps directly, without alteration, to touching a point or to touching and dragging.

      It’s not that you can’t also do a virtual remote to handle TV apps, but the interaction they intend is a lot closer to a tablet. Defaulting to TV would teach developers bad habits; you’d end up with more interactions that are more limited than they need to be.
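
      A minimal SwiftUI sketch of what I mean, assuming the standard gesture APIs (the view and property names here are made up): the same tap and drag handlers an iPad app already uses respond to look-and-pinch on visionOS without any remapping.

        import SwiftUI

        // Hypothetical photo tile; nothing below is visionOS-specific.
        struct PhotoTile: View {
            @State private var offset: CGSize = .zero

            var body: some View {
                Image(systemName: "photo")
                    .resizable()
                    .frame(width: 200, height: 200)
                    // "Look and pinch" arrives as an ordinary tap.
                    .onTapGesture {
                        print("opened photo")
                    }
                    // "Pinch, hold, and move" arrives as an ordinary drag.
                    .gesture(
                        DragGesture()
                            .onChanged { value in offset = value.translation }
                            .onEnded { _ in offset = .zero }
                    )
                    .offset(offset)
            }
        }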

      • Zoolander

        Ahhh, gotcha. I was thinking the opposite. Since the remote basically has a swipe pad and that’s it, it felt like less was needed, but I think you’re right. You gotta be able to pinch and zoom photos and stuff, and that only works if they’re duplicating a trackpad or mouse.
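
        For the pinch-and-zoom case, a rough sketch assuming the standard SwiftUI magnify gesture carries over (the view and state names are made up):

          import SwiftUI

          // Hypothetical zoomable photo view. MagnifyGesture is the standard
          // SwiftUI pinch-to-zoom gesture, which visionOS can drive with the
          // hands directly rather than a trackpad or remote.
          struct ZoomablePhoto: View {
              @State private var scale: CGFloat = 1.0

              var body: some View {
                  Image(systemName: "photo")
                      .resizable()
                      .scaledToFit()
                      .scaleEffect(scale)
                      .gesture(
                          MagnifyGesture()
                              .onChanged { value in
                                  // Relative zoom factor since the gesture began.
                                  scale = value.magnification
                              }
                              .onEnded { _ in
                                  scale = 1.0
                              }
                      )
              }
          }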

        PS. Your username is great.

        • conciselyverbose

          I’m not sure if they’ve mapped every multitouch gesture to the Vision Pro out of the box, but it’s something they can and should do in time. There’s a lot of potential there.

          You could easily have some of the same gestures do double duty as remote inputs on TV interfaces, since it’s all context-dependent on where your eyes are, and there aren’t that many to map. But swiping up/down/left/right to navigate a TV interface would get old, I think.
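
          Something like this is the double-duty mapping I have in mind. Everything here is hypothetical (there’s no real “remote command” API being referenced); it’s just a sketch of routing a completed drag into a D-pad-style input when a TV-like shelf has focus.

            import SwiftUI

            // Hypothetical D-pad-style command derived from a drag.
            enum RemoteCommand {
                case up, down, left, right
            }

            func remoteCommand(for translation: CGSize) -> RemoteCommand {
                // Treat the dominant axis of a completed swipe as a directional press.
                if abs(translation.width) > abs(translation.height) {
                    return translation.width > 0 ? .right : .left
                } else {
                    return translation.height > 0 ? .down : .up
                }
            }

            // Hypothetical TV-style shelf that consumes swipes as remote input.
            struct TVShelf: View {
                @State private var selectedIndex = 0
                private let itemCount = 10

                var body: some View {
                    Text("Selected item: \(selectedIndex)")
                        .gesture(
                            DragGesture(minimumDistance: 20)
                                .onEnded { value in
                                    switch remoteCommand(for: value.translation) {
                                    case .left:  selectedIndex = max(selectedIndex - 1, 0)
                                    case .right: selectedIndex = min(selectedIndex + 1, itemCount - 1)
                                    default: break
                                    }
                                }
                        )
                }
            }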

          I do actually think they should (I understand the developer relations/contract reasons they don’t) straight up give you emulators that apps can’t distinguish from the TV/iPad/iPhone, on both macOS and Vision Pro, and take action against developers who try to artificially block you from using their apps on other devices. There are things that won’t work, but most will, and I think letting developers artificially segment it out when it’s all basically the same chip now is kind of bullshit.
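
          For what it’s worth, the kind of runtime check that makes that segmentation possible looks roughly like this; the descriptions are mine, the point is just that apps can tell which device family they’re actually on.

            import UIKit

            // Rough illustration of how an app can tell which device family it is
            // running on, which is what makes artificial blocking possible.
            func deviceFamily() -> String {
                if ProcessInfo.processInfo.isiOSAppOnMac {
                    return "iOS app running on a Mac"
                }
                switch UIDevice.current.userInterfaceIdiom {
                case .phone:  return "iPhone"
                case .pad:    return "iPad"
                case .tv:     return "Apple TV"
                case .vision: return "Vision Pro"
                default:      return "unknown"
                }
            }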