See that little circle? That’s a camera. | Photo by Vjeran Pavic / The Verge

All around Meta’s Menlo Park campus, cameras stared at me. I’m not talking about security cameras or my fellow reporters’ DSLRs. I’m not even talking about smartphones. I mean Ray-Ban and Meta’s smart glasses, which Meta hopes we’ll all — one day, in some form — wear. I visited Meta for this year’s Connect conference, where just about every hardware product involved cameras. They’re on the Ray-Ban Meta smart glasses that got a software update, the new Quest 3S virtual reality headset, and Meta’s prototype Orion AR glasses. Orion is what Meta calls a “time machine”: a functioning example of what full-fledged AR could look like, years before it will be consumer-ready. But on Meta’s campus, at least, the Ray-Bans were already everywhere. It…

Continue reading…
  • @[email protected]
    2 days ago

    I’ll believe these will take off when I see it. This has been tried before, and in my opinion the questions of “use case” and “how does a user control it” still don’t have good answers.

    I’d love a basic HUD in my glasses for stuff like messages from important contacts, occasional walking directions, maybe a to-do list or a calendar view… but every use case I can imagine would take enough work to control when it shows up that it would be just as easy to pull my phone out of my pocket.

    Plus, from my experience with VR headsets, there’s still a lot of room for better image quality even on dedicated screens. A projector or transparent display is going to face similar or worse limitations in resolution and clarity.

    • @jqubed
      13 hours ago

      I don’t really see what smart glasses bring to the table that isn’t already adequately covered by my smartphone and smartwatch. I can take a quick glance at my watch as a message comes in and easily ignore it when I can’t look. Simple tasks can be done from the watch; for more complicated things I’ll want to get out my phone (or maybe even my computer). The smartwatch is less obtrusive and likely has better battery life, a better screen, and a more intuitive interface. There are plenty of times when I would not want a notification popping up in my glasses. And I still don’t want to use voice controls except in very limited situations, like driving my car.

      I remember that after Google Glass failed with the general public, it continued to see use for years in job-related roles. I can see augmented reality having use cases there, but the more I think about it, the less I see any use case for augmented reality in everyday life that really improves on what I can already do. Sure, it sounds cool, but the reality seems worse than what we already have.

      As for the cameras, there have already been camera glasses for years now. The quality continues to improve, but the use cases for them still seem pretty limited. I like roller coasters, and while some parks allow people to record rides with action cameras, many do not. I’ve seen some people using camera glasses to get around those restrictions. Still, “secretly recording videos in places where that’s restricted” carries some clear legal and ethical risks.