• @SpaceNoodle · 57 points · 9 months ago

    There’s no excuse for a buffer overflow in a caching component to lead to a security hole like this. If the data were properly encrypted and could only be decrypted by the client on their own device, the result would have been users simply not seeing videos instead of being able to view others’.
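
    A minimal sketch of what “decrypted only by the client on their own device” could look like, with the key handling and names below invented for illustration (this is not Wyze’s actual design):

    ```python
    # End-to-end encryption sketch: the cloud only ever stores ciphertext,
    # so a mixed-up cache can only hand out blobs nobody else can open.
    from cryptography.fernet import Fernet, InvalidToken

    # Generated on the owner's device and synced only to their other devices;
    # never uploaded to the server (assumption for this sketch).
    owner_key = Fernet.generate_key()

    def encrypt_clip_on_camera(clip: bytes, key: bytes) -> bytes:
        """Camera encrypts footage before upload; the cloud stores this blob as-is."""
        return Fernet(key).encrypt(clip)

    def decrypt_clip_in_app(blob: bytes, key: bytes) -> bytes | None:
        """Owner's app decrypts locally; any other key simply fails."""
        try:
            return Fernet(key).decrypt(blob)
        except InvalidToken:
            return None  # a mis-routed blob is useless ciphertext, not someone's living room

    blob = encrypt_clip_on_camera(b"raw video bytes", owner_key)
    stranger_key = Fernet.generate_key()
    assert decrypt_clip_in_app(blob, owner_key) == b"raw video bytes"
    assert decrypt_clip_in_app(blob, stranger_key) is None  # "not seeing videos" instead of seeing someone else's
    ```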

    • @Ottomateeverything · 25 points · 9 months ago

      It doesn’t even need to go that far. If some cache mixes up user IDs and device IDs, then when those IDs are used to request a video feed, the serving authority should be like “woah, YOU don’t have access to that device/user”. Even when you fucking mix these things up, there should be multiple places in the chain where this gets checked and denied. This is a systemic/architectural issue, not “one little oopsie in a library”. That oopsie simply exposed the problem.

      I don’t care if I was affected or how widespread this is. This just shows Wyze can’t be trusted with anything remotely “private”. This is a massive security failing.

    • admiralteal · 11 points · 9 months ago

      If the data were properly encrypted and could only be decrypted by the client on their own device

      Yeah, but part of Wyze’s sales pitch is their AI image recognition features. Doing that would cost them all their training data and force the processing to happen locally, both of which would be dead ends for them.

      I realize these might not be features you want or care about… but those are the features they want to offer.

      • @[email protected] · 7 points · 9 months ago

        If I had said 30 years ago that people in the future would pay money for a device that lets companies basically spy on them, and that those companies could then also sell the data, I would have been branded a lunatic and sent for psychiatric help. Yet here we are.

        • admiralteal · 5 points · 9 months ago

          You’re talking about a device that is a full-color, high-definition surveillance camera, works at night, can be viewed from literally anywhere in the world, and can be configured to send you alerts when it sees people/animals/packages/whatever. And it only costs them an inflation-adjusted $13.

          I don’t think the “they wouldn’t believe this shit” argument really applies, given how rapidly tech has changed.

        • @[email protected] · 3 points · 9 months ago

          That, and Ring doorbells being used as a big data-harvesting point for the police.

          The surveillance culture we have is so normalized now that people don’t even care that their security camera is more of a corporate livestream than a secure loop. But hey, how else am I gonna post pictures of the guy stealing my 3rd Amazon package of the day?

        • @[email protected] · 1 point · 9 months ago

          30 years ago was 1994; the internet was quickly becoming a thing, and if you had told people that companies would eventually offer extra services if you chose to store your data with them, they would have believed you, because that’s how the banking system had worked for centuries prior.

      • @SpaceNoodle · 2 points · 9 months ago

        Even just encrypting it before transmission would have prevented this, and still allowed them to harvest data.
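
        A hedged sketch of that idea: the serving side keeps the plaintext (so the AI/data-harvesting features still work) but encrypts each clip with a key tied to the clip’s owner, looked up from the authoritative device registry rather than any cache. Everything below is my own guess at the shape of it, not Wyze’s code:

        ```python
        # Sketch: per-account encryption at the serving edge. The server still has
        # plaintext for its own processing, but the bytes it sends out can only be
        # opened by the owning account's app. All names here are hypothetical.
        from cryptography.fernet import Fernet, InvalidToken

        account_keys = {                      # provisioned to each account's app at pairing
            "user_A": Fernet.generate_key(),
            "user_B": Fernet.generate_key(),
        }
        device_owner = {"cam_1": "user_A"}    # authoritative registry, not a cache

        def serve_clip(device_id: str, clip_plaintext: bytes) -> bytes:
            # The key is chosen by the clip's true owner, not by whoever is asking.
            owner = device_owner[device_id]
            return Fernet(account_keys[owner]).encrypt(clip_plaintext)

        # If a caching bug routes cam_1's clip to user_B's app, user_B's key can't open it.
        blob = serve_clip("cam_1", b"frame data")
        try:
            Fernet(account_keys["user_B"]).decrypt(blob)
        except InvalidToken:
            print("wrong account: decryption fails instead of leaking footage")
        ```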

  • @GlitzyArmrest · 16 points · 9 months ago

    Nice of them to attempt to point the blame at AWS; I’m sure AWS appreciates that.

  • @[email protected] · 16 points · 9 months ago

    There were two events:

    1. AWS had an outage that froze Wyze’s backend.
    2. Wyze added some sort of caching layer that got mixed up when things came back up and let users see other people’s devices.

    Seems like Problem 1 was Wyze not handling disaster recovery properly. Problem 2 was them not testing their new update and not setting up proper access controls.

    Trying to blame their own screwup on AWS is rich.

    • @Ottomateeverything · 9 points · 9 months ago

      Problem 2 also shows they have no double checks on access to private video feeds. Mixing up what’s being requested at any step and not reverifying anywhere after that point just reveals fucking terrible security practices.

  • @[email protected] · 12 points · 9 months ago

    The most specific fix they mentioned,

    Wyze is scrambling to fix things by adding an additional layer of verification before users can view images or footage from the Events tab.

    isn’t going to fix anything, is it? People were already “verified” to view the events; I don’t know how verifying them a second time is going to fix the underlying problem if it just serves up the same incorrect data from the massive data bucket that’s visible to Wyze, Amazon, and their employees.

    • @Ottomateeverything · 5 points · 9 months ago

      No, this does actually sound like a solution. But it’s a solution that should be scattered all throughout the process, and checked at multiple steps along the way. The fact that this wasn’t here to begin with is a bigger problem than the “client library failure” as it shows Wyze’s security practices are fucking garbage. And adding “one layer” is not enough. There should be several.

      To give a bit better context, which I can only guess at by reading between the lines of their vague descriptions and drawing on my firsthand experience with these types of systems…

      Essentially, your devices all have unique IDs, and your account has an account/user ID. They’re basically “random numbers” that are unique within each set, but there appear to be devices that have the same ID as some user’s user ID.

      When the app wants to query for video feeds, it asks the server “hey, get me the feed for devices A, B, and C. And my user ID is X”. The server should receive this and check whether that user has access to those devices. But that server is just the first, external-facing step. It then likely delegates the request through multiple internal services, which go look up the feeds for those device IDs and return them.

      What happened is that somewhere in there, they had an “oopsie” and passed along “get me devices X, X, X for user ID X”. And for whatever reason, all the remaining steps were like “yup, device X for user X, here you go”. At MULTIPLE points along that chain, they should be rechecking this and saying “woah, user X only has access to devices A, B, and C, not X. Access denied.”
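
      A rough sketch of the kind of re-check being described, with every service and table name invented for illustration (this is a guess at the architecture, not Wyze’s actual code):

      ```python
      # Defense-in-depth sketch: every hop re-checks the user/device pairing against
      # the authoritative ownership store instead of trusting the request it was handed.
      class AccessDenied(Exception):
          pass

      # Authoritative ownership table (in reality a database, not a cache).
      user_devices = {"user_X": {"device_A", "device_B", "device_C"}}

      def assert_owns(user_id: str, device_id: str) -> None:
          if device_id not in user_devices.get(user_id, set()):
              raise AccessDenied(f"{user_id} has no access to {device_id}")

      def edge_api_get_feed(user_id: str, device_id: str) -> bytes:
          assert_owns(user_id, device_id)        # check 1: public-facing API
          return feed_service_fetch(user_id, device_id)

      def feed_service_fetch(user_id: str, device_id: str) -> bytes:
          assert_owns(user_id, device_id)        # check 2: internal feed service
          return storage_read_clip(user_id, device_id)

      def storage_read_clip(user_id: str, device_id: str) -> bytes:
          assert_owns(user_id, device_id)        # check 3: last hop before the bytes
          return b"clip bytes for " + device_id.encode()

      # If a scrambled cache turns the request into (user_X, device_X), every layer
      # rejects it instead of happily serving someone else's camera.
      try:
          edge_api_get_feed("user_X", "device_X")
      except AccessDenied as err:
          print("denied:", err)
      ```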

      The fact that they checked this ZERO times, and are only now adding “a layer” of verification, is a huge issue imo. This should never have been running in production without multiple steps in the chain validating this. Otherwise, they’re prone to both bugs and hacks.

      But no, they clearly weren’t verified to view the events. Their description implies that somewhere in the chain they scrambled what was being requested and there were no further verifications after that point. Which is a massive issue.

  • SVcrossDO · 11 points · 9 months ago

    And this is not the first time this has happened… to them.

  • AutoTL;DR (bot) · 1 point · 9 months ago

    This is the best summary I could come up with:


    Last week, co-founder David Crosby said that “so far” the company had identified 14 people who were able to briefly see into a stranger’s property because they were shown an image from someone else’s Wyze camera.

    The revelation came from an email sent to customers entitled “An Important Security Message from Wyze,” in which the company copped to the breach and apologized, while also attempting to lay some of the blame on its web hosting provider AWS.

    It also claims that all impacted users have been notified of the security breach, and that over 99 percent of all of its customers weren’t affected.

    One Reddit user, who described herself as a “23 year old girl” who was getting ready for work during the breach, said she was “disgusted and upset” and that she would be deleting her account.

    Wyze is scrambling to fix things by adding an additional layer of verification before users can view images or footage from the Events tab.

    “We have also modified our system to bypass caching for checks on user-device relationships until we identify new client libraries that are thoroughly stress tested for extreme events like we experienced on Friday,” the company’s email reads.
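
    In code terms, “bypass caching for checks on user-device relationships” presumably means something like the sketch below, where the authorization lookup goes straight to the authoritative store and the cache is only used for less sensitive data; this is purely a guess at the shape of it, not Wyze’s implementation:

    ```python
    # Hypothetical read-through cache with a bypass path for security-sensitive
    # lookups, so an authorization check never trusts possibly-stale or scrambled
    # cache entries. All names are invented for illustration.
    from typing import Callable

    cache: dict[str, set[str]] = {}

    def devices_for_user(user_id: str,
                         load_from_db: Callable[[str], set[str]],
                         bypass_cache: bool = False) -> set[str]:
        if bypass_cache:
            return load_from_db(user_id)       # authoritative read, cache not consulted
        if user_id not in cache:
            cache[user_id] = load_from_db(user_id)
        return cache[user_id]

    def db_lookup(user_id: str) -> set[str]:
        return {"user_X": {"device_A", "device_B"}}.get(user_id, set())

    # Authorization checks take the slower but authoritative path:
    assert devices_for_user("user_X", db_lookup, bypass_cache=True) == {"device_A", "device_B"}
    ```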


    The original article contains 413 words, the summary contains 198 words. Saved 52%. I’m a bot and I’m open source!

  • @[email protected] · 1 point · 9 months ago

    This is why I keep my cameras local so they can’t reach the internet. A 2FA screw-up, accidentally sending other people’s cameras to other users, and now this!? Not sure how this company is still alive. They have no idea what they’re doing security-wise.