A judge has dismissed a complaint from a parent and guardian of a girl, now 15, who was sexually assaulted when she was 12 years old after Snapchat recommended that she connect with convicted sex offenders.

According to the court filing, the abuse that the girl, C.O., experienced on Snapchat happened soon after she signed up for the app in 2019. Through its “Quick Add” feature, Snapchat “directed her” to connect with “a registered sex offender using the profile name JASONMORGAN5660.” After a little more than a week on the app, C.O. was bombarded with inappropriate images and subjected to sextortion and threats before the adult user pressured her to meet up, then raped her. Cops arrested the adult user the next day, resulting in his incarceration, but his Snapchat account remained active for three years despite reports of harassment, the complaint alleged.

  • Pika
    6
    10 months ago

    I’m failing to see how it’s Snapchat’s problem. It can’t know that the person was nefarious, and it’s not reasonable to expect that it should have been able to know. This is like saying that Disney should be held responsible because someone decided to go on a killing spree while wearing the recommended costume of the week. It’s two isolated events that happen to coincide with each other.

    This is a failure on the parents’ side all the way down, from the lack of supervision to allowing a social media account below the legal age to make one.

    • @[email protected]
      7
      10 months ago

      Snapchat is not the only problem here, but it is a problem.

      If they can’t guarantee their recommendations are clean, they shouldn’t be offering recommendations. Even to adults. Let people find other accounts to connect to for themselves, or by consulting some third party’s curated list.

      If not offering recommendations destroys Snapchat’s business model, so be it. The world will continue on without them.

      It really is that simple.

      Using buggy code (because all nontrivial code is buggy) to offer recommendations only happens because these companies are cheap and lazy. They need to be forced to take responsibility where it’s appropriate. This does not mean that they should be liable for the identity of posters on their network or the content of individual posts—I agree that expecting them to control that is unrealistic—but all curation algorithms are created by them and are completely under their control. They can provide simple sorts based on data visible to all users, or leave things to spread externally by word of mouth. Anything beyond that should require human verification, because black box algorithms demonstrably do not make good choices.

      It’s the same thing as the recent Air Canada chatbot case: the company is responsible for errors made by its software, to about the same extent as it is responsible for errors made by its employees. If a human working for Snapchat had directed “C.O.” to the paedophile’s account, would you consider Snapchat to be liable (for hiring the kind of person who would do that, if nothing else)?

      • Pika
        2
        edited
        10 months ago

        No, I would not, unless it was proven that said employee knew the person was a sex offender and knew that the account belonged to a minor (though at that point the employee should have disabled the account per Snapchat’s policy regardless). If that data was not available to them, then they wouldn’t have had the capability to know, so I would consider them not at fault.

        • @[email protected]
          0
          edited
          10 months ago

          Then, in my opinion, you would have failed to perform due diligence. Even if you’d thought C.O. was an adult, suggesting a woman strike up a private conversation with a man neither of you know is always something that deserves a second look (dating sites excepted), because the potential for harm is regrettably high.

    • Chaos
      2
      10 months ago

      It isn’t, and the courts agreed with that. Seems like an issue with legislative law. As far as I was aware, sex offenders were supposed to have Internet restrictions…

      Could there be a good discussion about trying to prevent harm to more children? Well, yeah. Some parents just suck, and it’s the kid that gets hurt.

      As long as it doesn’t involve stuff like KOSA, which puts more people in harm’s way.