Apple is failing to effectively monitor its platforms or scan for images and videos of the sexual abuse of children, child safety experts allege, which is raising concerns about how the company can handle growth in the volume of such material associated with artificial intelligence.

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accuses Apple of vastly undercounting how often child sexual abuse material (CSAM) appears in its products. In a year, child predators used Apple’s iCloud, iMessage and Facetime to store and exchange CSAM in a higher number of cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC.

Through data gathered via freedom of information requests and shared exclusively with the Guardian, the children’s charity found Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales. In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC), which is in stark contrast to its big tech peers, with Google reporting more than 1.47m and Meta reporting more than 30.6m, per NCMEC’s annual report.

  • @SlopppyEngineer

    Well, to get the AI to have enough training data, you have to pay people to look at traumatic things all the time first. Once that is done, you still need some people to check the AI's output to catch false positives, since being falsely charged can be devastating. There are already stories of people being arrested for CSAM because they sent a pic of their child in a swimming pool to the grandparents.

    • Flying Squid

      Good point. So no matter what, you end up essentially either torturing people (sure, they're being paid, but it's still a form of psychological torture to have to look at CSAM images all the time) or hiring people who want to look at CSAM. Both are terrible options, but I guess the former is really the only viable one.

      And, of course, the humans hired to do this are always from developing nations, so the companies can get away with paying them a few dollars a day.