“We developed a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions. The results of the study reveal that our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches, by utilizing WiFi signals as the only input.”

  • Shurimal
    3 • 10 months ago

    What we know about drones is that they have cameras that can discern individuals from 10 km altitude.

    What we suspect is that the US has Hubble-sized spy satellites that can do almost the same. There were a lot of classified military STS missions.

    What is theoretically possible is that US drones and spy sats could function together as a very large array (we already do this with astronomical telescopes) to dramatically increase spatial resolution.

    • @Maggoty
      2 • 10 months ago

      I’d believe it. When I was in the infantry 20 years ago, we could see you 3 km away with the optics mounted on our machine guns, and several kilometers away with cameras mounted on towers. I don’t know how far those went, but it was at least 5 km, because we were directing mortar fire with them and that’s about the range of the mortar system we were using.

    • @[email protected]
      1 • 10 months ago

      Oh, I wonder if that’s how the picture was taken that Trump tweeted of that rocket launch site. People didn’t think it was physically possible for a satellite to have that resolution.

      • Shurimal
        1 • 10 months ago

        It all comes down to the size of the mirror/lens: the bigger, the better, up to a point. The biggest problem is air currents and varying air densities refracting light and distorting the image. That’s what the laser beams are for in photos of astronomical observatories: they create a reference light spot (an artificial guide star) that can be used to calibrate the adaptive optics to current atmospheric conditions, reducing distortion.
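A rough back-of-envelope sketch of the diffraction limit behind "the bigger, the better". The 2.4 m Hubble-class mirror, 250 km orbit, and 550 nm wavelength are assumed illustrative values, not figures from the thread:

```python
import math

def ground_resolution_m(aperture_m, altitude_m, wavelength_m=550e-9):
    """Diffraction-limited ground resolution via the Rayleigh criterion.

    Angular resolution: theta = 1.22 * lambda / D (radians).
    The smallest resolvable feature at distance h is roughly h * theta.
    Atmospheric distortion (what adaptive optics corrects) makes the
    real-world figure worse than this ideal limit.
    """
    theta = 1.22 * wavelength_m / aperture_m
    return altitude_m * theta

# A Hubble-sized 2.4 m mirror looking down from a ~250 km orbit:
print(round(ground_resolution_m(2.4, 250e3), 3))  # ~0.07 m
```

Doubling the aperture halves the resolvable feature size, which is why interferometric arrays (effective aperture = baseline between instruments) are so attractive.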