Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google’s technology against itself. People can upload any photo to the site; the photo is then sent to a Google Cloud computer vision program that writes a startlingly thorough three-paragraph description of it. (Ente prompts the AI model to document small details in the uploaded images.)
If you don’t want to upload your own picture, Ente gives people the option to experiment on Theyseeyourphotos using one of several stock images. Google’s computer vision is able to pick up on subtle details in them, like a person’s tattoo that appears to be of the letter G, or a child’s temporary tattoo of a leaf. “The whole point is that it is just a single photo,” Mohandas says. He hopes the website prompts people to imagine how much Google—or any AI company—can learn about them from analyzing thousands of their photos in the cloud in the same way.
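For anyone curious what that pipeline roughly looks like in code: Ente hasn’t published its exact prompt or model, so the snippet below is only a sketch, assuming the Gemini API via Google’s google-generativeai Python SDK, a placeholder API key, and an invented prompt along the lines the article describes.

```python
# Rough sketch of the kind of pipeline described above: send one photo to a
# Google-hosted multimodal model and ask for an exhaustive description.
# The model name, prompt wording, and SDK choice are assumptions for
# illustration only; Ente's actual implementation is not public.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential
model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model

prompt = (
    "Describe this photo in three detailed paragraphs. "
    "Document small details: visible text, tattoos, brands, location clues, "
    "and anything that could identify the people or place shown."
)

photo = Image.open("uploaded_photo.jpg")  # hypothetical file name
response = model.generate_content([prompt, photo])
print(response.text)
```

The point of the stunt is that a single call like this already surfaces an unsettling amount of detail; run it over an entire photo library and the profile gets much richer.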
First we learned that Idiocracy was a documentary. Now we are learning that Minority Report was a documentary too.
Damn, what a time to be alive.
Wasn’t Idiocracy the one with an underlying Nazi concept of eugenics?
That is WILD! This technology could be put to good use, but corporations are abusing it to build profiles on their users so they can weaponize the data.
Random photo I had saved (of a Da-Brim cycling accessory):
Umm…
TBF, that lawn does look like cardboard.
That’s what wild boars do to it.
That’s really weird. I wonder what happened there, whether the image was somehow corrupt or something.
Well, was the picture at least taken on a Xiaomi? Or is the AI hallucinating metadata now too?
No, it reads the metadata alright!
That’s an interesting detail!
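For what it’s worth, the device name almost certainly comes from the EXIF metadata embedded in the image file (the Make and Model tags), not from the pixels themselves. Here’s a minimal sketch of reading those tags with Pillow; the file name is hypothetical.

```python
# Minimal sketch: camera make/model, timestamps, and software usually live in
# EXIF tags inside the file itself, which is what the model appears to echo.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("uploaded_photo.jpg")  # hypothetical file name
exif = img.getexif()

for tag_id, value in exif.items():
    tag = TAGS.get(tag_id, tag_id)
    if tag in ("Make", "Model", "DateTime", "Software"):
        print(f"{tag}: {value}")  # e.g. Make: Xiaomi
```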
I’m not surprised this tagging system is imperfect, but the broader context – that a company like Google probably has something a hundred times more powerful and more accurate, and that it’s scanning through people’s whole photo libraries – really adds to the creepy factor.
I threw a few images through the website. Yikes, some of the terms it uses.