A college is removing its vending machines after a student discovered they were using facial recognition technology::A photo shared on Reddit showed one of the vending machines with an error code suggesting it used facial recognition tech.
On the one hand, I can totally understand that there is a difference between recognizing a face and recognizing your face. Algorithms that recognize a face are really easy to implement now (see the sketch at the end of this comment).
On the other hand, though, why should a vending machine need to recognize a face? So it shuts off its lighting when no one is looking at it? I’m not sure there is any practical benefit besides some project manager justifying a new feature with buzzword-compliant tech.
I believe the company when they say there is nothing problematic here, but they deserve the bad press for thinking it would be a good idea in the first place.
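To put “really easy to implement” in perspective, here is a minimal face-detection sketch using OpenCV’s bundled Haar cascade. It is just an illustration of how low the bar for generic face detection is - the camera index is a stand-in, and nobody is claiming this is what the machines actually run:

```python
# Minimal face *detection* (not identification) with OpenCV's bundled
# Haar cascade - roughly the level of tech an "is someone standing here?"
# feature needs.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # stand-in camera index for this sketch
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Counts faces in view; it has no idea whose faces they are.
    print(f"faces in view: {len(faces)}")
cap.release()
```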
Their corporate website mentions that they use the data for marketing purposes. Whatever type of face they see - e.g. male or female, large or skinny, etc. - gets correlated with what was purchased, and then they sell that data for marketing purposes. Exactly like Google selling your search history, except with likely fewer restrictions in place.
Their website doesn’t mention how often they get hacked, giving that data away for free - to be clear, that data being A PICTURE OF YOUR ACTUAL FUCKING FACE. I don’t know what resolution it is, or even what someone would do with it later; my point is that the picture-taking seems nonconsensual, especially if the picture is stored in a database rather than simply used in the moment.
That’s not how this works. The most likely use case is taking a picture of your face, running the algorithm over it (which estimates whether you’re male or female and roughly how old you are), and then throwing the picture away - roughly as in the sketch below. The data actually collected is anonymous, so if that’s what they do, it might even be GDPR compliant in the EU (otherwise they’d be breaking several laws).
There really is no value in keeping a picture of your actual face; it’s just trouble waiting to happen.
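A rough sketch of the pipeline I mean: `estimate_age_gender` is a placeholder for whatever on-device model a vendor might run (I am not claiming this is their code). The point is that the camera frame stays local to one function and only coarse buckets get logged:

```python
from dataclasses import dataclass

@dataclass
class Demographics:
    age_bucket: str   # e.g. "18-24"
    sex_guess: str    # e.g. "male" / "female" / "unknown"

def estimate_age_gender(frame) -> Demographics:
    # Stand-in for a vendor's on-device model; a real implementation
    # would infer these values from the camera frame.
    return Demographics(age_bucket="18-24", sex_guess="unknown")

def log_purchase(frame, product_id: str, events: list) -> None:
    demo = estimate_age_gender(frame)
    # Only the coarse buckets and the product are kept; the frame is a
    # local variable that is discarded when this function returns.
    events.append({
        "product": product_id,
        "age_bucket": demo.age_bucket,
        "sex_guess": demo.sex_guess,
    })

events: list = []
log_purchase(frame="<camera frame>", product_id="cola-355ml", events=events)
print(events)  # no image, no embedding, no identifier - nothing tied back to a person
```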
They claim to be GDPR compliant, and while I’m not in the EU, I think that if that claim is accurate, they can’t be doing any of the things you mention.
My point is, even if we take them at their word that the facial recognition is benign, it was still a dumb choice.
The GDPR only applies in the EU, and this happened in Canada. They may actually be GDPR compliant in Europe, but have they stated whether they follow those laws where they aren’t legally required to?
Most companies that sell worldwide won’t bother developing one set of firmware that is GDPR compliant for the EU and another for the rest of the world, unless there is an explicit business reason to do so. So when they responded to this incident in Canada by pointing to their GDPR status, I took it as implied that they have only one codebase, which is GDPR compliant, and they ship it in Canada - not because they have to, but because it’s all they have.
That assumption is exactly what they’re hoping for, and it’s the problem. They say they adhere to the GDPR, but not that they adhere to it everywhere, regardless of legal requirement. If they did adhere to its requirements everywhere, that would be an easy thing to state.
The article has comments from the manufacturer and from the company that stocks the machine; both state that they don’t take or store pictures, but they are purposely vague about what data they do take and store. I expect that’s because it’s still a creepy level of information about their customer base, and another revenue stream they exploit.
Of note, it’d be pretty easy to push an OTA software update to have it go from recognizing a face to recognizing your face (sketch below).
Then of course linking your card/phone to your face. Maybe you get a text message reminding you that you ate one this time last week and that “you’re not yourself when you’re hungry.”
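To make that concrete, here is a sketch of how small the jump from detection to identification is, using the open-source face_recognition library. Purely illustrative of the point - the known_embeddings lookup keyed by card/phone is hypothetical, not something the vendor is known to do:

```python
# Going from "a face" to "your face": match the face in the frame against
# embeddings saved from earlier purchases (e.g. keyed by the card or phone
# used to pay - hypothetical, for illustration only).
import face_recognition

def identify(frame, known_embeddings: dict):
    """known_embeddings: {customer_id: 128-d face encoding}."""
    encodings = face_recognition.face_encodings(frame)
    if not encodings:
        return None  # nobody in view
    for customer_id, known in known_embeddings.items():
        # compare_faces returns one boolean per known encoding passed in.
        if face_recognition.compare_faces([known], encodings[0])[0]:
            return customer_id  # now it's recognizing *your* face
    return None
```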
They need to recognize a face because they explicitly state in their FAQ they are estimating purchasers’ age and sex. This isn’t just adjusting lighting. I would not be so quick to say there is nothing problematic here. I’m highly skeptical.
And yet, you’re quick to jump to the conclusion that there is something problematic? I don’t really see anything wrong with this. It’s not personal information. It’s demographics.