I never understood how they were useful in the first place. But that’s beside the point. I assume this is referencing AI, but since you’ve only posted one photo out of apparently four, I don’t really have any idea what you’re posting about.
The point of verification photos is to ensure that nsfw subreddits only host content posted with consent. Many posts were just random nudes someone found, where the subject never agreed to having them posted.
The verification photos show an intention to upload to the sub. A former partner wanting to upload revenge porn would not have access to a verification photo. They often require the paper be crumpled to make it infeasible to photoshop.
If an AI can generate a photorealistic verification picture, it cannot be used to verify anything.
I didn’t realize they originated with verifying nsfw content. I’d only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn’t necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you’re matching the verification against other photos, that makes more sense.
They were used well before the nsfw stuff and the advent of AI.
Back in the day, if you were doing an AMA with a celeb, the picture was proof from the celeb that this was the account they were using. It didn’t need to be their own account, and it was only useful for people with an identifiable face. If you were doing an AMA as some specialist or professional, showing your face and username didn’t prove anything; you needed to provide paperwork to the mods.
This is a poor way to police fake nudes though, I wouldn’t have trusted it even before AI.
It used to be tits or GTFO on /b/.
From now on I’ll have amazing tits.
Was it really that hard to Photoshop well enough to get past mods who aren’t experts at photo forensics?
Probably not, but it would still reduce the amount considerably.
I think it takes a considerable amount of work to photoshop something written on a sheet of paper that has been crumpled up and flattened back out.
If you have experience with the program, it’s piss easy.
However most people do not have experience.
You also have to include the actual person holding something that can be substituted for the paper.
Sort of. You just need roughly the correct position of the elbow/shoulder and to be facing the camera. You can get away with photoshopping in different arms, and most people won’t notice if you do it well.
So you need a guy with such experience on your social engineering team.
It’s mostly about filtering the low-hanging fruit, aka the low effort trolls, repost bots, and random idiots posting revenge porn.
As in most things. I don’t have security cameras to capture video of someone breaking in. I have them so my neighbours house looks like an easier target.
There are a lot of tools to verify whether something was photoshopped… you don’t need to be an expert to use them.
I tried some one day and didn’t find any that are actually easy for a noob. I remember having to check resolution, contrast, spatial frequency disruption, etc., and nothing looked easy to detect without proper training.
i wouldn’t just go around telling people that…
Can you share more? Never had to use one.
You can check whether the effective resolution changes across a video or photo. This can be defeated by downsampling everything to match the lowest-resolution element in the picture, but most people go with what looks best instead of a flat level.
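For what it’s worth, one of the more approachable checks in this family is Error Level Analysis (ELA). Here’s a rough sketch in Python with Pillow; the function name and the quality setting are my own choices for illustration, not from any specific forensics tool:

```python
# Error Level Analysis (ELA) sketch: re-save the image as JPEG at a known
# quality and look at the per-pixel difference against the original.
# Regions pasted in from a different source often have a different
# compression history and recompress differently, so they light up.
import io
from PIL import Image, ImageChops

def ela(img: Image.Image, quality: int = 90) -> Image.Image:
    original = img.convert("RGB")
    # Re-compress in memory at a fixed, known quality
    buf = io.BytesIO()
    original.save(buf, "JPEG", quality=quality)
    buf.seek(0)
    recompressed = Image.open(buf)
    # Absolute per-pixel difference between the original and the re-save
    diff = ImageChops.difference(original, recompressed)
    # Amplify the (usually tiny) differences so they are visible to the eye
    max_diff = max(hi for _, hi in diff.getextrema()) or 1
    return diff.point(lambda px: min(255, px * 255 // max_diff))
```

A uniform, dim result is consistent with a single compression history; a bright rectangle around, say, the verification card is a red flag. It’s a heuristic though, not proof, and it’s easy to fool by recompressing the whole composite.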
I was going to suggest using an artifact overlay to suggest all the images were shot by the same lens on the same camera
On a side note, they are also used all the time for online selling and trading, as a means to verify that the seller is a real person who is in fact in possession of the object they wish to sell.
How does traditional (as in pre-AI) photo verification know the image wasn’t manipulated? In this post the paper is super flat, and I’ve seen many others like it.
From reading the verification rules from /r/gonewild they require the same paper card to be photographed from different angles while being bent slightly.
Photoshopping a card convincingly may be easy. Photoshopping a bent card held at different angles that reads as the same in every image is much more difficult.
That last thing will still be difficult with AI. You can generate one image that looks convincing, but generating multiple images that are consistent? I doubt it.
The paper is real. The person behind it is fake.
Curious how long it’ll be until we start getting AI 3D models of this quality.
I feel like you could do this right now by hand (if you have experience with 3d modelling) once you’ve generated an image. 3d modelling often includes creating a model from references, be they drawn or photographs.
Plus, I just remembered that creating 3d models of everyday objects/people via photos from multiple angles has been a thing for a long time. You can make a setup that uses just your phone and some software to make 3d printable models of real objects. No reason preventing someone from using a series of AI generated images instead of photos they took, so long as you can generate a consistent enough series to get a base model you can do some touch-up by hand to fix anything that the software might’ve messed up. I remember a famous lady in the 3d printing space who I think used this sort of process to make a complete 3d model of her (naked) body, and then sold copies of it on her Patreon or something.
Just ask for multiple photos of the person in the same place. AI has a hard time with temporal coherence, so in each picture the room items will change, the face will change a bit (maybe a lot), hair styles will change, etc.
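A crude automated version of that check: compare supposedly-same-scene photos with a perceptual hash. This is a toy sketch (a plain average hash, with my own function names), not a real forensics pipeline:

```python
# Toy consistency check for two photos that claim to show the same scene.
# Computes a simple "average hash" of each image and counts differing bits;
# genuine photos of one room from similar angles stay close, while
# AI-generated "same room" shots tend to drift much further apart.
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> list[int]:
    # Shrink to a size x size grayscale thumbnail, then threshold
    # each pixel against the mean brightness
    small = img.convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1: list[int], h2: list[int]) -> int:
    # Number of bits where the two hashes disagree
    return sum(a != b for a, b in zip(h1, h2))
```

Identical scenes land within a few bits of each other, but the cutoff is a judgment call, and a small hash like this only catches blatant drift; a determined faker with inpainting tools would get past it.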
I found this singular screenshot floating around elsewhere, but yes r/stablediffusion is for AI images.
I had some trouble figuring out what exactly was going on as well, but the Stable Diffusion subreddit gave away that it was at least AI related, as that’s one of the popular AI programs. It wasn’t until I saw the tag, though, that I really understood: Workflow Included, meaning the person included the steps they used to create the photo in question. Which means the person in the photo was created with the AI program and is fake.
The implications of this sort of stuff are massive too. How long until people are using AI to generate incriminating evidence to get people arrested on false charges, or the opposite - creating false evidence to get away with murder.
Pretty sure it started because nsfw subreddit mods realized they could demand naked pictures of women that nobody else had access to, and it made their little mod become a big mod.
Verification posts go back further than Reddit.
They were used extensively on 4chan, because they were the only way to prove that the person posting was in fact that person. And yes, it was mostly people posting nudes, but it was more that they wanted credit.
The reason it carried on to Reddit was because people were using the accounts to advertise patreon and onlyfans, and mods mostly wanted the people making money off the pictures to be the people who took those pictures.
Also it was useful for AMA posts and other such where a celebrity was involved.
4chan was a bit different in that it was anonymous to begin with, and more to the point, it was self-volunteered verification, not a mod-driven requirement.
As for reddit, mods were requiring private verification photos LONG before patreon and onlyfans even existed in the first place.
AMAs, agreed.
“No no it’s not about consent it’s about someone being horny” is such a bad take… and bad taste.
I hate to break this to you, but there were in fact subreddits that publicly stated they required you to privately DM mods full-body, full-face nudes in poses of the mod’s choice for verification.
That ain’t me being in bad taste, it’s just me making a basic observation. For some subreddits it was about verification, yes. For some it was about consent. For some it was about the mods being horny. And for most, it was some combination of the three.
To pretend that it didn’t happen is… well, casual erasure of sexual misconduct of the mods, frankly.