If it were a linear transformation, probably, because you’d remove the stochastic term. But the transformation is non-linear, so I’d be surprised if that were true. Do you have a reference for a statistically meaningful experiment on this?
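To illustrate why linearity matters here, a minimal sketch (toy numbers, nothing from the thread): under a linear map a zero-mean stochastic term averages out, so the mean of the outputs equals the map applied to the mean of the inputs; under a non-linear map that equality breaks (Jensen's inequality).

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=1.0, size=100_000)  # inputs
    eps = rng.normal(scale=0.5, size=x.size)          # zero-mean stochastic term

    # Linear transformation: the noise averages out, so
    # mean(a*x + b + eps) matches a*mean(x) + b.
    a, b = 3.0, 1.0
    y_lin = a * x + b + eps
    print(y_lin.mean(), a * x.mean() + b)   # both ~7.0

    # Non-linear transformation: the equality breaks, so averages of
    # the outputs are not the transformed averages of the inputs.
    y_nonlin = np.exp(x) + eps
    print(y_nonlin.mean(), np.exp(x.mean()))  # ~12.2 vs ~7.4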
Won’t stop being that guy
It is an unfortunate burden I am condemned to carry
Recognition:
https://odsc.medium.com/the-impact-of-racial-bias-in-facial-recognition-software-36f37113604c
https://venturebeat.com/ai/training-ai-algorithms-on-mostly-smiling-faces-reduces-accuracy-and-introduces-bias-according-to-research/
Generative denoisers and colorization:
https://www.theverge.com/21298762/face-depixelizer-ai-machine-learning-tool-pulse-stylegan-obama-bias
With generative models already used in things like adverts, and soon in Hollywood, this becomes more relevant because it affects representation:
https://towardsdatascience.com/empowering-fairness-recognizing-and-addressing-bias-in-generative-models-1723ce3973aa
This extends to text, where the output more often reproduces styles that are common in the input.
You are linking sources on biases. As I said, that is very different. The Holy Mary is most often depicted as white with blue eyes; that is a bias, inherited from the training data (the models know nothing outside of it).
An average is a different thing: these models do not compute averages, do not output averages, and averages of the output data are not comparable with averages of the input data.
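To make the distinction concrete, a toy sketch (the one-dimensional "model" here is hypothetical, just a fitted two-mode sampler): the mean of a bimodal dataset sits between the modes, and a generative model that has fit the data essentially never emits that average.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy "training data": two sharp modes (think: faces lit from the
    # left vs from the right). The dataset average sits between them.
    data = np.concatenate([rng.normal(-3, 0.1, 5000), rng.normal(3, 0.1, 5000)])
    print(data.mean())  # ~0.0

    # A hypothetical generative model that has fit the two modes: it
    # samples from one mode or the other, never the in-between average.
    def sample_model(n):
        modes = rng.choice([-3.0, 3.0], size=n)
        return rng.normal(modes, 0.1)

    samples = sample_model(5000)
    print(np.abs(samples).min())  # ~2.5+: no sample lands near the data mean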
It was just to clarify the point
Thank you.