Hi AI ethics enthusiasts,
When talking about bias in AI, people typically focus on how often gender, racial, and other minorities appear in AI-generated images. But the problem goes much further than that.
A recent study (Ghosh and Caliskan, 2023) uncovered alarming biases in the representation of “person” in Stable Diffusion, one of the most popular AI image generators.
These results mean that, without proper care, the growing use of generative AI will likely spread stereotypes even when generating seemingly innocuous content.
I’ll review this research and explain its significance.
For dessert, there's an AI-generated take on this post waiting at the end!
Quick Summary
The researchers generated a large set of images using Stable Diffusion (a quick generation sketch follows the list) and revealed the following concerning biases:
In AI-generated images, a "person" most closely resembles a light-skinned Western man.
Women in AI-generated images are sexualized, and women from Latin American countries are sexualized far more than others.
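To make the setup concrete, here is a minimal sketch of the kind of prompt-to-image generation such a study relies on, assuming the Hugging Face diffusers library and the publicly available Stable Diffusion v1.5 checkpoint. The checkpoint, prompt, and sampling settings are illustrative, not necessarily the paper's exact configuration:

```python
# Minimal sketch: generate an image from an unmarked "person" prompt
# with Stable Diffusion. Assumes the Hugging Face `diffusers` library
# and a CUDA-capable GPU; checkpoint and prompt are illustrative only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# A deliberately generic prompt: the question is what "person" defaults to.
image = pipe("a photo of a person").images[0]
image.save("person.png")
```

Running a prompt like this many times and inspecting who shows up (skin tone, gender presentation, dress) is, in essence, how one probes what the model treats as the default "person."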
Now, let's dive deeper into the details.