OpenAI’s Sora, an AI model that generates videos from text instructions, has been examined for how it portrays the LGBTQ community. The investigation found that when asked to envision queer people, AI image and video generators, including Sora, often respond with stereotypical depictions of LGBTQ culture.
Despite recent improvements in image quality, AI-generated images frequently presented a simplistic, whitewashed version of queer life. Lesbian women, for instance, were shown with nose rings and stern expressions; gay men were uniformly fashionable dressers with killer abs; and even basic prompts for trans women produced hypersexualized images, with lingerie outfits and cleavage-focused camera angles.
These depictions reflect the data used to train the underlying machine-learning models. That data is mostly collected by scraping text and images from the web, where depictions of queer people may already reinforce stereotypical assumptions, such as gay men appearing effeminate and lesbian women appearing butch.
The divergence between the many queer people interested in artificial intelligence and how those same people are represented by the tools their industry is building remains a significant issue.
read more > www.wired.com