Marian Blanco Ruiz
Lecturer in Audiovisual Communication and Advertising, coordinator of the Advertising and Public Relations strand of the Communication Sciences doctoral programme
This study on age and gender distortion in images and language models, published in Nature, reinforces what feminist research has been pointing out for decades: technology is not neutral; it reproduces and even amplifies pre-existing cultural gender stereotypes and roles. The finding that women are represented as younger than men in prestigious occupations reflects a long-standing cultural pattern linked to what Laura Mulvey called the 'male gaze'.
These results also confirm what various feminist technoscience studies have noted: algorithms learn from a biased cultural archive, organised around hierarchies of gender, race and class, among others. The concern is that, once incorporated into automated systems with great social authority, such biases are no longer merely symbolic but become discrimination with real effects on people's daily lives, for example in access to medical coverage, rental housing or employment. It is precisely this practical dimension that constitutes the central contribution of the evidence presented here. The article shows that women are not only represented as younger but also evaluated as less competent than men, a finding that confirms the 'Jennifer and John effect' remains very much present in AI development.
These results, however, are limited by the research method itself: the models analysed are also shaped by the biases of the tools used to study them. It would be worthwhile to complement this study with a qualitative analysis that incorporates a critical, intersectional perspective, since gender biases are intertwined with other axes of exclusion, such as class or race, and affect different groups unequally. Beyond describing these dynamics of inequality, the article demonstrates that the urgent challenge is to design strategies that question the cultural assumptions on which artificial intelligence models are trained and that allow more equitable and inclusive digital infrastructures to be built.