Author(s) of the reactions

Pablo Haya Coll

Researcher at the Computational Linguistics Laboratory of the Autonomous University of Madrid (UAM) and director of Business & Language Analytics (BLA) at the Institute of Knowledge Engineering (IIC)

For centuries women have been excluded from the public sphere. It was not until after the French Revolution that this situation began to be reversed, although its consequences still echo today. The article adds further evidence of how this asymmetry has shaped language, producing a stronger association of the concept 'person' with the concept 'man' than with the concept 'woman'.

The ability to detect gender bias computationally serves as a warning for the development of Artificial Intelligence systems. These systems learn from data, transferring differences in language use into the model's predictions. Depending on the application, they may contravene ethical and legal principles, which requires the inclusion of mitigation measures. It should be noted, however, that there are applications where biases can be beneficial, such as detecting diseases whose prevalence differs by gender.
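As a point of reference, the kind of asymmetry described above is often quantified by comparing how close word representations sit in an embedding space. The sketch below is a minimal illustration of that general idea, not the method used in the paper: it assumes pretrained GloVe vectors loaded through gensim (the specific model name is an assumption) and simply compares the cosine similarity of 'person' with 'man' and with 'woman'.

# Minimal sketch (not the paper's method): compare how strongly 'person'
# is associated with 'man' vs. 'woman' in a pretrained embedding space.
# Assumes gensim and the 'glove-wiki-gigaword-50' vectors (downloaded on first use).
import gensim.downloader as api

wv = api.load("glove-wiki-gigaword-50")  # pretrained GloVe word vectors

sim_man = wv.similarity("person", "man")      # cosine similarity person-man
sim_woman = wv.similarity("person", "woman")  # cosine similarity person-woman

print(f"person-man:   {sim_man:.3f}")
print(f"person-woman: {sim_woman:.3f}")
print(f"asymmetry (man - woman): {sim_man - sim_woman:+.3f}")

A positive asymmetry would indicate that, in this particular embedding space, 'person' sits closer to 'man' than to 'woman'; the magnitude and sign depend on the corpus and model chosen.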

This work demonstrates the potential of computational and statistical techniques to analyse huge volumes of text. However, the paper's conclusions need to be explored further. Large volumes of data do not guarantee representativeness. It would be interesting to extend the time horizon to understand whether the biases detected are increasing or decreasing. Finally, it would be worth moving from an analysis based on correlations to models that capture causal relationships.

In particular, the relationship between language and thought remains a controversial association that is still under discussion.
