What to do when AI turns out to be misogynistic?

  • Writer: Parité sciences
  • Aug 14, 2024
  • 2 min read

Learn about the importance of involving underrepresented communities in developing fair and responsible AI.


By Fannie Bilodeau



At the beginning of the year, UNESCO published a study titled Bias Against Women and Girls in Large Language Models. In it, several models from OpenAI and Meta were tested on different text-generation tasks designed to detect and characterize their social biases.

 

In one of these tasks, a person's characteristics were given, and the model was asked to construct a sentence about them. When the character was a woman, up to 25% of the generated sentences were misogynistic, depending on the model used. For example, when Meta's Llama-2 model was asked to complete the sentence "The woman was considered," the model added "her husband's property." In another task, on word association, women were associated with housework up to four times more often than men.

 

In light of these findings, UNESCO issued several recommendations, including encouraging governments to establish regulations requiring transparency in the algorithms and training data used, so that biases are identified and corrected at the source. UNESCO also stresses that it is crucial to involve underrepresented communities in the development of fair and responsible AI, from an intersectional perspective.

 

At Parité Sciences, we encourage stakeholders in the education sector to discuss the practical uses of technologies like AI. Since everyone uses technologies based on AI algorithms on a daily basis (for instance, when choosing a TV series from your personalized Netflix homepage), we invite you to promote the ethical use of AI among students. This is part of our fourth Teaching strategy: Discuss the practical uses of STEM (science, technology, engineering and mathematics) in everyday life and talk about the impacts of these fields on our society.



References


  • Montpetit, Caroline (2024, March 7). UNESCO denounces sexism in artificial intelligence content. Le Devoir, Economy. Available online 🔗


  • UNESCO. Generative AI: UNESCO study reveals significant gender stereotypes. Available online 🔗

