On 29 January 2025, the GRAIN network organised an online training session on "Artificial Intelligence and Gender" for beneficiaries of the GRAIN project who are also members of the GRAIN network.
The objectives of the training included strengthening participants' understanding of how to integrate gender perspectives into AI initiatives; exploring the role of evidence in designing effective and inclusive AI solutions; and providing practical tools for applying gender-sensitive approaches to AI projects.
Led by Sixtus Onyekwere, a researcher in international development and gender expert at the Centre for the Study of African Economies (CSEA), the training course raised awareness of the importance of integrating the gender dimension into technological innovation processes, particularly AI, and of rethinking the design, deployment and evaluation of AI systems through the prism of equity, inclusion and social justice.
The session gave the gender expert the opportunity to revisit key concepts such as algorithmic bias, intersectionality and the ethics of AI. She emphasised that AI can either reinforce existing inequalities or serve as a lever for positive transformation, depending on how it is designed and applied.
Artificial intelligence is often perceived as objective, yet it inherits biases that may be present in the data, the algorithms or the teams that design it. Gender bias can have concrete consequences, including the exclusion of certain populations, the reinforcement of stereotypes and discriminatory decisions in recruitment, credit and predictive justice algorithms. The trainer shared examples of gender bias in facial recognition, AI-based human resources tools and medical diagnosis, among other areas.
The session also looked at methods and tools for better gender mainstreaming, such as: regularly auditing AI models to detect sexist and racial bias; integrating diversity and inclusion criteria into technology performance indicators; encouraging a diversity of profiles (women, minorities, people from social science backgrounds) in design teams; testing tools with a diverse range of end users in a variety of contexts; collecting representative data that takes gender differences into account; considering the social impact of innovations from the design stage rather than as an afterthought; and implementing methods for anonymising and de-identifying data.
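To make the auditing idea above concrete, here is a minimal sketch of one possible bias check: comparing a model's favourable-decision rates across gender groups (a demographic parity check). The sample data, group labels and the 0.1 review threshold are illustrative assumptions, not material from the training session.

```python
# Minimal sketch of an audit step: checking decisions for demographic
# parity across gender groups. Data and threshold are illustrative.
from collections import defaultdict


def selection_rates(records):
    """Return the favourable-decision rate per group.

    `records` is a list of (group, decision) pairs, where decision is
    1 for a favourable outcome (e.g. shortlisted) and 0 otherwise.
    """
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, decision in records:
        totals[group] += 1
        positives[group] += decision
    return {g: positives[g] / totals[g] for g in totals}


def demographic_parity_gap(records):
    """Largest difference in selection rate between any two groups."""
    rates = selection_rates(records)
    return max(rates.values()) - min(rates.values())


# Illustrative recruitment decisions: (gender, shortlisted?)
decisions = [
    ("women", 1), ("women", 0), ("women", 0), ("women", 0),
    ("men", 1), ("men", 1), ("men", 1), ("men", 0),
]

gap = demographic_parity_gap(decisions)
print(f"selection-rate gap: {gap:.2f}")  # women 0.25 vs men 0.75 -> 0.50
if gap > 0.1:  # illustrative audit threshold
    print("flag for review: possible gender bias")
```

A real audit would of course go further (intersectional groups, confidence intervals, error-rate parity), but even this simple rate comparison can surface the kind of disparity the trainer warned about.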
At the end of the presentation, the discussions gave network members an opportunity to share the difficulties they face in applying these good practices in certain contexts (lack of resources, low awareness, institutional resistance). In their view, it is therefore important to train developers and decision-makers in critical thinking and the gender approach, starting at university or in technical courses.