The GRAIN (Gender and Responsible Artificial Intelligence Network) project is launching a survey to assess the level of gender mainstreaming in institutions heavily involved in artificial intelligence in sub-Saharan Africa.
The survey is a crucial step toward identifying current practices, highlighting existing gaps and pinpointing opportunities for improvement, so that AI is developed and used in an ethical and equitable manner.
The GRAIN project is therefore committed to exploring how AI-related structures integrate gender perspectives into their practices, policies and technological advances.
The aim is to identify current strengths and weaknesses in gender mainstreaming in the field of artificial intelligence (AI), to strengthen equity and accountability in the use of AI, and to assess how fully gender perspectives are included and integrated in this field.
More specifically, the survey takes an in-depth look at the distribution of women involved in AI-related activities and analyses their involvement at both strategic and technical levels.
It also explores the factors that motivate women to choose a career in artificial intelligence.
In addition, the survey identifies the obstacles and challenges that prevent women from participating fully in artificial intelligence, along with potential solutions for overcoming them.
To achieve these objectives, the necessary information will be collected from AI stakeholders across sub-Saharan Africa. Participants therefore include all institutions involved or active in the field of AI, covering areas such as governance, funding, research, production of solutions, training, use and promotion.