We need Affirmative Action for Algorithms to correct the real-life biases and barriers that prevent women from achieving their full participation and rights, both in the present and in the future we invent.
Since 2019, we have called on Governments, the Private Sector, and Civil Society Organizations to advocate for and adopt guidelines that establish accountability and transparency for algorithmic decision-making (ADM) in both the public and private sectors.
This means we must ensure that machine learning does not embed an already biased system into our futures. It is everyone’s responsibility to bring forward a positive agenda that advances the values of equality we have long embraced, and that corrects for the visibility, quality, and influence of women in proportion to their share of the population.
We continue to call for:
Algorithmic equitable actions to correct for the real-life biases and barriers that prevent women and girls from achieving full participation and equal enjoyment of their rights.
Public institutions to Pilot and Lead: Affirmative Action for Algorithms should be deployed when public institutions pilot ADM. Base these pilots on longstanding and new social science research, allocating social incentives, subsidies, or scholarships where women have traditionally been left behind by prior systems.
Public and private sector uptake of Algorithmic Impact Assessments (AIA): A self-assessment framework designed to respect the public’s right to know about the AI systems that impact their lives, grounded in principles of accountability and fairness.
Rigorous testing across the lifecycle of AI systems: Testing should account for the origins and use of training data, test data, models, Application Programming Interfaces (APIs), and other components over a product’s life cycle. Testing should cover pre-release trials, independent auditing, certification, and ongoing monitoring for bias and other harms (a minimal illustrative sketch of one such disaggregated bias check follows this list). ADM should improve the quality of, not control, the human experience.
Strong legal frameworks to promote accountability: Including the potential expansion of powers for sector-specific agencies, or the creation of new terms of reference, to oversee, audit, and monitor ADM systems, establishing regulatory oversight of and legal liability for both the private and public sectors.
Gender-responsive procurement guidelines: Organizations and all levels of government should develop ADM gender-equality procurement guidelines with hard targets, and outline the roles and responsibilities of the organizations required to apply these principles.
Improved datasets – open gender-disaggregated data, data collection, and inclusive, quality datasets: Actively produce open gender-disaggregated datasets; this enables a better understanding of the sources of bias in AI and ultimately improves the performance of machine learning systems. Invest in controls to oversee data collection processes and in human-in-the-loop verification, so that data is not collected at the expense of women and other traditionally excluded groups. Engage in more inclusive data collection processes that focus not only on the quantity but also on the quality of datasets.
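To make the calls above for bias testing and gender-disaggregated data more concrete, here is a minimal illustrative sketch in Python, not a prescribed method: it assumes a hypothetical evaluation table with columns named gender, label, and prediction, and uses a simple demographic-parity-style gap in selection rates, with an arbitrary 0.1 threshold, to flag disparities between groups.

```python
# Illustrative sketch only; column names, threshold, and metric choice are assumptions.
import pandas as pd

def disaggregated_bias_report(df: pd.DataFrame,
                              group_col: str = "gender",
                              label_col: str = "label",
                              pred_col: str = "prediction",
                              max_gap: float = 0.1) -> dict:
    """Compare selection rates and accuracy across gender groups and flag large gaps."""
    groups = {}
    for group, rows in df.groupby(group_col):
        groups[group] = {
            "n": len(rows),
            # Share of positive decisions the system made for this group.
            "selection_rate": rows[pred_col].mean(),
            # Agreement with ground-truth labels for this group.
            "accuracy": (rows[pred_col] == rows[label_col]).mean(),
        }
    rates = [g["selection_rate"] for g in groups.values()]
    gap = max(rates) - min(rates)  # demographic-parity-style difference
    return {"groups": groups, "selection_rate_gap": gap, "flagged": gap > max_gap}

if __name__ == "__main__":
    # Tiny made-up evaluation set; real audits need representative, quality data.
    data = pd.DataFrame({
        "gender":     ["F", "F", "F", "M", "M", "M"],
        "label":      [1, 0, 1, 1, 0, 1],
        "prediction": [0, 0, 1, 1, 0, 1],
    })
    print(disaggregated_bias_report(data))
```

A check like this is only possible when evaluation data is collected and reported in gender-disaggregated form, which is exactly why the call for improved, disaggregated datasets matters.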
Follow the conversation on X, IG & LinkedIn: #TheTimeHasCome #InclusiveAlgorithms