We must ensure machine learning does not embed an already biased system into our future.
Gender balance in AI decision making: Gender balance should be put on the official agenda of everyone involved in the funding, design, adoption, and evaluation of ADM.
Gender balance in design teams: Employ a robust range of intersectional feminists in the design of ADM systems. This will spur greater innovation and creativity, as well as the detection and mitigation of bias and of harmful effects on women, girls, and those traditionally excluded.
Require companies to proactively disclose and report on gender balance in research and design teams, including upstream, when applying for grants. Incentivize teams that are gender-balanced and multi-disciplinary.
Research fund: Create a research fund to explore the intersecting impacts of gender, AI, machine learning, bias, and fairness. It should take a multi-disciplinary approach that goes beyond the computer science and economics lens to include new ways of embedding digital literacy, and study the economic, political, and social effects of ADM on the lives of women and of those traditionally excluded from rule making and decision taking.
Mass-scale correction of skewed data will require multilateral and international cooperation to ensure we leave no one behind.
A UN agency-wide review of the application of existing international human rights laws and standards to ADM, machine learning, and gender: This can guide and provoke creative thinking toward an approach grounded in human rights that is fit for purpose in the fast-changing digital age.
Development of a set of metrics for digital inclusiveness: These should be urgently agreed, measured worldwide, and detailed with sex-disaggregated data in the annual reports of institutions such as the UN, the International Monetary Fund, the International Telecommunication Union, the World Bank and other multilateral development banks, and the OECD.