Incubating Feminist AI: <From Paper to Prototype to Pilot>
Systemic gender, racial, and intersectional bias sits at the core of current Artificial Intelligence (AI) and Algorithmic Decision-Making (ADM) processes born in the North and replicated in the South. Combating and correcting this hardwired bias and discrimination is urgent so that the pro-social capabilities of AI and ADM can be activated. We can either seize this moment to correct bias in the digital realm, just as we tackle bias in the analog world, or condemn ourselves to old bias hardwired into a future century of Algorithmic Decision-Making trained by machine learning on biased data sets. The f<a+i>r (feminist AI research) network is dedicated to finding ways to make Artificial Intelligence and related technologies more effective, inclusive, and transformational, not merely more ‘efficient’.
Our three-year initiative (2021–2024) will take advantage of the opportunity not only to address bias, but to rethink how we prototype and pilot applied AI research and solutions so that the technology meets the needs of women and girls in vulnerable communities. This project will be one of the first of its kind to support research that addresses policy, practice, and innovation holistically, and we invite innovators, like-minded governments, and others to join our emerging coalition as the work commences, particularly those in low- and middle-income countries who want to explore the positive potential of artificial intelligence in advancing gender equality. We believe that now is the time to connect the work feminist researchers envision for an inclusive and thriving future.
Feminist AI = Algorithmic Decision-Making Systems and Artificial Intelligence harnessed to deliver equality outcomes, designed with inclusion at the core, creating new opportunities and proactive, innovative corrections of inequities.
The Call for Proposals opens 15 September and closes 31 October. We are looking for social science and algorithmic models that intersect with core public policy agendas in the Global South. Welcome topics include economic and social allocation in areas such as Health, Education, Justice, and Social Protection/Benefits (including Subsidy, Digital ID, Land Tenure/Use, Pensions, Conditional Cash Transfer, and Housing Lotteries); new algorithmic applications for Indigenous, Decolonial, and Traditional models; Geospatial Mapping; Climate Change; and Feminist Data collection or community-driven data stewardship models, including accountable Local, National, or Regional AI Policy or Governance. Click on Our Questions below for more information.