Feminist AI Papers, Prototypes & Pilots

< Incubating Feminist AI >
From paper to prototype

We are so incredibly proud of our <A+> Alliance’s f<a+i>r Feminist AI Research Network’s third cohort of projects, which includes papers and now a series of prototypes that emanate from applied research funded by the IDRC.

f<A+I>r’s aim is to support the skill and imagination of Global South/Majority World feminists in producing effective, innovative, interdisciplinary models that harness emerging technologies to correct for real-life bias and barriers to women’s rights, representation and equality.

 

Each project goes through three stages of development: paper, prototype and pilot.

Pilot

Aymur AI | Argentina

 

Measuring Gender Based Violence in Latin America

Gender-based violence takes many forms: physical, psychological and economic, among others. While the harms are known, the impact of GBV has not yet been adequately measured. The absence of this data in Latin America has led, in part, to the championing of open data initiatives in the judiciary, with the aim of understanding GBV from a judicial perspective and fostering a more transparent, innovative and accountable judiciary. AymurAI is the first pilot project, developed by Data Género in Argentina, initially as a paper for the f<a+i>r network, “Feminisms in Artificial Intelligence: Automation Tools Towards A Feminist Judiciary Reform in Argentina and Mexico”, and then as a successful “Prototype for an open and gender-sensitive justice in Latin America – AymurAI”. Once gathered, this data could be used in machine learning to identify the patterns of violence that might ultimately lead to feminicide, and then to inform policy and potential remedies to prevent the violence and deaths.

Prototypes

SafeHer | Philippines

AI-(Em)powered Mobility of Women – Socio-cultural, Psychological, Personal, and Spatial Factors to Urban Transit Safety: Informing AI-Driven Filipino Women Safety Apps

Manila has one of the world’s most dangerous transport systems for women due to harassment and sexual assault. AI can be used to supplement efforts to make transit safer for women, providing an avenue to contact the authorities, get in touch with emergency contacts and be alerted about unsafe areas or modes of transportation.

This prototype was developed from the paper “AI (em)powered Mobility”, which focused on the transport systems of Metro Manila, widely regarded as dangerous and unsafe for women. To address this, a number of machine learning applications powered by Artificial Intelligence (AI) have been created to help ensure women’s safety. These safety apps, however, do not tackle the underlying issue of perpetrators’ violence against women. Rather than empowering women to take full control of their mobility, these older apps normalize violence and reinforce victim-blaming mentalities. To empower Filipino women, and to make sure these apps are what women need and want, the frameworks for future AI-driven safety apps must be rethought. SafeHer, in its prototype phase, is doing just that.

SOF+IA | Chile

Digital Gendered Violence in Chile: Development of a system for reporting and response-orientation based on a feminist chatbot prototype

This prototype is a reporting and guidance information system for situations of technology-facilitated gender-based violence (TFGBV) in Chile. It is based on a conversational agent (a ‘chatbot’) built on feminist principles that consider ethical issues and put at the center of the prototype the needs and context of women exercising their right to freedom of expression and opinion on social networking platforms, particularly women with a public voice: activists, academics, women involved in politics, and others who live through these situations daily. “SOF+IA” was named through a public consultation on social media; it stands for “Sistema de Oída Feminista” in Spanish, which could be translated into English as “Feminist Hearing System”.
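As a very rough illustration of what a reporting-and-guidance conversational agent involves, the sketch below routes an incoming message to a response path by matching keywords to intents. The intents, keywords and responses here are invented for illustration and do not reflect SOF+IA’s actual conversation design.

```python
# Minimal sketch of keyword-based intent routing for a reporting/guidance
# chatbot (illustrative only; not SOF+IA's actual design).
INTENTS = {
    "report":   ["report", "harassment", "threat", "doxxing"],
    "guidance": ["advice", "help", "resources", "what can i do"],
    "escalate": ["danger", "urgent", "emergency"],
}

RESPONSES = {
    "report":   "I can walk you through documenting what happened, step by step.",
    "guidance": "Here are organizations and resources that support women facing online violence.",
    "escalate": "This sounds urgent. Please consider contacting local emergency services now.",
    "fallback": "I'm here to listen. Could you tell me a bit more about what is going on?",
}

def route(message: str) -> str:
    """Pick the first intent whose keywords appear in the message."""
    text = message.lower()
    for intent, keywords in INTENTS.items():
        if any(keyword in text for keyword in keywords):
            return RESPONSES[intent]
    return RESPONSES["fallback"]

print(route("I want to report harassment on a social network"))
```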




SOF+IA Website  

 

La Independiente | Chile

Gender perspectives in AI crowd work

How can AI crowd work become more feminist and fair?

In order to function, algorithms require millions of labeled training examples to learn, recognize and categorize information. This labeling is often done by crowd workers, who are neglected in system designs and whose well-being is seldom considered. Latin American and Caribbean workers have been identified as significant contributors to crowd work, many depending on it as a steady source of income. However, most studies on crowd work center on Western women and fail to take into account the personal and professional advancement of Latin American women.

To facilitate communication between these women crowd workers and help them identify relevant conversations, an intelligent conversational agent was developed to emulate the personality of Latin American heroines and assist users in searching for specific advice, articulating their interests, and navigating the platform. Furthermore, the conversational agent will recommend other crowd-working women who might be valuable connections based on shared interests, expertise, and experiences.
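As a loose illustration of how interest-based peer recommendation might work, the sketch below ranks other crowd workers by the overlap of their declared interests using Jaccard similarity. The worker profiles and interest tags are hypothetical and are not drawn from the project’s actual design.

```python
# Minimal sketch of interest-based peer recommendation (hypothetical, not the
# project's implementation): rank other crowd workers by how much their
# declared interests overlap with the current user's.
profiles = {
    "ana":   {"data labeling", "image annotation", "fair pay"},
    "lucia": {"image annotation", "childcare schedules", "fair pay"},
    "marta": {"audio transcription", "upskilling", "fair pay"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap of two interest sets: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend_peers(user: str, top_n: int = 2):
    """Return the top_n other workers with the most shared interests."""
    scores = [
        (other, jaccard(profiles[user], interests))
        for other, interests in profiles.items() if other != user
    ]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)[:top_n]

print(recommend_peers("ana"))  # e.g. [('lucia', 0.5), ('marta', 0.2)]
```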

E.D.I.A | Argentina

A tool to overcome technical barriers for bias assessment in Human Language Technologies

There are currently many tools and techniques to detect and mitigate biases in word embeddings, but they present significant barriers to the engagement of people without technical skills. Most experts in bias, whether social scientists or people with deep knowledge of the contexts where bias is harmful, do not have such skills and so cannot take part in the process of bias detection.

Via Libre, a non-profit civil organization based in Córdoba, Argentina, developed this tool together with the National University of Córdoba’s Faculty of Mathematics, Astronomy, Physics and Computing. It is specifically aimed at lowering technical barriers and giving exploratory power to the experts, scientists (and people in general!) who are interested in and willing to audit these technologies.
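To give a rough sense of the kind of exploration such a tool makes accessible without programming, the sketch below probes a word-embedding space for gender association by comparing a target word’s cosine similarity to two small lists of gendered terms. The vocabulary, word lists and scoring function are illustrative assumptions, not E.D.I.A’s actual implementation.

```python
# Minimal sketch of a word-embedding gender-association probe (illustrative;
# not E.D.I.A's code). `embeddings` stands in for pre-trained vectors such as
# word2vec or fastText; a tiny random vocabulary is used so the snippet runs
# on its own.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["she", "her", "woman", "he", "him", "man", "nurse", "engineer"]
embeddings = {word: rng.normal(size=50) for word in VOCAB}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def gender_association(word, female_terms, male_terms):
    """Mean similarity to female terms minus mean similarity to male terms.
    With real embeddings, a positive score suggests the word sits closer to
    the 'female' region of the space, a negative score closer to the 'male' one."""
    vector = embeddings[word]
    to_female = np.mean([cosine(vector, embeddings[t]) for t in female_terms])
    to_male = np.mean([cosine(vector, embeddings[t]) for t in male_terms])
    return to_female - to_male

for target in ["nurse", "engineer"]:
    score = gender_association(target, ["she", "her", "woman"], ["he", "him", "man"])
    print(f"{target}: {score:+.3f}")
```

With real pre-trained vectors, stereotypically gendered occupations tend to show consistently signed scores; surfacing such patterns through an interface rather than a script is exactly the barrier-lowering the project aims at.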

Feminist AI Framework | Chile

Towards a feminist framework for AI development: From Principles to Practice

This is a practical approach, from a feminist perspective located in Latin America, to the development of Artificial Intelligence (AI). It asks whether it is possible to develop AI that does not reproduce logics of oppression. To answer, the authors focus on the power relations embedded in the field of AI and make an interpretative analysis of the day-to-day experiences of seven women working in AI or data science in the region, in dialogue with various statements of feminist principles and guidelines for the development and deployment of digital technologies. The prototype turns the paper into a living document and a new methodology: workshops will deepen the basic guide of questions from the initial paper and test development practices with projects that are actively in development.

To read a one-pager about the project:
https://cloud.piratea.me/s/Rs9Hk9AkXTjLDky

To read the full paper:
https://www.derechosdigitales.org/wp-content/uploads/Fair_Doc_Eng.pdf

To read the full paper (Spanish version):
https://www.derechosdigitales.org/wp-content/uploads/Fair_Doc_Esp.pdf

To watch a short explainer: https://www.youtube.com/shorts/5pMd6ys22yU

 

Papers

PAPER: IdeasGym / Egypt
> Explainable AI-Based Tutoring System for Upper Egypt Community Schools 

 

PAPER: Point of View & Digital Futures Lab / India
> Reimagining Automated Violence Interventions through Participatory Technology Design Feminist AI 

 

PAPER: DataLat / Ecuador
> Analyzing Public Procurement Anomalies in Ecuador with a Gender Perspective 

 

PAPER: Coding Rights / Brazil
> Compost engineers and their slow knowledge

PAPER: Child marriage in Indonesia
> An exploration of how AI and a Digital Public Good for Gender Justice could contribute to better outcomes for girls

 

PAPER: University of Chile / Chile
> Feminist NLP: An Annotated Corpus to Evaluate Sex Differences in Work Related Diseases in Chile