BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//A+ Alliance - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://aplusalliance.org
X-WR-CALDESC:Events for A+ Alliance
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Helsinki
BEGIN:DAYLIGHT
TZOFFSETFROM:+0200
TZOFFSETTO:+0300
TZNAME:EEST
DTSTART:20230326T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0300
TZOFFSETTO:+0200
TZNAME:EET
DTSTART:20231029T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0200
TZOFFSETTO:+0300
TZNAME:EEST
DTSTART:20240331T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0300
TZOFFSETTO:+0200
TZNAME:EET
DTSTART:20241027T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0200
TZOFFSETTO:+0300
TZNAME:EEST
DTSTART:20250330T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0300
TZOFFSETTO:+0200
TZNAME:EET
DTSTART:20251026T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:Europe/Moscow
BEGIN:STANDARD
TZOFFSETFROM:+0300
TZOFFSETTO:+0300
TZNAME:MSK
DTSTART:20240101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20240331T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20241027T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20250309T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20251102T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20260308T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20261101T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20270314T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20271107T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Helsinki:20241111T160000
DTEND;TZID=Europe/Helsinki:20241111T170000
DTSTAMP:20260404T132640Z
CREATED:20241101T114816Z
LAST-MODIFIED:20241101T114816Z
UID:24555-1731340800-1731344400@aplusalliance.org
SUMMARY:Open Studio: Evaluation of SOF+IA in light of Data Feminism Principles | Daniela Moyano
DESCRIPTION:In this open studio\, Daniela Moyano will speak about Sof+IA: a prototype chatbot designed to report and provide guidance on Digital Gender Violence (DGV) in Chile. This evaluation examines its development through Data Feminism principles\, highlighting how it challenges power structures\, prioritizes survivor needs\, and integrates emotion into technology design. Despite challenges such as limited resources and inclusion gaps\, Sof+IA centers survivors’ experiences and promotes pluralism. Continuous reflection and evaluation are essential for integrating feminist principles into AI development. \nAbout Daniela Moyano: Daniela Moyano is an information designer and data visualization expert with a focus on gender data and feminist perspectives. With a B.A. in Design and an M.Sc. in Sociology\, she has co-founded ODEGI\, an observatory promoting critical gender data use. Daniela has led multiple research and design projects\, including Sof+IA\, a chatbot addressing digital gender violence. Her work blends technology\, social justice\, and feminist theory\, advocating for more inclusive and ethical technological practices. \n \nJoin the Open Studio by joining the AI Equality Community on Circle!
URL:https://aplusalliance.org/event/open-studio-evaluation-of-sofia-in-light-of-data-feminism-principles-daniela-moyano/
CATEGORIES:Eventos A+,Eventos LAC,Eventos virtuales
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2024/11/AIEQ-OPEN-STUDIO_02-4.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Helsinki:20241112T160000
DTEND;TZID=Europe/Helsinki:20241112T170000
DTSTAMP:20260404T132640Z
CREATED:20240909T105930Z
LAST-MODIFIED:20240909T105930Z
UID:24519-1731427200-1731430800@aplusalliance.org
SUMMARY:Open Studio: AI & Health\, thinking through maintenance\, repair and response-ability | Amina Soulimani
DESCRIPTION:This open studio will focus on exploring the multiple ways through which we can understand the entanglement of repair and maintenance with visionary aspirations around smart hospitals. By revisiting earlier and contemporary Feminist STS texts\, new materialism\, and phenomenology\, and leaning on insights from the field\, Amina will share insights emerging from her ongoing PhD research. Presenting Morocco as a case study\, this talk will further explore the dimensions that waiting takes in environments that run on foreign software in local hospitals. \nAbout Amina Soulimani: Amina Soulimani’s research investigates human-technology interactions in oncology care\, and algorithmic infrastructures within hospitals in Morocco. Her work is guided primarily through interdisciplinary practice\, decolonial scholarship\, and critical realism when dreaming about dignified care futures. She is a Doctoral Research Fellow at HUMA\, The Institute for Humanities in Africa\, and a doctoral candidate in Anthropology at the University of Cape Town\, South Africa. \n \nJoin the Open Studio by joining the AI Equality Community on Circle!
URL:https://aplusalliance.org/event/open-studio-ai-health-thinking-through-maintenance-repair-and-response-ability-amina-soulimani/
CATEGORIES:Eventos virtuales
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2024/09/AIEQ-OPEN-STUDIO_02-1.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Helsinki:20241118T160000
DTEND;TZID=Europe/Helsinki:20241118T170000
DTSTAMP:20260404T132640Z
CREATED:20241101T115115Z
LAST-MODIFIED:20241101T115115Z
UID:24556-1731945600-1731949200@aplusalliance.org
SUMMARY:Open Studio: Much Distress and Little Relief | Jamie Fuller
DESCRIPTION:Earlier this year\, predictive optimization was introduced into South Africa’s social grant system to detect and counter fraud. As a result\, thousands (and counting) of the country’s most financially vulnerable have been mistakenly excluded from receiving their grant payment. This open studio unpacks the case study\, explaining what went wrong and reflecting on a more productive use of AI. It concludes that while AI may be most alluring in the resource-constrained realities of developing countries\, it is precisely in such contexts that it poses the greatest threat. \nAbout the speaker: Jamie Fuller is a junior researcher at Research ICT Africa\, a Cape Town-based think tank committed to enabling universal and meaningful digital access across the continent. She holds a Master’s degree in Philosophy and is especially passionate about everyday ethical dilemmas\, including those associated with advanced data-driven technologies.\n \nJoin the Open Studio by joining the AI Equality Community on Circle!
URL:https://aplusalliance.org/event/open-studio-much-distress-and-little-relief-jamie-fuller/
CATEGORIES:Eventos virtuales
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2024/11/AIEQ-OPEN-STUDIO_02-6.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Helsinki:20241202T160000
DTEND;TZID=Europe/Helsinki:20241202T170000
DTSTAMP:20260404T132640Z
CREATED:20241101T115326Z
LAST-MODIFIED:20241108T093056Z
UID:24557-1733155200-1733158800@aplusalliance.org
SUMMARY:Open Studio: Can Bias in LLMs Be Used for Good? | Francesca Lucchini Wortzman
DESCRIPTION:It is a well-known fact that LLMs express harmful biases in their predictions. The main source of this bias is the training datasets\, which are too large and expensive to check thoroughly. In this open studio\, Francesca Lucchini will explore how we can leverage the bias in LLMs and use them to examine massive datasets\, discovering starting points for a data audit. \nAbout the speaker: Francesca Lucchini Wortzman is a Tech Lead at CENIA\, the National Artificial Intelligence Research Center. She holds a computer science degree and a master’s degree from the Pontificia Universidad Católica de Chile and specializes in machine learning applications related to urban analysis and computer vision. Francesca is passionate about gender equality and applied ethics in AI.\n \nJoin the Open Studio by joining the AI Equality Community on Circle!
URL:https://aplusalliance.org/event/open-studio-can-bias-in-llms-be-used-for-good-francesca-lucchini-wortzman/
CATEGORIES:Eventos A+,Eventos LAC,Eventos virtuales
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2024/11/AIEQ-OPEN-STUDIO_02-5.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Moscow:20250908T160000
DTEND;TZID=Europe/Moscow:20250908T170000
DTSTAMP:20260404T132640Z
CREATED:20250828T080700Z
LAST-MODIFIED:20250828T093019Z
UID:24918-1757347200-1757350800@aplusalliance.org
SUMMARY:The African AI & Equality Toolbox Launch
DESCRIPTION:The African AI & Equality Toolbox is a strategic initiative designed to empower African stakeholders—policymakers\, technologists\, civil society actors\, and communities—to shape Artificial Intelligence (AI) systems that are contextually relevant\, inclusive\, and grounded in human rights. \nDeveloped in collaboration with Women at the Table and the African Centre for Technology Studies (ACTS)\, and adapted from the global <AI & Equality> Human Rights Toolbox Initiative in collaboration with the UN Office of the High Commissioner for Human Rights (OHCHR)\, this African iteration provides practical tools and methodologies to guide equitable AI development across the continent. \nThe Toolbox applies a Human Rights-based AI Lifecycle Framework\, integrating reflective questions and the Human Rights Impact Assessment (HRIA) developed with the Alan Turing Institute. It emphasizes participatory\, multidisciplinary approaches and is rooted in feminist\, decolonial\, and Justice\, Equity\, Diversity\, and Inclusion (JEDI) principles and incorporates lessons from emerging digital rights challenges\, ensuring AI systems are designed with safety and dignity at their core. \nThis launch webinar will present the methodology and development process of the toolbox\, feature its contributors\, and introduce the webinar series\, in which each session focuses on a stage of the lifecycle through selected case studies. \n1PM GMT | 3PM SAST | 4PM EAT \nRegister here
URL:https://aplusalliance.org/event/the-african-ai-equality-toolbox-launch/
CATEGORIES:Eventos A+
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2025/08/AI-EQ-Toolbox-black-w-gradient-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Moscow:20250915T160000
DTEND;TZID=Europe/Moscow:20250915T170000
DTSTAMP:20260404T132640Z
CREATED:20250828T081351Z
LAST-MODIFIED:20250828T093411Z
UID:24925-1757952000-1757955600@aplusalliance.org
SUMMARY:The African AI & Equality Toolbox Webinar 1: Introduction & Stage 1
DESCRIPTION:Across African contexts\, development and technology projects are often driven by external actors with little grounding in local priorities. Solutions frequently arrive pre-packaged—built around assumed problems\, rather than those identified by communities themselves. This is particularly evident in AI deployments in agriculture\, education\, and public health\, where tools may miss the mark\, or worse\, exacerbate inequities. \nThis webinar explores Stage 1 of AI development\, a fundamentally participatory and grounded approach that is crucial for centering gender equity\, given that women often carry the brunt of labor in agriculture and caregiving\, yet remain underrepresented in AI design and governance. \nThe African AI & Equality Toolbox is a strategic initiative designed to empower African stakeholders—policymakers\, technologists\, civil society actors\, and communities—to shape Artificial Intelligence (AI) systems that are contextually relevant\, inclusive\, and grounded in human rights. \nDeveloped by Women at the Table and the African Centre for Technology Studies (ACTS)\, and adapted from the global AI & Equality Human Rights Toolbox Initiative in collaboration with the UN Office of the High Commissioner for Human Rights (OHCHR)\, this African iteration provides practical tools and methodologies to guide equitable AI development across the continent. \nThe Toolbox applies a Human Rights-based AI Lifecycle Framework\, integrating reflective questions and the Human Rights Impact Assessment (HRIA) developed with the Alan Turing Institute. It emphasizes participatory\, multidisciplinary approaches and is rooted in feminist\, decolonial\, and Justice\, Equity\, Diversity\, and Inclusion (JEDI) principles and incorporates lessons from emerging digital rights challenges\, ensuring AI systems are designed with safety and dignity at their core. \n1PM GMT | 3PM SAST | 4PM EAT \nRegister here
URL:https://aplusalliance.org/event/the-african-ai-equality-toolbox-webinar-1-introduction-stage-1/
CATEGORIES:Eventos A+
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2025/08/AI-EQ-Toolbox-black-w-gradient-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Moscow:20251006T160000
DTEND;TZID=Europe/Moscow:20251006T170000
DTSTAMP:20260404T132640Z
CREATED:20250828T093812Z
LAST-MODIFIED:20250828T103135Z
UID:24934-1759766400-1759770000@aplusalliance.org
SUMMARY:The African AI & Equality Toolbox Webinar 2: System Requirements
DESCRIPTION:In many African AI deployments\, system requirements are defined by international technical partners or funders\, often without fully understanding the day-to-day realities of use. This leads to design choices—like requiring high-speed internet\, English-only interfaces\, or complex interfaces—that make tools ineffective or even harmful. \nThis webinar looks at how requirement setting should function as a bridge between vision and use: Aligning system features with cultural context\, infrastructure gaps\, and social expectations; Identifying constraints early on—connectivity\, literacy\, consent\, power dynamics—and building around them and making conscious trade-offs between speed\, scale\, and equity. \n  \nThe African AI & Equality Toolbox is a strategic initiative designed to empower African stakeholders—policymakers\, technologists\, civil society actors\, and communities—to shape Artificial Intelligence (AI) systems that are contextually relevant\, inclusive\, and grounded in human rights. \nDeveloped by Women at the Table and the African Centre for Technology Studies (ACTS)\, and adapted from the global AI & Equality Human Rights Toolbox Initiative in collaboration with the UN Office of the High Commissioner for Human Rights (OHCHR)\, this African iteration provides practical tools and methodologies to guide equitable AI development across the continent. \nThe Toolbox applies a Human Rights-based AI Lifecycle Framework\, integrating reflective questions and the Human Rights Impact Assessment (HRIA) developed with the Alan Turing Institute. It emphasizes participatory\, multidisciplinary approaches and is rooted in feminist\, decolonial\, and Justice\, Equity\, Diversity\, and Inclusion (JEDI) principles and incorporates lessons from emerging digital rights challenges\, ensuring AI systems are designed with safety and dignity at their core. \n1PM GMT | 3PM SAST | 4PM EAT \nRegister here \n  \n 
URL:https://aplusalliance.org/event/the-african-ai-equality-toolbox-webinar-2-system-requirements-2/
CATEGORIES:Eventos A+
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2025/08/AI-EQ-Toolbox-black-w-gradient-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Moscow:20251016T150000
DTEND;TZID=Europe/Moscow:20251016T160000
DTSTAMP:20260404T132640Z
CREATED:20250828T101850Z
LAST-MODIFIED:20250828T102042Z
UID:24945-1760626800-1760630400@aplusalliance.org
SUMMARY:The Latin American AI & Equality Toolbox Introduction
DESCRIPTION:AI & Equality partnered with the Chilean Centro Nacional de Inteligencia Artificial\, CENIA\, to co-construct a Latin American Spanish-language version of the validated <AI & Equality> Toolbox\, with use cases relevant to the regional experience.\n\nThe partnership builds on the learnings from the workshop structure and outreach of the African <AI & Equality> Toolbox.\n\nTranslating and adapting tools like the <AI & Equality> Human Rights Toolbox can facilitate the co-creation of a common vocabulary and a common understanding of the specific needs and challenges of each region of the world\, unlocking informed debates about their visions for data with purpose and collaboratively finding examples of regional use cases of AI.\n\n3PM ET | 10AM Santiago \nRegister here
URL:https://aplusalliance.org/event/the-latin-american-ai-equality-toolbox-introduction/
CATEGORIES:Eventos A+
ATTACH;FMTTYPE=image/png:https://aplusalliance.org/wp-content/uploads/2025/08/Untitled-design-1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Moscow:20251020T160000
DTEND;TZID=Europe/Moscow:20251020T170000
DTSTAMP:20260404T132640Z
CREATED:20250828T093502Z
LAST-MODIFIED:20250828T094547Z
UID:24930-1760976000-1760979600@aplusalliance.org
SUMMARY:The African AI & Equality Toolbox Webinar 3: Data Discovery
DESCRIPTION:In many African AI projects\, available datasets are either imported (trained on non-African populations) or incomplete (lacking local language\, gender\, or cultural nuance). This misalignment risks perpetuating systemic bias under the guise of neutrality. This webinar will offer a deep dive into the Essential Questions of Data Discovery\, including a case study on the requirements for building effective TFGBV prevention datasets that include African languages. \nThe African AI & Equality Toolbox is a strategic initiative designed to empower African stakeholders—policymakers\, technologists\, civil society actors\, and communities—to shape Artificial Intelligence (AI) systems that are contextually relevant\, inclusive\, and grounded in human rights. \nDeveloped by Women at the Table and the African Centre for Technology Studies (ACTS)\, and adapted from the global AI & Equality Human Rights Toolbox Initiative in collaboration with the UN Office of the High Commissioner for Human Rights (OHCHR)\, this African iteration provides practical tools and methodologies to guide equitable AI development across the continent. \nThe Toolbox applies a Human Rights-based AI Lifecycle Framework\, integrating reflective questions and the Human Rights Impact Assessment (HRIA) developed with the Alan Turing Institute. It emphasizes participatory\, multidisciplinary approaches and is rooted in feminist\, decolonial\, and Justice\, Equity\, Diversity\, and Inclusion (JEDI) principles and incorporates lessons from emerging digital rights challenges\, ensuring AI systems are designed with safety and dignity at their core. \n1PM GMT | 3PM SAST | 4PM EAT \nRegister here
URL:https://aplusalliance.org/event/the-african-ai-equality-toolbox-webinar-2-system-requirements/
CATEGORIES:Eventos A+
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2025/08/AI-EQ-Toolbox-black-w-gradient-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Moscow:20251105T160000
DTEND;TZID=Europe/Moscow:20251105T170000
DTSTAMP:20260404T132640Z
CREATED:20250828T093940Z
LAST-MODIFIED:20250828T103053Z
UID:24937-1762358400-1762362000@aplusalliance.org
SUMMARY:The African AI & Equality Toolbox Webinar 4: Model Selection
DESCRIPTION:Imported or generalized models often underperform—especially when they are trained on data that does not reflect local language\, environment\, or lived experience. For AI to be trustworthy\, accuracy alone is not enough. \nThis webinar dives into what inclusion and efficiency mean: building systems that don’t require technical expertise to interpret\, so that trust\, oversight\, and agency are accessible to all users. Whether a rural health worker\, a student\, or a community organizer\, each person should be able to understand what a system is doing and why. \nThe African AI & Equality Toolbox is a strategic initiative designed to empower African stakeholders—policymakers\, technologists\, civil society actors\, and communities—to shape Artificial Intelligence (AI) systems that are contextually relevant\, inclusive\, and grounded in human rights. \nDeveloped by Women at the Table and the African Centre for Technology Studies (ACTS)\, and adapted from the global AI & Equality Human Rights Toolbox Initiative in collaboration with the UN Office of the High Commissioner for Human Rights (OHCHR)\, this African iteration provides practical tools and methodologies to guide equitable AI development across the continent. \nThe Toolbox applies a Human Rights-based AI Lifecycle Framework\, integrating reflective questions and the Human Rights Impact Assessment (HRIA) developed with the Alan Turing Institute. It emphasizes participatory\, multidisciplinary approaches and is rooted in feminist\, decolonial\, and Justice\, Equity\, Diversity\, and Inclusion (JEDI) principles and incorporates lessons from emerging digital rights challenges\, ensuring AI systems are designed with safety and dignity at their core. \n1PM GMT | 3PM SAST | 4PM EAT \nRegister here
URL:https://aplusalliance.org/event/the-african-ai-equality-toolbox-webinar-4-model-selection/
CATEGORIES:Eventos A+
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2025/08/AI-EQ-Toolbox-black-w-gradient-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Moscow:20251112T160000
DTEND;TZID=Europe/Moscow:20251112T170000
DTSTAMP:20260404T132640Z
CREATED:20250828T094050Z
LAST-MODIFIED:20250828T103014Z
UID:24939-1762963200-1762966800@aplusalliance.org
SUMMARY:The African AI & Equality Toolbox Webinar 5: Model Interpretation
DESCRIPTION:In African deployments\, there is often pressure to launch rapidly\, without thorough contextual testing. But skipping this step is where trust breaks down—and harm begins. Testing must happen with communities\, not just on them. \nThis stage examines the opportunity to reflect on how power operates in AI: Who gets to say if it works? Who can question it? Who can stop it? \nThe African AI & Equality Toolbox is a strategic initiative designed to empower African stakeholders—policymakers\, technologists\, civil society actors\, and communities—to shape Artificial Intelligence (AI) systems that are contextually relevant\, inclusive\, and grounded in human rights. \nDeveloped by Women at the Table and the African Centre for Technology Studies (ACTS)\, and adapted from the global AI & Equality Human Rights Toolbox Initiative in collaboration with the UN Office of the High Commissioner for Human Rights (OHCHR)\, this African iteration provides practical tools and methodologies to guide equitable AI development across the continent. \nThe Toolbox applies a Human Rights-based AI Lifecycle Framework\, integrating reflective questions and the Human Rights Impact Assessment (HRIA) developed with the Alan Turing Institute. It emphasizes participatory\, multidisciplinary approaches and is rooted in feminist\, decolonial\, and Justice\, Equity\, Diversity\, and Inclusion (JEDI) principles and incorporates lessons from emerging digital rights challenges\, ensuring AI systems are designed with safety and dignity at their core. \n1PM GMT | 3PM SAST | 4PM EAT \nRegister here  \n 
URL:https://aplusalliance.org/event/the-african-ai-equality-toolbox-webinar-5-model-interpretation/
CATEGORIES:Eventos A+
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2025/08/AI-EQ-Toolbox-black-w-gradient-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20251113T140000
DTEND;TZID=Europe/Paris:20251113T150000
DTSTAMP:20260404T132640Z
CREATED:20250828T102243Z
LAST-MODIFIED:20250828T102316Z
UID:24950-1763042400-1763046000@aplusalliance.org
SUMMARY:The Latin American AI & Equality Toolbox Launch
DESCRIPTION:AI & Equality partnered with the Chilean Centro Nacional de Inteligencia Artificial\, CENIA\, to co-construct a Latin American Spanish-language version of the validated <AI & Equality> Toolbox\, with use cases relevant to the regional experience.\n\nThe partnership builds on the learnings from the workshop structure and outreach of the African <AI & Equality> Toolbox.\n\nTranslating and adapting tools like the <AI & Equality> Human Rights Toolbox can facilitate the co-creation of a common vocabulary and a common understanding of the specific needs and challenges of each region of the world\, unlocking informed debates about their visions for data with purpose and collaboratively finding examples of regional use cases of AI.\n\n2PM ET | 10AM Santiago \nRegister here
URL:https://aplusalliance.org/event/the-latin-american-ai-equality-toolbox-launch/
CATEGORIES:Eventos A+
ATTACH;FMTTYPE=image/png:https://aplusalliance.org/wp-content/uploads/2025/08/Untitled-design-1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Moscow:20251119T160000
DTEND;TZID=Europe/Moscow:20251119T170000
DTSTAMP:20260404T132640Z
CREATED:20250828T094149Z
LAST-MODIFIED:20251008T114758Z
UID:24941-1763568000-1763571600@aplusalliance.org
SUMMARY:The African AI & Equality Toolbox Webinar 6: Deployment
DESCRIPTION:In African contexts\, post-deployment oversight is often underfunded or overlooked. Once a system is launched—especially by international actors—it can become invisible\, even as its consequences grow. This final stage and webinar of The African AI & Equality Toolbox looks at what true accountability means: planning for ongoing monitoring\, shared governance\, and the possibility of “no.” The webinar will also explore what it means for systems to be responsive—not just to data—but to dignity. \nThe African AI & Equality Toolbox is a strategic initiative designed to empower African stakeholders—policymakers\, technologists\, civil society actors\, and communities—to shape Artificial Intelligence (AI) systems that are contextually relevant\, inclusive\, and grounded in human rights. \nDeveloped by Women at the Table and the African Centre for Technology Studies (ACTS)\, and adapted from the global AI & Equality Human Rights Toolbox Initiative in collaboration with the UN Office of the High Commissioner for Human Rights (OHCHR)\, this African iteration provides practical tools and methodologies to guide equitable AI development across the continent. \nThe Toolbox applies a Human Rights-based AI Lifecycle Framework\, integrating reflective questions and the Human Rights Impact Assessment (HRIA) developed with the Alan Turing Institute. It emphasizes participatory\, multidisciplinary approaches and is rooted in feminist\, decolonial\, and Justice\, Equity\, Diversity\, and Inclusion (JEDI) principles and incorporates lessons from emerging digital rights challenges\, ensuring AI systems are designed with safety and dignity at their core. \n1PM GMT | 3PM SAST | 4PM EAT \nRegister here
URL:https://aplusalliance.org/event/the-african-ai-equality-toolbox-webinar-6-deployment/
ATTACH;FMTTYPE=image/jpeg:https://aplusalliance.org/wp-content/uploads/2025/08/AI-EQ-Toolbox-black-w-gradient-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20260311T163000
DTEND;TZID=America/New_York:20260311T180000
DTSTAMP:20260404T132640Z
CREATED:20260303T134424Z
LAST-MODIFIED:20260303T134656Z
UID:25155-1773246600-1773252000@aplusalliance.org
SUMMARY:CSW70 | When Algorithms Discriminate: Gender Bias in Justice Systems
DESCRIPTION:Wednesday\, 11 March 2026 \n4:30 – 6:00 PM ET \nNGO CSW \n10th Floor\, Church Center of the United Nations \n777 United Nations Plaza\, New York \nAn In-Depth Discussion: What happens when courts replace judges with computer algorithms? We are told these systems are “objective” and “fair”\, but the evidence tells a different story. From bail decisions to sentencing\, algorithms are making life-changing choices about women based on biased data and male-centered assumptions. A woman seeking justice after assault may find her credibility automatically questioned. A mother may be flagged as “high risk” simply because of where she lives or her employment history. Meanwhile\, these same systems treat men’s violence as more predictable and less dangerous. \nThis is not science fiction. It is happening right now in courts worldwide. Join us to uncover how technology is creating new barriers to justice for women and girls\, and what policy solutions can effectively address these harms. \nLaura Nyirinkindi | UN Special Procedures Member\, Working Group on discrimination against women and girls\nAfrica Regional Vice President of the International Federation of Women Lawyers (Federación Internacional de Abogadas) \nFernanda K. Martins | Fundación Multitudes\, Director of Strategy and Advocacy \nCaitlin Kraft-Buchman | Women At The Table\, CEO
URL:https://aplusalliance.org/event/csw70-when-algorithms-discriminate-gender-bias-in-justice-systems/
ATTACH;FMTTYPE=image/png:https://aplusalliance.org/wp-content/uploads/2026/03/Designing-AI-for-Human-Agency-29.png
END:VEVENT
END:VCALENDAR