A Black woman with glasses sits at a laptop. Behind her you can see the blue glow of data servers, surrounded by darkness.

Gender bias is baked into AI. Feminist AI (FAI) wants to address this. We explore the importance of FAI, how it can challenge systemic discrimination, and how initiatives like Chayn’s FAI tool can make a difference.

AI isn’t neutral. As UN Women puts it, “The world has a gender equality problem, and Artificial Intelligence (AI) mirrors the gender bias in our society.”

Gender bias in AI comes from:

  • the fact that most AI is funded, designed, developed and controlled by a small group of western, wealthy, white men
  • biased and/or incomplete datasets being used to train AI models. These datasets often under-represent or exclude marginalised groups
  • biased and prejudiced content from social media platforms being used as cheap training data. AI can absorb, then reproduce and amplify, these prejudiced views
  • the processes used to improve AI models. AI models are usually designed to reproduce the most common patterns in the data they’ve been trained on. This can reinforce stereotypes.
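The last point can be made concrete with a toy sketch. This is an illustrative example (not any real AI system's code): a model that simply reproduces the most common pattern in its training data turns a skew in that data into an absolute rule.

```python
from collections import Counter

# Toy training data: occupation words paired with pronouns.
# The 3-to-1 skew stands in for the biased sources described above.
training_pairs = [
    ("doctor", "he"), ("doctor", "he"), ("doctor", "he"), ("doctor", "she"),
    ("nurse", "she"), ("nurse", "she"), ("nurse", "she"), ("nurse", "he"),
]

def most_common_completion(word, pairs):
    """Predict the pairing seen most often in training -
    the pattern-reproduction behaviour described above."""
    counts = Counter(p for w, p in pairs if w == word)
    return counts.most_common(1)[0][0]

# A 75/25 skew in the data becomes a 100% association in the output:
print(most_common_completion("doctor", training_pairs))  # -> he
print(most_common_completion("nurse", training_pairs))   # -> she
```

Real models are far more complex, but the dynamic is the same: optimising for the most likely pattern erases the minority cases and hardens the stereotype.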

This bias has real-world consequences.

Feminist AI (FAI) matters because it transforms how we think about, design and build AI. We can use it to address the biases and inequity baked into our patriarchal, capitalist, colonial and racist systems.

Defining Feminist AI 

There’s no universal, agreed definition of Feminist AI (FAI). The <A+> Alliance for Inclusive Algorithms is a global, multidisciplinary, feminist coalition of academics, activists and technologists. Its definition of FAI is:

“Algorithmic Decision-Making Systems and Artificial Intelligence harnessed to deliver equality outcomes, designed with inclusion at the core, creating new opportunities and proactive, innovative correction of inequities.”

In her article Feminist Artificial Intelligence: A New Era, Sophie Toupin, a professor in the Department of Information and Communication at Université Laval, expands this definition. For Toupin, FAI is more than just a technological concept and a movement. It’s also a: 

  • Model: FAI champions inclusive, community-led datasets
  • Design: FAI’s design includes a range of identities and experiences, so it creates more inclusive and culturally sensitive technologies
  • Policy: initiatives like Canada’s Feminist International Assistance Policy bring feminist perspectives to international AI development
  • Culture: FAI addresses biases in AI systems and considers the broader social contexts that produce them
  • Discourse: FAI as a discourse uses feminist, queer, and critical race theories to critique and reimagine AI systems
  • Science: FAI involves rethinking what’s accepted as 'intelligence' and how broader definitions are represented in AI systems

Feminist AI isn’t just about women

The ‘Feminist’ in Feminist AI suggests that FAI is only about women. But it’s not. It’s an approach that can benefit people of all genders and none. It prioritises social justice. It recognises systemic inequities, and that there are multiple, intersecting forms of oppression. FAI is about proactively using AI to challenge and dismantle, for example, ableism, heterosexism, classism and misogynoir.

It wants to shift power from a small group of oligarchs and shareholders, to the marginalised people they deliberately silence. It embraces co-designing tech with and for seldom heard communities, so designs reflect their needs and experiences.

Despite FAI’s broad focus, there are also calls for AI to be decolonised and for an Afro-feminist AI.

Feminist AI: an example

Catalyst’s Tech Justice Road Trip project is exploring what liberatory tech could look like. As part of the project, Chayn is using existing AI software to create an FAI tool. Chayn is an intersectional, feminist, global not-for-profit. It creates online resources to support survivors of gender-based violence to heal. It’s run by survivors of gender-based violence and allies.

Chayn’s take on FAI

Eva Blum-Dumontet, Chayn’s Head of Movement Building and Policy, explains that technology has been weaponised. Online gender-based violence includes hate speech, device and app control, image-based abuse, and stalking and monitoring. According to Eva, FAI could play a part in challenging this. For them, FAI is:

“...designed with, and built for, women and gender diverse people. Chayn wants to create hope. AI is here to stay so we should be finding how to use it in positive ways. The world will be better off if we do the following things when we’re creating AI tools: use an intersectional lens; recognise that communities have different needs; and focus on addressing those needs.”

An AI letter-writing tool for survivors

The idea for Chayn’s AI tool came from a conversation it had with survivors. The tool will help survivors write a ‘takedown request’ - a legal request to have malicious content removed from a website or service. The tool will send the request to the police or tech companies. 

Eva says, “We often hear that what gets in the way of survivors being taken seriously is nailing legal speak. Knowing the right way to present the narrative when they're engaging with police forces. Or with companies, in the case of takedown requests. This is where we think an FAI tool could help - because it’s really good at reproducing specific kinds of writing.”

Chayn has tested this with letters about the police’s failure to investigate cases. The FAI tool asks a series of questions. Based on the responses, it produces a letter that makes a clear and strong case explaining why a police force has failed you. 
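Chayn hasn’t published how its tool works under the hood, but the questions-to-letter flow it describes can be sketched in miniature. In this hypothetical sketch (the questions, template and function names are all illustrative), structured answers are assembled into a formal letter; a real tool would likely pass the answers to a language model rather than a fixed template.

```python
# Hypothetical sketch of the question-to-letter flow described above.
# Questions, wording and names are illustrative, not Chayn's own.

QUESTIONS = {
    "recipient": "Which police force are you writing to?",
    "incident": "Briefly, what happened?",
    "failure": "How did the force fail to investigate?",
    "outcome": "What outcome are you asking for?",
}

LETTER_TEMPLATE = """\
Dear {recipient},

I am writing to make a formal complaint. {incident}

Despite reporting this, {failure} I believe this falls short of the
standard of investigation I am entitled to expect.

I am therefore requesting that {outcome}

Yours faithfully,"""

def draft_letter(answers: dict) -> str:
    """Turn structured answers into a formal letter of complaint."""
    missing = set(QUESTIONS) - set(answers)
    if missing:
        raise ValueError(f"Unanswered questions: {sorted(missing)}")
    return LETTER_TEMPLATE.format(**answers)

letter = draft_letter({
    "recipient": "the Metropolitan Police",
    "incident": "In March I reported image-based abuse.",
    "failure": "no officer contacted me and no case number was issued.",
    "outcome": "the case be reopened and assigned to an officer.",
})
print(letter)
```

The value the article describes sits in that last step: the survivor supplies the facts in their own words, and the tool handles the formal register that institutions expect.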

Nothing about us without us

Chayn will be speaking with survivors before designing and building the tool. It wants to understand what an FAI tool looks like to them, so the tool meets the community’s needs.

Chayn is looking to answer these types of questions:

  • what challenges are survivors facing?
  • is there anything else that might be more helpful for them? 
  • what would an AI that really benefits survivors and their needs look like? 
  • what are the risks of this type of AI?

Eva explains that the work is about prioritising the needs of the people Chayn serves:

“It's all very well talking about what this FAI tool looks like to Chayn. But if we stay in our bubble of gender-based violence activists and tech rights activists, and don’t engage with the wider community, then we’re failing survivors as well.”

An inclusive and trauma-informed process

All Chayn’s work on its FAI tool will be based on its trauma-informed design principles:

  • safety
  • trustworthy
  • plurality 
  • agency
  • open and accountable
  • solidarity
  • empathy
  • friction
  • hope.

Eva says these principles help make sure that survivors are “...given agency, their privacy is respected, they have hope and, later down the line, they feel empowered.” 

When the project is evaluated, Chayn will be looking to see how the tool has served their community. It will also be working with academics to document the role of organisational principles in creating FAI tools. It hopes this will help provide guidance for other organisations who want to build AI that aligns with their principles. 

Advice for organisations considering building FAI tools

For Eva, the starting point for organisations interested in creating FAI tools with the people they support is to listen.

“A lot of people have been excluded from the AI debate. They hear about AI, but they don't feel like they understand the conversation or can contribute to it. So listening, finding out their hopes, fears and needs, and using that as a starting point is the goal.” 

Where to find out more about Feminist AI 

Author's note: Eva, thanks for taking the time to share your thoughts about FAI, and Chayn’s work on it, with us.

---

Image credit: Women of Colour in Tech. Used under a CC2.0 Attribution license.
