Over 50 multicoloured bubbles floating against a dark brown background. The bubbles are reflecting light.

Project members reflect on their past 2 weeks. They share thoughts and opinions on AI, and how to help non-profits focus on tech justice. These blogs are creating an archive of the learning and experiences of people involved in Catalyst's Tech Justice Road Trip.

How AI fails minoritised communities: environmental damage, silencing voices and ignoring harm

Kayleigh Walsh: sociocracy support and tech justice research

“I think it could be misleading to focus on just one main way that AI fails minoritised communities, because the failures are a myriad of complexities, both physical and digital. 

AI is developed by wealthy white men (or at least controlled by them). So it's being built around, and focuses on, a narrow and privileged set of life experiences. There’s very little acknowledgement of, or care for, anything beyond that experience. 

If we were to change this (if only!), we’d still have to deal with labour exploitation, and the environmental impact caused by the huge amount of energy that Large Language Models need to work. And minoritised communities have less access to protection from the devastating impact of these environmental changes.” 

Nikita Shah: circle member and reflective session facilitator

“There’s a dangerous assumption that AI is ‘neutral’ and ‘objective’ – two terms laced with notions of colonialism and racism. Generative AI has been posed as a tool to help change the world and provide easy solutions. But there’s a cautionary tale here.

AI has been developed and trained using data sets that are deeply entrenched with racial bias.

Further, we know that AI is not accessible to all. 

There’s an urgent need to ensure that people from marginalised backgrounds, and underrepresented communities, can develop an interest in AI, develop skills and have opportunities to access the field and sector more broadly. When lived experiences and knowledge of how bias can lead to harm are used to inform the development and application of AI, we may begin to stop duplicating in tech the inequities and harms that exist in real life.” 

Eva Blum-Dumontet, Head of movement building and Policy at Chayn. Chayn is developing a feminist AI tool that helps survivors of gender-based violence create letters that advocate for their rights

“AI is currently built in bubbles – the Silicon Valley bubble and the Hangzhou bubble in China (where Chinese tech companies like Alibaba are headquartered). There’s no dialogue with minorities to understand the harms AI could cause, or how it could benefit them. So there’s no meaningful understanding of its harms (for example, those caused by deepfakes or biased answers) and how they can be mitigated.

What could help is decentralising who is building AI.

This is what Chayn is doing as part of the Road Trip by building a feminist AI tool.” 

What would help non-profits use tech more fairly: capacity building, knowledge and funding 

Eva: “Non-profits need to be empowered with better tech knowledge. Chayn is lucky, because we have amazing technologists on our team. But many organisations don’t have this, or can’t afford to pay the salaries tech workers expect. More funding’s always helpful. But so is more collaboration, and capacity building, between the organisations that have tech resources and expertise and the ones that don’t.” 

Kayleigh: “Non-profits would benefit from a deep understanding of when tech is actually necessary. This would help to avoid isolating people, or wasting time and energy using tech just because it exists. Clear guidelines on this could help.” 

Abi Handley, Alpacka Collective Ltd: supporting collaborative and open ways of working in Catalyst

“Funders explicitly funding:

  • internal time and space for reflection
  • learning
  • experimentation
  • supporting open source and collaboration projects around tech to encourage partnership working between, and with, non-profits

Right now, the off-the-shelf options offer value for money and speed of implementation. But they don't disrupt the norms and power dynamics within the tech industry.

Non-profits have less money and time, and therefore less freedom of choice when it comes to tech decisions.”

Making tough decisions and acting on our values: stepping back and having deep conversations 

Abi: “I stepped away from attending circles, partly due to timing, but also to consciously abstain. I wasn't needed, because the members have incredible experience and knowledge to take the project forward without me. I’m still in the reflection space to support learning and understanding.”

Eva: “The Road Trip has been fostering very deep conversations internally about decolonising our work and dismantling power. We don’t have answers yet, or a clear path of action for decision making, but we are getting there. It’s a long road trip of its own for our organisation, and we’re thankful for the support we have been getting.”

Development and growth: holding space, increasing knowledge and making connections

Kayleigh: “The Road Trip has made me rethink my style of holding space, and things that I've wanted to do in the past but felt too shy or worried to do for fear of being judged or isolated. It's definitely created space for me to follow strands of tech that genuinely interest me. For example, tech justice.”

Abi: “I’ve met people I know I’ll be connected to beyond the lifetime of the project. And this will almost certainly help me (us) to achieve our goals over the longer term. I'm learning with them the whole time.”

Eva: “The conversations we have had have helped us think seriously and concretely about feminist AI, and how to move away from exploitative models. The fact that we’ve been able to offer therapy sessions, as well as remuneration, to our lived experience participants is a good outcome.”

Nikita: “I’m gaining more knowledge on the subject matter, and more knowledge about the partner organisations and the work they are doing. The deeper knowledge is helping me feel more confident and informed on the subject matter.

This is helpful for other work I’m doing. For example, an evaluation of an AI-related programme at universities that want to support underrepresented students in the AI/Tech sector.” 


Abi, Kayleigh, Nikita and Eva, thanks for your contributions.

