A drawing of 4 hands with different shades of skin colours interlaced together. A set of scales sits in the middle, with a leaf and a lightbulb balancing each other.

Explains the state of tech justice in the UK in 6 minutes.

This resource is for anyone who wants to find out more about tech justice.

It covers:

  • what tech justice is
  • why Catalyst carried out research into tech justice
  • the research’s findings, conclusions and recommendations

Bias and discrimination in tech

Tech often has bias built into it. For example, facial recognition software misidentifies racially minoritised people more often than white people.

Tech also gets used in discriminatory ways. For example, the UK-based charity Glitch carried out research into digital misogynoir (the combination of anti-Black racism and misogyny Black women experience online). It found that, “both new alternative platforms and large established multi-billion dollar social media companies are not successfully moderating content at the intersection of racist and sexist hate.”

These examples are just some of the many reasons behind our change of focus to tech justice and liberatory tech.

About tech justice

At the moment, we see tech justice as being: 

  • an approach to creating technology that challenges structural inequalities and supports justice for marginalised communities
  • a social movement – of people and organisations working to use tech to create a more just society
  • an outcome – tech becomes fairer for people experiencing inequity and injustice.

You can find out more about it in our article about tech justice.

There isn’t a UK tech justice movement yet. It’s not clear why this is. But what we do know is that a more just approach to tech can help create social justice. Because all struggles for liberation are interconnected.

Why we commissioned the tech justice report

We commissioned the tech justice research to get a clearer idea about:

  • the main areas of concern in the UK
  • who’s doing what work and in which fields
  • how Catalyst can support this work and champion best practice
  • how Catalyst can embed tech justice practice and principles into its work.

The report’s findings 

Here’s a summary of what the researchers found.

Prejudice and discrimination are widespread in the digital world

The prejudice and discrimination that marginalised groups face offline follows them into the digital world. But 'Big Tech' (multinational companies like Google, Apple, Meta, Microsoft and Amazon) neglects its digital responsibility. And it's rarely held accountable for the harm caused on its platforms. 

New legislation could give regulators more power to address this. There's research taking place that aims to expand our knowledge about:

  • misinformation
  • identifying at-risk groups
  • tackling and safeguarding against misinformation.

Evidence of impact on marginalised communities is growing

Understanding the experiences of marginalised communities is essential for building a more just world. It's the same with building more just tech. These communities are the most impacted by tech injustice, so they can provide the clearest vision of what tech justice should be. They need to be involved in designing, regulating, and governing emerging technologies. Otherwise tech will continue to make inequalities worse.

Many not for profit organisations work with marginalised communities but also use Big Tech products. Often these are seen as a necessary compromise for financial or engagement reasons. Organisations need support to move towards more ethical, open-source alternatives.

Research is starting to influence policy and regulation

Some organisations are supporting tech justice by influencing policy and regulation. Areas being researched include:

  • increasing tech literacy
  • understanding biometrics
  • facial recognition
  • surveillance.

There are people with expert tech knowledge. And people who understand where tech overlaps with security and justice. We need to bridge the gap between these groups.

Legal frameworks aren’t protecting people fast enough 

Emerging technologies like AI and education technology are developing faster than legal frameworks can. And current legal challenges can only focus on past or existing harms, not future ones.

Discussion about new tech often focuses on facial recognition and algorithm-based products. There's not much attention given to tools like ChatGPT.

In the UK, the regulatory approach prioritises innovation over public protection. Automated decisions affect important areas of people's lives. But there's little transparency or accountability involved. And it's not clear how to complain or appeal against decisions.

Unlike other countries, the UK’s approach to regulating AI is sector-based. It doesn’t have dedicated legislation. We should keep track of what that means for how AI is used in the UK, compared to the EU.

Existing laws can help slow down the development of dangerous tech. And some organisations are helping individuals to protect their online data. But we need a focus on public awareness and accountability. 

Workplace approaches to technology don’t always support employees  

Discussions about race and technology in the workplace are happening. But they're not framed as tech justice issues. Instead they are framed within Equality, Diversity and Inclusion (EDI) perspectives based on vague ideas about fairness. Their starting point isn't equity and justice. 

Increases in remote and hybrid work are a legacy of COVID-19. These changes have improved some parts of work for marginalised people. But presenteeism, pressure to be productive, and surveillance by employers are still issues.

Racially minoritised people are underrepresented in all areas of the tech sector. Including the teams designing and building tech. They also experience much more bias and discrimination than their white colleagues.

Key actions include:

  • upskilling and supporting workers (because technological advancements may remove the need for some jobs)
  • connecting people with tech skills and knowledge to people impacted by tech, and people in the not for profit sector.

There’s a lack of transparency in data collection and surveillance 

In the UK, surveillance practices infringe on human rights. There's no way to avoid state surveillance. There are no regulations for facial recognition technologies. And the Home Office can access personal data via public sector organisations. This has led to calls for more audits and transparency around using tech for surveillance.

Biased AI training data leads to discriminatory facial recognition algorithms. And tech also embeds classism. For example, fingerprint technologies don't consider that some jobs lead to changes in people's fingerprints.

There's also limited public access to data broken down by protected characteristics. So it's hard to see if algorithms disproportionately harm people with intersecting identities. We need to decolonise our approach to sharing and protecting knowledge and data.

Shifting power towards racially marginalised groups in the UK

Academics are working on using data to shift power to racially marginalised communities. But they’re not communicating with grassroots groups. And academics’ findings aren’t reaching the general public. 

Tech justice research is difficult to limit to a single country like the UK. That's because organisations based in the UK often work on digital rights, or on discriminatory uses of tech, that happen outside the UK.

Catalyst’s role in tech justice

The report recommends that Catalyst’s tech justice work focuses on these 5 areas: 

  1. Digital and data literacy: helping to upskill individuals, organisations, funders, and grassroots groups in digital and data literacy. Supporting them to engage more effectively in the digital space. This is crucial for not for profits, a key focus of Catalyst's work.
  2. Emerging technologies: supporting these groups to understand and engage with emerging technologies. Reducing fear and anxiety about their impact on our digital lives.
  3. Inclusive work culture: leading by example by creating and maintaining an inclusive work environment, especially for individuals of Global Majority descent. Prioritising decolonial, intersectional, Black feminist principles. Ensuring the most marginalised are not just included, but integral to shaping the network's DNA.
  4. Shifting power: acting as a bridge between funders and grassroots organisations. Helping redistribute resources and support cutting-edge digital work, including advocacy and innovation.
  5. Building tech justice: commissioning research on emerging digital issues, with a focus on the UK. Convening groups for People of Global Majority descent and people with disabilities to lead conversations and build tech justice in the UK.

Using funding to shift power

Funding models should provide the space, time, and financial support for people to test ideas while allowing for failure. Doing this means redefining failure as a learning opportunity. One where testing a co-produced project with a community allows funders and participants to gain new insights.

Funders:

  • should focus on building new ecosystems that engage both Big Tech and local communities.
  • need to take bold risks and support experts (including those with lived experience). They can do this by giving experts time, space, and resources needed to experiment.
  • should share examples from projects not directly linked to tech justice, for example housing or migrant justice. Applying a tech lens to these areas can deepen understanding of, and foster connections with, broader justice issues. This can also provide opportunities for people working in these areas to develop tech skills.

Read the manifesto

We have outlined the world we want to see in our Manifesto for a more tech just society.

---

The tech justice research was carried out by a small group of British women of Global Majority descent:

  • Siana Bangura: author
  • Nikita Shah: interviews
  • Quito Tsui: desk research 
  • Rachel Arthur: visualisation
