Ripples Story: “Closing equity gap with AI”

23 Mar 2026   /   Nhung Phung

In this article, Lucy Jaffé explores how Artificial Intelligence (AI) can be both beneficial and detrimental to the voluntary, community and social enterprise sector, and why it is important for organisations to adopt an equity-centred approach to AI that does not reinforce existing biases against marginalised communities.

Lucy is an independent consultant at Jaffeworks, a director of RestorativU, a board member of the European Forum of Restorative Justice, and an Advisor to the Media Trust’s Stronger Voices Programme and the Why me? Heritage Project.

AI is no longer a distant frontier—it’s already reshaping how charities, community groups, and social enterprises operate. From chatbots offering mental health support to predictive tools helping food banks anticipate demand, AI is quietly embedding itself into the fabric of the UK’s voluntary, community and social enterprise (VCSE) sector.

Since the 1980s, when I completed a Master’s in Artificial Intelligence, I have been excited about the power and potential of AI to enable equitable communication, greater connection and democracy. Since then, I have worked for social justice campaigns and for the commercial software industry. My recent policy work for Refuge, on the Online Safety Act and the Violence Against Women and Girls Strategy, highlighted both the positives and the dangers of AI for underrepresented and marginalised communities. In this article, I ask, as transformation accelerates: will AI help close the equity gap—or deepen it?

The promise and peril of AI in the VCSE sector

AI offers enormous potential for VCSE organisations. It can streamline operations, personalise services, and unlock insights from data that were previously out of reach. For overstretched teams, AI can be a lifeline—automating admin, enhancing fundraising, and improving service delivery.

Yet without intentional design, AI risks reinforcing the very inequalities the sector exists to challenge. Here is an example from Black Girl Nerds: “Predictive policing is a painfully obvious example of AI systems reproducing racial biases found in their training data. These AI systems make assessments about future crimes, who might commit them, and where, based on data such as location and personal information. But therein lies the issue, which could potentially exacerbate policing or even over-policing in communities along racial and ethnic lines.”

Algorithms trained on biased data can exclude marginalised communities. Automated systems may overlook those without digital access and introduce barriers for disabled people. Smaller organisations may lack the resources to adopt AI ethically or effectively. On the other hand, with proper funding and expertise, smaller organisations may be more versatile: better placed to consult their community of interest and to change course quickly.

Equity: more than a buzzword

Equity means more than equal access. It requires that AI systems be designed with diverse communities, ensuring that the voices of those most affected by inequality are central to how these technologies are built and deployed. Major social media platforms like Facebook and Instagram are preparing to comply with the UK’s Online Safety Act, but there is an urgent need to embed domestic violence survivors and advocates within their design and moderation teams to curb online abuse. It is no secret that harmful content drives revenue, and social media giants are slow to respond to reports of harmful behaviour. Consider also a youth charity in East London that trialled an AI tool to match young people with local services. The algorithm, trained on historic referral data, consistently under-prioritised young people from migrant backgrounds. Why? Past systems had failed to serve them adequately, and the AI simply replicated that bias.

I’m working with RestorativU, which has developed an AI-enabled mobile application for mentors and facilitators. We have co-designed the app with people who have lived experience of youth violence, and the AI is being trained by lived experience mentors. This is not an afterthought, but an embedded, strategic approach taken by the development team. Equity must be baked into AI from the start—not patched on later. Take a look at Third Sector Lab’s recent examples of seven charities using AI to deliver impact. Ask yourself where biases may be introduced and how your organisation could introduce mechanisms to mitigate them.

Building equity-centred AI

What does equitable AI look like in practice?

Community involvement: A community of interest is involved in shaping how AI tools are developed and used. See Refuge’s recommendations for women and girls on tackling online domestic violence.

Transparent and accountable: Use explainable AI models, share decision-making processes openly, and collaborate across the sector to reduce duplication and ensure access for smaller organisations. Ethical frameworks should align with initiatives like the UK’s data ethics framework or AI for Good principles, and an AI policy should be agreed upon within your organisation.

Proper funding and resources: Encourage funders and the Government to invest in our valuable sector to ensure viability in the coming months and years.

A call to action

The VCSE sector’s role as a champion for justice, inclusion, and community-led change means we have a duty to carry those values into the digital realm. That means advocating for inclusive tech policy. It means building digital literacy within our teams. And it means demanding that AI tools reflect the diversity and dignity of the people we serve. AI can be a tool for transformation—but only if we build it with equity at its core.

Key actions for VCSE leaders

  • AI is already impacting the VCSE sector. Be curious about how AI is already being used by your community and organisation (ask the staff and volunteers!) and where it may risk replicating systemic bias.
  • Write and embed an AI policy and plan that reflects your community roots, so you design AI that serves everyone.
  • Equip your organisation to shape ethical, inclusive AI adoption by applying for grants and funds for community engagement, service design, advocacy and training.

Read the full latest issue of Rank Ripples magazine
