Case study
equity and inclusion, ethics and governance, quality practice, partnership working

Public Engagement Ambassadors (PEAs) in Pods: including the public in data and AI research

updated on 21 Apr 2026
5 minutes

Supporting data science and artificial intelligence (AI) researchers in the Manchester area to engage meaningfully with local underrepresented community groups, so that those groups can inform and influence research.


Snapshot

This project supported researchers to work together with local communities to explore data science and artificial intelligence (AI) research. With a focus on those traditionally underrepresented in research, the aim was to ensure that people who are often unheard have a say in how AI is developed and used in society. Impacts have included the development of community "ambassadors" who share information about AI, a "People's Charter for AI" and, among the researchers, a new understanding of the value of co-producing knowledge about AI with communities that has changed how they teach and do research.

The project is based at Manchester Metropolitan University (Manchester Met) and involves researchers from Manchester Met, the University of Manchester, and the University of Salford. It is funded by the Engineering and Physical Sciences Research Council (EPSRC).  

Background and purpose

The project was inspired by a series of "Show and Tell" roadshow events around Greater Manchester, run by data science and AI researchers, where conversations about AI were held over hot meals with the community. The aim was to create a friendly environment in which to hear from communities who are not typically reached by university outreach in data science and AI. The roadshows also helped communities and the universities develop ideas for working together to challenge and change university procedures and governance that create barriers to engagement with local communities. The roadshows raised important questions:  

  • What does "co-production" between researchers and the public look like in a highly technical discipline? ("Co‑production" here means working together as equal partners from start to finish of a project.)  
  • How can learning about co-production be translated into practice across disciplines?  
  • How can solidarity be built between universities and communities whose voices are usually underrepresented in this kind of work?  

The project team recognised that these questions could best be answered through further engagement with underrepresented communities in the Greater Manchester area. 

Approach to engagement

The team designed a training programme for researchers based in the three universities in the Manchester region. These researchers were recruited as Public Engagement Ambassadors (PEAs), receiving mentorship and going on to produce their own public engagement events and activities. These were carefully co-produced with communities to ensure they were designed around community needs.  

A community member was appointed as a paid consultant advisor throughout the project. Community participants were also compensated for their time, and the project paid careful attention to diverse access needs to create a safe and secure environment that enabled trust. This included booking taxis for those who struggled to access buses, and printing materials for participants with sight impairments.  

Approach to evaluation

Evaluation was embedded from the start of the project to track impact within the communities and the cohort of researchers. Before conducting any outward-facing work, the team took time to reflect together on their biases and passions and on what they were bringing to the project. Reflective circles were held at the end of each PEA training cycle, alongside conventional baseline and endpoint surveys to measure progress. Further reflection events between the PEAs and communities gathered additional insights into the process.  

Outcomes and impact

The project gave communities a valuable opportunity to explore the AI issues that concerned them, such as 'deep fakes' or who has control of people's data. Some community members have gone on to become "ambassadors" who share information about AI with their wider communities. Some expressed interest in partnering with the universities on future projects, and some are currently supporting two PhD students.

The PEAs and communities together co-produced a "People's Charter for AI" (https://e-space.mmu.ac.uk/637617/1/booklet with graphics.pdf), which invites organisations to sign up to ten principles for providing public services and products that use AI-based systems, to ensure services are fair and accessible to everyone. This has been presented by community members (with support from the project team) at community venues across Manchester, at the AI Fringe festival in London, and to members of Cabinet Office staff. A community member reflected that the work has shown that:  

“…community-led oversight does work and is valuable for both sides. Companies have listened and we have been heard and hopefully made a difference”.  

Community members are also going to take part in the development of a “Charter on Responsible AI” for Small and Medium-sized Enterprises. 

Among the researchers, a new understanding of the value of co-producing knowledge about AI with communities has changed how they teach and do research. For instance, there have been changes to MSc and undergraduate courses on Data Management and Governance and on AI Ethics. 

The team are now considering how their project can drive systemic change within university institutional structures so that they better support this type of work, and they hope it will inspire funders to do the same. Even when universities recognise the value of co-production with communities, non-traditional projects and approaches like this can be harder to deliver or to gain support for than a typical research project. They often place greater demands on project teams: it is difficult, for instance, to estimate a researcher's workload at the start.  

Legacy

PEAs in Pods has had a significant impact on how the researchers’ colleagues think about their research, public engagement and teaching. Thus far, 23 colleagues have received mentoring and one colleague has received training to become a mentor to others, creating further impact.  

To share their learning, the team have produced a range of outputs, including academic papers, written case studies, videos and self-guided training modules. The PEAs approach is now being referenced within other projects. At such a critical moment for AI, PEAs in Pods is providing an important model for research and innovation in AI that genuinely includes underrepresented voices.