It’s (still) time to talk about evaluation and research in arts and health.
Measuring impact robustly remains key to the future sustainability of arts and health work, but spanning the worlds of arts organisations, researchers and healthcare staff to build evidence can be resource-intensive and overwhelming.
What evidence do health leaders, commissioners and investors want to see, and how can arts and health teams deliver it? What sort of evaluation methods work best at different stages of a project’s life cycle, and how can projects build evidence over time?
In an online session facilitated by Cardiff University’s Y Lab, HARP-supported teams, researchers and practitioners discussed the shifts in approaches, practicalities and mindsets they felt would help build better evidence for arts and health innovation in the future.
Here are a few takeaways from the discussion:
1. Take a collaborative and inclusive approach to measuring success from the start.
Eleanor Davis, Arts and Health Project Officer at Aneurin Bevan University Health Board, said of their HARP-supported Arts and Health Programme: “Our evaluation has been collaborative and reflective, including partners and participants in working out how to measure the success of our projects and doing this as we go, with more formal evaluation points at the start and end of projects. Working in this way, we have been able to collect critical feedback about what is working, what is surprising and what's been challenging, and respond to it directly.
“Support workers have commented on how working alongside their clients has enabled clients to feed back their experiences safely and honestly.”
2. Evaluation can be iterative and things don’t always go to plan, so keep an open mind.
Suzy West, Executive Director of the dance charity Impelo, said of their HARP project Joio: “I expect you all have seen those gorgeous circular evaluations in which one thing leads to another. Well, initially our journey did start out like that.
“[But] with the delays and redirections that the pandemic continued to necessitate for our project, we had to think again.
“We all love to say we are curious, we will go where the learning takes us. But, for a small arts organisation… it felt daunting to go back to both the university we have been working with and HARP, the funder, and say SROI [social return on investment] is not going to work now the project's changed. But that is what we did. We changed our evaluation focus to a more realist-based approach, reflecting the project changes.
“It has been a rocky road. [But] we end our journey with new confidence in ourselves as a learning organisation, able to create evaluation frameworks that put the voices and aspirations of the people in the room at the heart of the process.”
3. Get your research questions right at the start but be prepared to change them as you go.
Professor James Lewis, Director of Cardiff University’s Y Lab, said: “[There are] two things that might help those [evidence] conversations early on. What are people's theories of change? If we had taken time as a delivery team to sit down and think about our theories of change and how that comes about, it would have led us to ask different questions early on.
“Another thing that helps you come up with the right questions at the beginning is thinking about what drives your research. I came up with three reasons people do different types of research. One is curiosity-driven, which is often your classic traditional academic research. You also have research that is driven by a clear innovation need. The third is a de-risking approach to research. That is when commissioners want to pay for a service and want to know the service will work.
“If you take on those two things (theories of change and research drivers), it will help you come up with better questions.”
4. Embrace the messiness of arts and health.
Dr Katey Warran, Deputy Director of the WHO Collaborating Centre for Arts & Health in the Department of Behavioural Science and Health at University College London, said: “Everything you do as a researcher or as part of an arts and health project is learning. So, I encourage people to make mistakes and learn from making those mistakes.
“To me, that is what makes this field so interesting - the richness of this experience, the fact that it brings together so many different stakeholders and disciplines in order to examine a complex intervention.
“People can talk about trying to pinpoint something, an exact ingredient, and how that aligns with an exact mechanism or an exact outcome. But in reality, we can't really do that, and it’s the complexity of the situation that makes this research landscape so interesting.”
5. Project teams and evaluators should dedicate time to discussing evaluation at the start.
Karen Gray, freelance evaluator and researcher at the University of Bristol, said: “The projects that don't work well are the ones you (evaluators) come into when they are halfway through, or when you have no time to really understand what is going on. You have to come up with a plan in two minutes flat, and you don't have the time to really understand the project and the people involved in it, to make sure that you are talking a common language, or to have the space to build relationships that mean the participants trust you, the organisation trusts you, and the people taking part in the projects trust you. Without all that, I think evaluations and research projects … could really go wrong.”
The Art of Building Evidence was co-hosted by HARP and Cardiff University’s Y Lab.
Speakers included: Suzy West (Impelo), Eleanor Davis and Sarah Goodey (Aneurin Bevan University Health Board), Melanie Wotton (Cardiff and Vale University Health Board), Karen Gray (freelance evaluator), James Lewis (Y Lab, Cardiff University) and Katey Warran (WHO Collaborating Centre for Arts & Health, University College London).
You can watch the full session below.