
Fresh Eyes: How Stakeholder Involvement Helps Get More From Your Data

Dr John Moriarty looks at how stakeholder engagement, now an important facet of any programme of social research, can help researchers get more from their data.


Article first appeared on the Administrative Data Research Network blog.

In 2006, the fourth season of The Wire brought viewers from the streets of Baltimore into its classrooms, through the viewpoint of a group of researchers trialling a behavioural intervention. By the season finale, with teachers and students telling them of the programme’s benefits and with data to back up the claim, the research team arrange a meeting at the mayor’s office and make the case for a city-wide trial.

The mayoral advisors see the programme as unaffordable and politically unpalatable in its approach. Colvin, the former police major hired by the research team to help implement the intervention, sees it as a missed opportunity to make a positive impact on the education and lived experience of students across the city. The lead investigator, Dr Parenti, still sees the pilot as a success, and as having had sufficient impact to be publishable and potentially replicated elsewhere.

This idea of impact without change only adds to Colvin’s frustration. In the end-of-season montage, we see Parenti addressing a lecture theatre and pointing to a graph on a PowerPoint slide, presumably referring to a finding in their end-of-study report. Colvin is sitting in the audience, but can’t take it and walks out. We’re led to conclude that seeing the lived experiences of the young people he’s been dealing with distilled into a bar chart, and for that to be called an outcome, is too much for him.

As ever, The Wire has arrived at an important truth: that the same piece of data, no matter how rigorously collected, will mean very different things to different people, particularly depending on how the evidence at hand relates to their own position or role.

Stakeholder engagement is now an important facet of any programme of social research. For research to have an impact on policy or practice, having stakeholders ‘buy in’ at an early stage to the process of gathering evidence is greatly advantageous. I’ve found this process personally very rewarding, especially having first been drawn to research through doing frontline work and becoming curious about how ‘best practice’ could become more achievable.

My first postdoctoral job was funded through a Secondary Data Analysis Initiative (SDAI) grant from the Economic and Social Research Council (ESRC). One of the requirements placed on the Principal Investigator was to identify and contact public, community and voluntary organisations that would have an interest in the research. Part of my role was to convene a ‘Knowledge Exchange Group’ on three occasions: at the beginning, middle and end of the analysis.

The first meeting was to introduce the researchers and the various stakeholder representatives. The focus of the study was mental ill-health following bereavement, so the attendees ranged from counsellors and service providers to representatives of the regional health care trusts. We talked through the proposed research questions and highlighted areas where we could use guidance, for example on identifying groups whom bereavement might affect more acutely. Naturally, some highlighted particular groups they would encounter as service users. Finding out about these areas of special interest at the outset allowed us to build time into our analysis plan to address these questions.

As researchers, we’re used to making certain assumptions to allow for the limitations of our data. We can habituate to these assumptions to the point that we almost forget they’re there, like invisible scaffolding. Talking through a research design with people from outside the project team was beneficial because it made us revisit, explain and justify the assumptions we were making. In this study, we were using antidepressant medication as a proxy measure for poor mental health, which has implications for how we interpret the results. One professional working with older people suggested that we should expect different patterns of prescribing for that group, as their psychological well-being might be treated differently by General Practitioners.

We reconvened the group to look at interim results and again once our final analysis was complete. Becoming familiar with a particular audience over this process was really useful when it came time to think about our findings and how to communicate them. This process led me to reflect on what will jump out of a research study for different audiences.

As researchers, we’re often interested in the scientific endeavour of isolating causal effects. However, the causal story isn’t always what catches the eye of stakeholders. Often a simple description of the phenomenon, or a trend, is what jumps out at people and conveys the magnitude of an issue. As a PhD student, I contributed to a presentation to the British-Irish Council’s Misuse of Substances working group, using data from the Belfast Youth Development Study.

This was a study of social attitudes and behaviours carried out over the five compulsory secondary school years (ages 11-16). Much of our content was focused on social influence and the mechanisms through which substance use behaviours take hold for young people. But what the working group picked up on was the scope to use the longitudinal design to identify which year saw the sharpest increase in different modes of substance use.

Similarly, the process of sharing knowledge on mental health and antidepressant prescribing exposed us to entirely different ways of thinking about data from our ‘what is the effect of X on Y?’ approach. One research officer from the Northern Ireland Assembly introduced me to the concept of ‘predictive budgeting’ for health service demand. This is the idea that, with the shifting make-up of the population and the changing nature of health and illness, planners need much more nuanced information in order to predict costs and levels of demand for services. Therefore, models like ours which identify factors associated with patterns of prescribing could be used to inform predictive budgeting for service demand into the future.
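As a toy illustration (not from the study itself), the sketch below shows the basic arithmetic behind that idea: take group-level prescribing rates, of the kind a model like ours might estimate, apply them to projected population counts for a future year, and sum to get an expected level of demand. All of the numbers and group labels are invented placeholders.

```python
# A minimal sketch of the "predictive budgeting" idea described above.
# Every figure here is an invented placeholder, not a real estimate.

# Hypothetical annual prescribing rates per person, by age band,
# as a model of factors linked to prescribing might estimate them.
prescribing_rate = {"18-44": 0.08, "45-64": 0.12, "65+": 0.18}

# Hypothetical projected population counts for a future year.
projected_population = {"18-44": 700_000, "45-64": 450_000, "65+": 320_000}

# Expected demand = rate x projected population, summed over age bands.
expected_demand = sum(
    prescribing_rate[band] * projected_population[band]
    for band in prescribing_rate
)

print(f"Projected annual prescriptions: {expected_demand:,.0f}")
```

The point of the exercise is simply that, as the population make-up shifts, the same group-level patterns translate into different overall levels of demand, which is the kind of nuance planners were asking for.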

This process of exchange isn’t without its challenges. I have heard influential people openly state a preference for evidence that supports what’s already being planned. That almost sounds vulgar when said straight out, but maybe it’s best to be open and honest about our different motivations and prerogatives. No one person or group possesses the truth. A politician might have polling data that suggests a measure is unpopular. How does she or he weigh that against scientific evidence that the measure improves outcomes?

Perhaps that returns us to what went wrong for Colvin and Parenti in The Wire. If some stakeholders aren’t on board at the beginning, then not everyone is invested in the aims or au fait with the assumptions and the approach. If everyone accepts the logic underpinning the research at the start, the scope to ‘explain away’ or ignore the less palatable findings at the end is reduced.

Of course, critical distance is still important. Nobody wants scientific principles traded off to serve an agenda. And some researchers will find the process of continued engagement uncomfortable, demanding perhaps a different set of skills to those that make them good researchers. Would Einstein have written a good stakeholder engagement strategy when applying for funding, or might we still be wondering what E is equal to? Getting the balance right is a challenge still being worked out, but a challenge shared is a challenge halved.

For more information on the use of administrative data in research, check out the ADRN Blog.

 

The featured image in this article has been used thanks to a Creative Commons licence.