Saturday 11 July 2020

What matters, what's researched, what's measured

I've been thrown into the world of the 'validation of research instruments'. Or what you and I might call 'surveys and questionnaires that we are told can be trusted to do the job they say they will do'.

My whole self dislikes and distrusts any tool that says it's been scientifically validated, but is then used by humans on humans to find out about things that can't be observed - like wellbeing or health.

My visionary self has a dream of a different way of living together as humans, and my top priority is thinking how to get from the current reality to something closer to this dream. Dreams usually take lifetimes or more, however hard and however strategically you fumble and work towards them.

My pragmatic self accepts that within the way things currently happen in the policy, practice and research world of health and social care, it is impossible to escape validated research instruments, as they have power and credibility with those in power. Validated research tools are therefore key to improving things within the social and public service systems we currently have. It matters to people's lives that things improve now, within those systems. We have to live in our present, not just dream for the future.

For validated research tools, therefore, the questions become ones of harm reduction, and opening up the debate about these tools.

I am very grateful to an academic* for linking me to a great 'primer' on the validation of research tools, as I've never really known what validation involves. For a published paper with so much information about such an obscure topic, I found it immensely readable: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6004510/
*She's nameless only because I'm writing this early on a Saturday morning and want to get this blog up before Coproduction Week finishes, ie before I can show it to her and check if she'd like to be named

Reading that paper made it clear that developing a validated research instrument is not a task to be undertaken lightly. It is time, brain and cost intensive.

But - some research funders for some types of research require the use of validated research instruments. So if there isn't a suitable validated research instrument, you can't get the funding, and so you can't do the research. Or to put it another way, you can only research something if there is already a research instrument that has been through the validation process. No instrument, no research.

It took a while for the significance of that to sink in for me. Some things that really matter to people's lives cannot be researched because the research cannot be funded because there is no suitable validated research instrument.

And then, add in another two layers of challenge.

1. I have yet to see a validated research instrument that can be used inclusively, across the diversity of people who experience whatever it is that is being researched. If the tool is used in line with its validation, the research will perpetuate the exclusion of people who are already unheard, or heard but ignored when decisions are made that affect their lives. If the tool is used outside of its validation, for example by rephrasing and reframing questions or changing the response options in order to allow a more diverse group of people to take part, it is no longer validated.

2. Step 1 of developing a validated research instrument, according to Boateng and the rest of his research team (ie the people who wrote the paper that I found so helpful), is having a clear conceptual grasp of the topic. Say the topic is 'quality of life'. Whose conceptual grasp of the topic should provide the conceptual framework for the tool? I will always prioritise the conceptual framework of the potential participants whose lives will be affected by any decisions made on the basis of the research. And it is possible to do that. But it is also possible to use pre-existing academic conceptual frameworks. Yep, you can design a new tool based on the frameworks that are based on research that is shaped by what could be done using a previously validated research instrument. I suspect that route is more common, as it is much cheaper and easier to do a literature review than brand new qualitative research with people recruited from your pool of potential future participants.

So what?

So those of us who would prefer to avoid validated research instruments need to engage with the creation of validated research instruments if we are to avoid perpetuating the exclusion and silencing of some groups of people from the kind of research that policy makers and governments listen to.

And it would be doable. But it is a big investment of effort. So it would be important to make sure that the effort is being invested in developing a tool that would allow research to be carried out on a topic of crucial or strategic importance to those who are usually excluded or silenced.

And that means starting with the research on 'what matters', so we know what is worth that investment.

And that is where coproduction comes in.

Doing research to identify 'what matters' cannot be done just by academic researchers, or just by people with lived experience, or just by think tanks, or just by government research teams. Doing this research needs the skills of academics, the skills of people with lived experience who are already involved in policy or practice work, the skills of policy makers, the skills of people responsible for providing public services, the skills of members of the public. What everyone in that list has in common is possessing a set of expertise, experience and perspective that others on that list do not have, but that is needed for top quality health and social care research. In other words, to do top quality health and social care research means developing our skills as research coproducers. And by 'our skills', I mean the skills of everyone in that list.

I would argue that everyone involved in coproducing research needs to learn the skills that are needed to be research coproducers. It does not come naturally to any of us. For those of us with professional training, it won't have been part of that training, so we need training. For those of us without professional training, we need training too.

And I would argue that we need the same training, a training that helps us work out for ourselves how to apply the principles of coproduction to the professional training or lived experience we bring. And a training that helps us see other people's skill sets and the ways those skill sets work together to make us and our research stronger.

Where do we go from here?

I don't know!

I'm putting this out in Coproduction Week, because it is probably the best week of the year to wave a flag saying 'over here!' about something that I've not seen discussed. PLEASE come and find me if you are already grappling with this. I won't be doing anything practical about any of this until I've submitted my doctoral thesis. But this is something that would benefit from a long brew on the back burner. Here's to Coproduction Week 2021, when I will hold myself accountable for what I have (or haven't) done to start developing these rough ideas into actual life-changing, research-changing work.
