
How to transfer research into practice


Well, maybe that’s a misnomer for today’s post, but it does strike at the very heart of some of the more heated debates that I see when I browse the interweb. With all the conflicting research reports into all the various interventions for chronic pain (well, for anything really), how does a clinician decide when the time is right to start incorporating a new practice (such as working with acceptance or mirror therapy or laterality), or begin to phase out an old practice (like distraction or core stability or muscle imbalance)?

This paper, one of a series of excellent papers in Best Practice & Research Clinical Rheumatology on the management of low back pain, discusses in a really accessible way, the various problems that face an earnest clinician who wants to ‘do the right thing’ by his or her patients.

What are the issues?

Most clinicians would be well aware that the ways in which research is conducted can be a whole lot different from the daily reality of clinical work. Researchers are required to carefully select participants, control extraneous variables, ensure the intervention is rigorously applied, and follow the participants up for a good period of time.

Researchers need to spend a long time setting up a trial – but don’t necessarily have to consider the pragmatics of how to do this with a mixed bag of participants, in a clinical setting that can’t control for so many things (like case managers insisting on a certain timeframe, a GP who won’t support the approach, team members who are keen to use their own approach, and managers who wonder why so much time is spent on measuring outcomes for months after the person has been discharged).

Some clinicians use systematic reviews, best practice guidelines or evidence-based summaries to guide practice.

These synthesise the findings of many studies and assemble recommendations on the basis of the strength of evidence for each component. Ostelo and colleagues note that for low back pain, 51 reviews have been completed on spinal manipulation, with 17 deciding the evidence is neutral and 34 giving a positive review. The methodology of the reviews, assessed according to GRADE (the Cochrane Collaboration’s grading system), was generally poor, although the reviews of spinal manipulation that followed the best methodology tended to arrive at positive conclusions. Other factors were also associated with a positive review – assessing just one type of manipulation, including a clinician who used spinal manipulation on the review team, and completing a comprehensive literature search – but because the review methodology was poor overall, strong conclusions couldn’t be drawn.

Implementing guidelines

Well, if clinical guidelines summarise the evidence (to a certain degree, anyway) to reduce some of the demands on busy clinicians, then another question can be asked – how well are guidelines actually implemented? And the answer is – well… even with specific training and support, one study cited in Ostelo and colleagues’ paper found “only modestly improved implementation for certain portions of the recommendations in the Dutch LBP guideline by general practitioners and produced only small concomitant changes in patient management.”

And most guidelines are not presented with such systematic and thorough implementation training. In fact, many are simply delivered through the mail – possibly read by the enthusiastic, but often just ‘filed’.

These are the factors that seem to influence adoption of guidelines:

– lack of knowledge (of the guidelines, or of how to integrate new changes),

– a shortage of time to read and consider new approaches,

– disagreement with the guideline content, or reluctance from colleagues to adhere to the guideline,

– ‘getting lost’ in the large number of different guidelines available.

Not to mention managing the expectations from patients – what? No x-ray for my sore back? What kind of a doctor are you?!

The conclusion this paper comes to is that ‘more coordination’ is needed both to develop and then to integrate practice guidelines. Guidelines can’t simply synthesise the evidence; their developers also need to work alongside patients and practitioners (and funders and health managers!) to discuss and advise on how to use the new evidence in the real world.

What do I think?

I am an advocate of using clinical guidelines, with all their flaws.  I’m also an advocate of clinicians reading original research papers to understand the ways in which the research is carried out and to make a judgement on how closely the ‘research’ approach fits with the clinical world of the practitioner.

I waited for about 12 months, and about that many research papers to be published, before I started to think about how mirror therapy and motor imagery could be implemented in my practice. I’m a slow adopter of new interventions. This is partly because I’ve seen so many waves of ‘new’, ‘improved’ treatments that I’m just a little wary of rushing in with enthusiasm. I’ve seen leg length discrepancy being touted as ‘the answer’, core stability, muscle imbalance, maintaining lumbar lordosis, various types of lifting practice, swiss balls, exercising with weights on the ankles, pulleys, hydrotherapy – oh, the list goes on.

And in the end? A review of exercise in the same journal as this paper finds, yet again, that there is little evidence that any specific type of exercise is better than any other. This is the same finding I’ve seen since the mid-1990s.

For me? Watch, wait, be critical, think carefully, and don’t be blown away by anything that promises magic.  The people I work with have complex needs and have already been through a health system that has often failed them.  I don’t want to promise more than I can deliver.  Go gently, be kind, be flexible, be a little conservative about what can be achieved.

Ostelo, R., Croft, P., van der Weijden, T., & van Tulder, M. (2010). Challenges in using evidence to inform your clinical practice in low back pain. Best Practice & Research Clinical Rheumatology, 24(2), 281–289. DOI: 10.1016/j.berh.2009.12.006

9 comments

  1. “For me? Watch, wait, be critical, think carefully, and don’t be blown away by anything that promises magic.”
    Sounds about right.
    My dad – a GP – used to show me how he checked the methodology of research papers in the BMJ and kept a packet of a fancy anti-emetic on his surgery windowsill, just to remind him…
    It was for the morning-sickness wonder-drug Thalidomide.

    Neil de Reybekill, Chief Executive at Life Research (http://www.liferesearch.co.uk)

    1. What a stunning reminder… Thank you for sharing that wee anecdote, because while we might not be prescribing, we’re all influencing thoughts and beliefs just as powerfully. And thoughts and beliefs have just as much ability to influence people as a drug. Thanks for taking the time to comment, it’s always a pleasure to have someone contribute!

      1. I am an occupational therapy graduate student at Utica College in Utica, NY, and your blog caught my attention because a great deal of how we learn is through evidence-based research. We are taught which resources are reliable and which are not, and how to use the resources we have available. Since we are going to be new graduates soon, we are often encouraged to consider trying new approaches and techniques as we feel comfortable when starting our careers.

        The question that always arose for me was: how do I know that what I am doing is the right thing just because some research articles say it could be? I think your advice at the end of the article said it all – “be kind, be gentle and go slow”. Since I have an occupational therapy background, I would say that to me this means assess your patient’s needs and wants for their therapy. Then propose the treatments or activities you can provide, as well as the benefits and drawbacks of each, and see what the patient thinks. As the professional, I feel I would have to provide the evidence to back up my ideas and always make sure that the patient is informed about what is out there and how it is working for others. That way I would be giving the patient a choice, and if they say okay, the new idea may work for them and then I have even more support for my next patient. I am learning slowly that people always ask our advice on healthcare issues; it comes with being a health care professional. Maybe that is the real reason we try to keep up with current research and what is going on in healthcare. My thoughts are that no matter what treatment a health care professional is thinking of providing, the patient should always be included and well informed.

      2. I think you’re right – it’s important to know what the scientific position is on any intervention we provide, and to discuss this openly with the people we work with. It’s also important to know what we don’t know, and how to refer on to others as necessary.
        I find motivational interviewing is a useful approach to help people decide what is important to them, and where to start with any input. This way we’re working alongside people and aligned with their values – and provided we’re clear about our own position, and not working contrary to our own knowledge and values, then it’s up to the patient/person to decide what to prioritise.
        Of course that’s easier said than done – so it is a case of waiting until evidence accumulates before adopting a new approach – or at least be up-front when we do decide to start using something innovative.
        I suppose my concern arises when I see occupational therapists (and others) assuming that new treatments do ‘work’, or that older treatments ‘work’ when the research into effectiveness has not been established – activity pacing is a great example because there really is little evidence to support (or not) its use, yet it’s a common intervention!
        As a graduate student, it might tickle your research fancy a bit to think that there is a whole lot out there that we don’t know – yet! More research is needed!

  2. From my work in an organisation supporting primary healthcare practice in South Africa, my opinion is that it’s critical that on-site training and support be given for the implementation of evidence-based guidelines; mentorship.

    I don’t believe it’s sufficient to just have “roadshows”, presentations or Q&As. To get widespread adherence to best practices, I believe mentorship would probably prove to be an effective, if costly, model.

    1. I wonder whether any government would invest the funding for something as intensive as that! Of course, if you weighed up the cost of all the ineffective treatments that continue to be carried out, and invested THAT into training, maybe it would even out?! Thanks for taking the time to comment, I appreciate it.

  3. Thank you for yet another interesting and inspiring post. Like you, I find it very important to discuss and improve how we transfer data from research to clinical work – that is, to improve life for the patients and at the same time remember the huge individual differences.
    Actually your post inspired me to write about it on my own blog Picture of Pain.
    Thanks again
    Kim
