Sunday, 23 December 2012

What evidence supports what you're being told to do?

Communication, evidence and ignorance

We had a patient come in with irreversible pulpitis in a lower premolar a few days ago. The student treating the patient was clinically capable: she opened the tooth, did all the necessary instrumentation and was ready to obturate. But she hesitated. She had been taught that you don't obturate when a tooth (or its periodontal ligament) is symptomatic. We dressed the tooth and had a look to see whether anyone had researched this (I'm a cynic by nature, remember).

A quick search of PubMed on clinic turned up randomised controlled trials involving symptomatic teeth, some of which found no difference in postoperative pain or success whether the tooth was obturated at the first visit or left for a second appointment, and some of which did. There was also the Cochrane review from 2008 that found no difference in healing outcomes whether the tooth was treated in one or two visits, though it did find that single-visit treatment resulted in significantly more people taking painkillers.

When we discussed this, the students said they had assumed that what they were told in their course was based on black-and-white evidence: obturate when the tooth is symptomatic and your failure rate is higher. In fact this recommendation was based on the personal opinion of the teacher (I checked), who thought dentists and patients would be happier knowing the tooth had settled down before obturating, perhaps drawing on some of the studies showing that single-visit treatment resulted in more painkillers being taken. I know this person very well and I am fairly certain that there was no intention to mislead students into thinking healing was better in two-visit treatments (evidence level: personal opinion). This to me seems to be as much about miscommunication as anything else: it isn't clear to the students what the basis of a recommendation is (evidence level: personal opinion).

Stay with me here as I move from undergraduate teaching to continuing professional development (CPD).

CPD: just wise thinking or based in high level evidence?

A few weeks ago Martin Kelleher wrote a decent opinion piece in the BDJ asking how much of what is taught on CPD courses gets translated into practice. He noted the difficulty of measuring the uptake of new knowledge in practice. There are all sorts of issues around this, amongst them: does a user of the knowledge need to use it exactly as it was taught for it to qualify as being used; what if it gets used, but only months or years later; and even if it gets used, does it change patient outcomes?

Martin is right to raise these issues but I think there is something else we need to think about, whether we are teachers or learners.

When we sit down in a lecture or some other environment where we hope to learn something, the "knowledge" we gain from it could come from many sources. We are often listening to someone we consider an expert (at least, relative to us) and they have experience beyond ours. But their experience is still limited to the things they have done, and rarely have they compared what they have done to what they haven't done in an objective and open way. Don't get me wrong - in dentistry this may often be the best evidence we can get, even after thorough searches of the medical databases.

Yet when those educating us tell us what the best thing to do is, it is often unclear whether what we are being told is based on their experience, some old dental folklore, a lab study from which they are extrapolating, a single clinical study at high risk of bias, or a systematic review of high-quality and highly relevant clinical trials. Does this matter?

Well, if you, like me, want to do the things that are most likely to benefit the patients you treat and to minimise the time you waste trying out useless techniques, then surely it makes sense to know how likely it is that the change being advocated will improve your patients' outcomes. You'll want to know what the recommendations are based on.

Learning the evidence level too

But are you told - whether as an undergraduate or as someone doing CPD or postgraduate studies - the level of evidence supporting what you are being taught? My personal experience (evidence level: personal experience), from which I generalise, is that you aren't. Are you happy about that?

I'm not. I feel that anyone teaching others should be open about whether what they teach is based on a high level of evidence or something less than that. To me that is simply respecting that, as learners, we need the information necessary to help us decide whether to change our practice, or in some other way apply the teaching to practice - or not.

So what do we do? 

Well, evidence levels have been used in guidelines for years. We could start adapting this in a way that does not make CPD and other teaching clunky but which gives learners a summary of the levels of evidence supporting the key components of the teaching - perhaps a page with evidence levels and references that accompanies the course. And as students (we're often both teachers and students at the same time) we ought to ask our teachers to provide these.
