Saturday, 29 December 2012

General Dental Council, CPD and what the point of it all is

The General Dental Council is, until the end of January 2013, seeking comments from the public and dental professions on its new ideas for CPD.


As I've learnt more about the way we do - and, largely, don't - put new knowledge into practice, I have become more sceptical that sitting in a lecture theatre for 15, 20 or 50 hours a year will change my practice much. Indeed, there is a systematic review published by the EPOC group at Cochrane that looked at various educational meeting formats (conferences, lectures, workshops, seminars, symposia, and courses) to see how effective they were at improving professional practice or health outcomes, how they compared to other types of interventions to improve practice or health outcomes, and whether they could be made more effective by changing the way they were delivered.

The review included 81 randomised controlled trials. These covered a wide range of educational delivery and support, including didactic teaching, interactive teaching, mixed formats, reminders, patient education materials, supportive services, feedback reports, and educational outreach. The authors judged that 17 studies had a low risk of bias, 44 a moderate risk, and 20 a high risk.

...if 100 people attended an educational meeting and 100 didn't, 6 more in the attending group would comply with the 'desired practice' afterwards than in the non-attending group.

Overall they found that any intervention in which educational meetings were a component resulted in a median risk difference for complying with 'desired practice' of 6% (interquartile range 1.8% to 15.9%) in the low and moderate risk of bias studies. What this means is that if 100 people attended an educational meeting and 100 didn't, 6 more in the attending group would comply with the 'desired practice' afterwards than in the non-attending group.
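
To make that concrete, here is a minimal sketch (in Python, purely illustrative). The 40% baseline compliance rate is my own assumption for the sake of the example; the review reports only the risk difference.

```python
# Purely illustrative: the 40% baseline compliance rate is an assumption for this
# example; the review reports only the median risk difference of 6%.
baseline_compliance = 0.40   # assumed proportion complying with 'desired practice' anyway
risk_difference = 0.06       # median risk difference reported by the review

attenders = 100
non_attenders = 100

compliers_without_meeting = round(non_attenders * baseline_compliance)                # 40
compliers_with_meeting = round(attenders * (baseline_compliance + risk_difference))   # 46

# About 6 extra people per 100 comply after attending a meeting
print(compliers_with_meeting - compliers_without_meeting)
```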

Mixed didactic and interactive meetings had a risk difference of 13.6%, didactic meetings a risk difference of 6.9%, and interactive meetings a risk difference of 3.0%, when compared to no intervention.

There are a number of other reviews from EPOC that look at other ways of trying to improve practice and health outcomes, such as audit and feedback and educational outreach visits. These find risk differences of 4.3% (interquartile range 0.5% to 16%) and 5.6% (interquartile range 3.0% to 9.0%) respectively. Thus, whilst these interventions don't seem to differ much in their effectiveness at changing professional practice, educational meetings (which are a core part of verifiable CPD) are not the only game in town.
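
For what it's worth, here is the same back-of-the-envelope comparison as a sketch - it simply restates the median figures quoted above as extra compliers per 100, nothing more:

```python
# Median risk differences quoted above, expressed as extra compliers per 100.
# These are the figures reported in the EPOC reviews; nothing here is new data.
median_risk_differences = {
    "educational meetings (any component)": 0.060,
    "audit and feedback": 0.043,
    "educational outreach visits": 0.056,
}

for intervention, rd in median_risk_differences.items():
    print(f"{intervention}: {rd * 100:.1f} extra compliers per 100")
```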

the GDC needs to think more broadly about what counts as useful CPD

So it seems to me that if the intention of CPD is - at least in part - to help improve patient outcomes, then the GDC needs to think more broadly about what counts as useful CPD. After all, if attending a meeting that gives me an hour's worth of CPD results in no improvement in patient outcomes, what was the point? But if instead I read a Cochrane review (for which I receive no hours because it is non-verifiable) and yet change the way I and my colleagues manage patients for the better by using audit and feedback and peer support, wouldn't that be something that really counts?

Please go read the documents and give them some feedback before the end of January 2013...

Sunday, 23 December 2012

What evidence is supporting what you're being told to do?

Communication, evidence and ignorance

We had a patient come in with irreversible pulpitis in a lower premolar a few days ago. The student treating the patient was good clinically: she opened the tooth, did all the necessary instrumentation and was ready to obturate. But she hesitated. She had been taught that you don't obturate when a tooth (or its periodontal ligament) is symptomatic. We dressed the tooth and had a look to see whether anyone had researched this (I'm a cynic by nature, remember).

A quick search of PubMed on clinic turned up randomised controlled trials involving symptomatic teeth, some of which found no difference in postoperative pain or success whether the tooth was left for a second appointment or not, and some of which did. There was also the Cochrane review from 2008, which found no difference in healing outcomes whether the tooth was treated in one visit or two, though it did find that single-visit treatment resulted in significantly more people taking painkillers.

When we discussed this, the students said they had assumed that what they were told on their course was based on black and white evidence: obturate when the tooth is symptomatic and your failure rate will be higher. In fact the recommendation was based on the personal opinion of the teacher (I checked), who thought dentists and patients would be happier knowing the tooth had settled down before obturating, perhaps drawing on some of the studies showing that single-visit treatments resulted in more painkillers being taken. I know this person very well and I am fairly certain there was no intention to mislead students into thinking that healing was better with two-visit treatment (evidence level: personal opinion). This seems to me to be as much about miscommunication as anything else: it isn't clear to the students what the basis of a recommendation is (evidence level: personal opinion).

Stay with me here as I move from undergraduate teaching to continuing professional development (CPD).

CPD: just wise thinking or based in high level evidence?

A few weeks ago Martin Kelleher wrote a decent opinion piece in the BDJ asking how much of what is taught on CPD courses gets translated into practice. He noted the difficulty of measuring the uptake of new knowledge in practice. There are all sorts of issues around this, amongst them: does a user of the knowledge need to apply it as it was taught for it to qualify as being used? What if it gets used, but many months or years later? And even if it gets used, does it change patient outcomes?

Martin is right to raise these issues but I think there is something else we need to think about, whether we are teachers or learners.

When we sit down in a lecture or some other environment where we hope to learn something, the "knowledge" we gain from it could come from many sources. We are often listening to someone we consider an expert (at least, relative to us) who has experience beyond ours. But their experience is still limited to the things they have done, and rarely have they compared what they have done to what they haven't done in an objective and open way. Don't get me wrong - in dentistry this may often be the best evidence we can get, even after thorough searches of the medical databases.

Yet when those educating us tell us what the best thing to do is, it is often unclear whether what we are being told is based on their experience, some old dental folklore, a lab study from which they are extrapolating, a single clinical study at high risk of bias, or a systematic review of high-quality and highly relevant clinical trials. Does this matter?

Well, if you, like me, want to do the things that are most likely to benefit the patients you treat and minimise the time you waste trying out useless techniques, then surely it makes sense to know how likely it is that the change being advocated will improve your patients' outcomes. You'll want to know what the recommendations are based on.

Learning the evidence level too

But are you told - whether as an undergraduate or someone doing CPD or postgraduate studies - the level of evidence supporting what you are being taught? My personal experience (evidence level: personal experience), from which I generalise, is that you aren't. Are you happy about that?

I'm not. I feel that anyone teaching others should be open about whether what they teach is based on a high level of evidence or something less than that. To me that is simply respecting that, as learners, we need the information necessary to help us decide whether to change our practice, or in some other way apply their teaching to practice - or not.

So what do we do? 

Well, evidence levels have been used in guidelines for years. I would think we could create an adaptation of this that does not make CPD and other teaching clunky but gives learners a summary of the levels of evidence supporting the key components of the teaching - perhaps a page of evidence levels and references that accompanies the course. And as students (we're often both teachers and students at the same time) we ought to ask our teachers to provide these.

Friday, 21 December 2012

December issue of EBD journal out



The December issue of the Evidence-Based Dentistry Journal is out. 

If you are an undergraduate at QMUL you have full access through your institutional login. 

If you're a BDA member you get access through your automatic subscription to the BDJ. 

If you're working for the NHS in England and are not a member of the BDA then you should get access through your local NHS library using Athens (register here).

Wednesday, 19 December 2012

Prezi

This post has little to do with evidence-based dentistry but is simply to draw attention to Prezi. I have used this presentation software in preference to PowerPoint for a couple of years and personally find it much more fun to use. I was chatting about this with a couple of students on clinic this week and thought it might be helpful for others to know about it too.

Students and teachers get to use Prezi for free so long as it is used for educational purposes. See here. Below is an example of a Prezi I used when presenting to the British Society of Gerodontology - it won't make much sense without me speaking, but it'll give you a sense of what is possible.



Tuesday, 11 December 2012

NHS OpenAthens access in the UK - yippee...perhaps



A couple of days ago I blogged about how hard it is to get access to journals without an institutional subscription, imagining that most general dental practitioners wouldn't have such access. Well, I learnt something today. If you work in the UK and are treating patients under an NHS contract, it seems you may well be able to access at least some journals (though I've yet to find out how many).

It takes a little working out, but basically: even if you don't have an NHS or academic email address, if you have some form of professional email address (this can be one from where you work, e.g. a practice email address) and/or a practice website, get in touch with your local deanery administrator. Letting them know your GDC number may also help speed things along. If you don't have an academic or NHS email address, it seems that trying to register automatically online will just result in you being rejected.

Good luck.

Sunday, 9 December 2012

Dental Journal Access - ha! ha! ha!

I presented to the British Society of Gerodontology on Thursday last week on how to find and access research, with an emphasis on Atraumatic Restorative Treatment (Prezi available here).

I had begun to develop an interest in the problem of access for those outside the institutional subscriptions that come with being part of an educational establishment or NHS Trust, because a couple of former students had contacted me to say this was now a problem for them in practice.

So, for the BSG presentation, I did a little investigation into the access one can obtain to the top 81 journals as ranked by impact factor, under four conditions:

  1. no institutional access;
  2. no institutional access, but 3-4 years after publication;
  3. an educational institution's access (in my case my university's - QMUL);
  4. the access you could obtain as a member of the British Dental Association through their library.

I was not surprised to find that only 12% of these journals allow access to non-subscribers, but I was very pleasantly surprised that the BDA has access to 87% of them. I have been in contact with the very helpful staff at the BDA library, and it seems unlikely that the BDA will be able to afford the high cost of online institutional access for its members. They charge £2.50 per article they copy or scan for you, but my feeling is that if one reads the abstract well and chooses only the studies with the most appropriate design (i.e. controlled trials for intervention studies), one could keep the number of articles down to a minimum.

I think there are two further points that stand out for me here. One is that accessing research needs to be easy and immediate if we are to encourage its use in day-to-day decision-making, and the current difficulty accessing journals hampers this (even with the excellent BDA service). Second, this makes it even more important that the dental profession as a whole begins to contribute to the summarising of research in the form of freely accessible systematic reviews or guidelines. It would be much simpler for all of us if there were up-to-date summaries of the research relating to particular clinical problems that we could access any time we wanted.

Because of this, I'm beginning to think we could work collaboratively, in the form of an evidence-based dentistry wiki, to collect, critique and summarise evidence relating to clinical problems. Anybody interested in helping me out here - please get in touch!