The General Dental Council (GDC) is seeking comments from the public and the dental professions on its new ideas for continuing professional development (CPD) until the end of January 2013.
As I've learnt more about the way we do - and, largely, don't - put new knowledge into practice, I have become more sceptical that sitting in a lecture theatre for 15, 20, or 50 hours a year is going to change my practice much. Indeed, there is a systematic review published by the Cochrane EPOC (Effective Practice and Organisation of Care) group that looked at various educational meeting formats (conferences, lectures, workshops, seminars, symposia, and courses) to see how effective they were at improving professional practice and health outcomes, how they compared with other types of intervention, and whether they can be made more effective by changing the way they are delivered.
The review included 81 randomised controlled trials. These covered a wide range of educational delivery and support, including didactic teaching, interactive teaching, mixed formats, reminders, patient education materials, supportive services, feedback reports, and educational outreach. The authors judged 17 studies to be at low risk of bias, 44 at moderate risk, and 20 at high risk.
Overall, they found that any intervention in which educational meetings were a component resulted in a median risk difference for complying with 'desired practice' of 6% (interquartile range 1.8% to 15.9%) in the low and moderate risk of bias studies. What this means is that if 100 people attended an educational meeting and 100 didn't, roughly 6 more in the attending group would comply with the 'desired practice' afterwards than in the non-attending group.
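To make the arithmetic behind a risk difference concrete, here is a minimal sketch in Python. The 30% baseline compliance rate is purely hypothetical (the review reports only the difference, not the underlying rates), and the function name is mine for illustration, not anything from the review.

```python
# Illustrative arithmetic only: the review reports a median risk difference of 6%,
# not the underlying compliance rates, so the 30% baseline used here is hypothetical.

def risk_difference(compliers_intervention, n_intervention, compliers_control, n_control):
    """Proportion complying with 'desired practice' in the intervention group
    minus the proportion complying in the control group."""
    return compliers_intervention / n_intervention - compliers_control / n_control

# Hypothetical example: 30 of 100 non-attenders comply; 36 of 100 attenders comply.
rd = risk_difference(36, 100, 30, 100)
print(f"Risk difference: {rd:.0%}")  # -> "Risk difference: 6%"
```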
Mixed didactic and interactive meetings had a risk difference of 13.6%, didactic-only meetings 6.9%, and interactive-only meetings 3.0%, all compared with no intervention.
There are a number of other EPOC reviews that look at other ways of trying to improve practice and health outcomes, such as audit and feedback and educational outreach visits. These find median risk differences of 4.3% (interquartile range 0.5% to 16%) and 5.6% (interquartile range 3.0% to 9.0%) respectively. Thus, whilst these interventions don't seem to differ much from educational meetings in their effectiveness at changing professional practice, educational meetings (which are a core part of verifiable CPD) are clearly not the only game in town.
So it seems to me that if the intention of CPD is - at least in part - to help improve patient outcomes, then the GDC needs to think more broadly about what counts as useful CPD. After all, if attending a meeting that gives me an hour's worth of CPD results in no improvement in patient outcomes, what was the point? But if instead I read a Cochrane Review (for which I receive no hours because it is non-verifiable) and yet change the way my colleagues and I manage patients for the better, using audit and feedback and peer support, wouldn't this be something that really counts?
Please go read the documents and give them some feedback before the end of January 2013...