Qualitative methodology is pants and has no role in evidence-based practice
For studying the efficacy (how well it worked in the trial) and, indeed, the effectiveness (how well it worked in practice) of a drug, the randomised controlled trial, with its quantitative output of numerical data on success or otherwise in treating a given condition, is the ideal, and I would not argue otherwise. Within the limited context and the restricted set of patients in which the trial is conducted, a well-conducted trial will allow some estimate of the "truth" of the efficacy of the treatment, at least for the outcomes being measured.
But evidence-based practice is about much more than a risk or odds ratio and p-values or confidence intervals. These numerical - quantitative - outputs are but a tiny element of what I understand as evidence-based practice (EBP).
Rather than looking at qualitative studies per se, e.g. "how dentists use or don't use evidence in their practices" (rather than "how many use evidence"), in this blog I just want to draw attention to the way we use qualitative methods to deal with quantitative data in EBP.
Or perhaps qualitative methodology does have a role after all...
So if a study tells me that putting a stainless steel crown on a carious deciduous tooth, rather than filling it with a composite, will result in 12% fewer such teeth needing to be extracted, I am grateful for this quantitative information on the efficacy of the intervention. I need it to understand what the potential benefit of the treatment could be for my patient from the point of view of losing a tooth.
Qualitative critical appraisal
But should I believe that study? Was the randomisation adequate, was allocation concealed, were the outcome assessors blinded, were losses to follow-up accounted for? These are not questions that can be answered reliably quantitatively. In a sense we are analysing a text (the report of a study) to try to construct some idea of what it means. Does this explanation mean the study is likely to be reliable or not? And this, I would argue, is a qualitative process: we are constructing an idea in our heads of whether we think the story the report tells is likely to be the truth. Someone else could well construct a different opinion that is contrary to ours. How many times have you read in systematic reviews that disagreements were resolved through consensus or by a third reviewer?
Qualitative understanding of patient values and clinical experience and expertise
In contrast to the positivist "truth" of the study, the truth for a patient of what is (for want of a better term) in their interests and meets their values and aspirations could be very different. Perhaps the outcome of the study is not the outcome that interests them. Or perhaps, even if it is, they ascribe a different value to a tooth lasting only one year rather than five.
Likewise, the truth for the clinician about the effectiveness of the treatment may be vastly at odds with the researchers' results as they try to run a small business, manage a clinic, decide which hands-on courses to attend (and which not to), and make sense of their colleagues' opinions about the research, its value, and their own experience of using the treatment...
The questions of why people do things, and what drives them to act or not, are inherently qualitative, and as clinicians trying to practise in an evidence-based way we make decisions of this kind every day.
So my conclusion here is that, as we teach and train colleagues and students to practise EBP, we should not forget the essential part that qualitative methods play in making sense of quantitative data and in helping us use it where it is appropriate. As we move forward, we may want to think about how to develop some rigour in this process, as the various tools for critical appraisal have sought to do.