There is increasing interest in evidence-based dentistry (EBD) internationally and from a wide variety of dental groups. This is accompanied by a welcome and unprecedented degree of collaboration between individuals and organisations who seek to exploit the potential of EBD to improve patient care. For example, there are increasing numbers of systematic reviews registered with the Cochrane Oral Health Group (COHG; see www.cochrane.co.uk); the Evidence-Based Dentistry Network has been formed by the International Association for Dental Research; and links have been established between EBD researchers and dental associations, such as those led by the International Dental Federation (www.fdiworldental.org). Examples at national level include initiatives from the American Dental Association (ADA; see its “Position Statements” at www.ada.org/prof/resources/positions/statements/evidencebased.asp) and a number of strands of activity across the UK National Health Service which seek to incorporate EBD into the modernisation of dental services1 and to co-ordinate EBD activities.2

This level of interest and the number of professionals and public groups encountering EBD for the first time mean that it is particularly important for the very broad scope of EBD to be appreciated. This is not a subject focused only upon the electronic retrieval of (sometimes obscure) scientific papers, or an activity dealing solely with the minutiae of critically appraising published research reports and reviews. It is a subject area and philosophy of care that seeks to ensure that clinical dentists or dental team members and their patients are equipped with unbiased, up-to-date, synthesised scientific knowledge about the best treatment alternatives for a contemporary clinical problem or clinical decision. In order to meet these aspirations, EBD has to cover a wide range of interlocking facets, rather like the familiar form of a jigsaw puzzle (Figure 1).

Figure 1. What is evidence-based dentistry? The matrix: how findings from research should influence research, education and practice.

This vision of integrated and comprehensive EBD represents an ongoing, iterative process in which appropriate and relevant examples of primary research are taken through an interlocking matrix of activity. During this, they are appraised critically and synthesised in systematic reviews (across the top of the puzzle). These reviews are then disseminated effectively and widely with the aid of researchers, policy makers, professional and dental care organisations and industry, in conjunction with undergraduate, postgraduate and continuing dental education interests (back across the middle of the puzzle). The penultimate, often neglected, step is for clinicians wishing to improve the care of their informed patients to implement the findings effectively and in good time (back along the bottom of the puzzle). The final step is to ensure that this new, modified activity is itself the subject of robust and objective scrutiny, feeding back into the agenda for new primary research at the start of the puzzle.

This vision has yet to be achieved on a wide scale. Although some individual dentists and dental scientists have been carrying out a number of the key steps outlined above (albeit without using recent labels and jargon), there have been, and continue to be, many blind spots and delays in the knowledge-transfer process in dentistry, as in other areas of healthcare. The move to evidence-based healthcare is a global phenomenon but it is taking place at a variety of speeds in different countries.

The evidence-based healthcare philosophy requires an open approach from health professionals. High value is now given to robust research findings. Lower-quality research and expert opinion alone are given more limited credence, but they become important guides in areas where there is no high-quality research relating directly to the clinical question. In a number of countries, patients are also becoming increasingly empowered and their views increasingly influential, for example in the Options for Change developments in England.1

EBD is designed to help the clinician and patient when addressing specific clinical dental questions. It has been defined by the ADA as “an approach to oral health care that requires the judicious integration of:

  • systematic assessments of clinically relevant scientific evidence, relating to the patient's oral and medical condition and history, with

  • the dentist's clinical expertise, and

  • the patient's treatment needs and preferences.”

The rest of this editorial focuses on the top line of the jigsaw, addressing research and synthesis (see Figure 1). In later issues of this journal, Parts 2 and 3 will cover dissemination and implementation, respectively.

Primary research

The output of individual studies, or primary research, is the backbone of the EBD philosophy. If there is a wealth of high-quality studies directly addressing the aspect of treatment or diagnosis relevant to the management of an individual case, then EBD can readily realise its potential. The coverage of the area in question may be patchy, however, or may offer little generalisable evidence directly relating to a particular clinical problem. In these cases, the acknowledgement and rational management of uncertainty, combined with good clinical judgement, are essential.

Another difficulty with existing primary research is that the coverage of clinical dentistry is very uneven. Whereas the Cochrane Oral Health Group database contains thousands of dental trials, these are clustered in specific areas and are of variable quality (see www.cochrane.co.uk). This patchy coverage reflects the vagaries of research funding and fashion over the past decades. Unfortunately, many of the areas of key importance to routine clinical dental practice have been relatively neglected in terms of primary research. In many countries it is hard to attract funding for studies on such topics because, in a very competitive sphere, scarce resources are prioritised into more medically aligned, long-term or academically esoteric areas. Other obstacles to redressing this balance are the length of time it takes to design, fund, undertake and publish high-quality primary research in clinical areas, and the difficulty in many countries of recruiting, training and retaining clinical academic staff to undertake such studies.

Critical appraisal

Once suitable outputs of primary research have been identified, the next stage in the EBD process is to appraise the studies critically, to establish whether the findings outlined in published work are valid and free from undetected biases in design, execution or reporting. It might be assumed that all articles in scientific and professional journals would be of very high quality and free from bias. Regrettably, this is not the case, because refereeing and editorial processes vary across and within journals. Nevertheless, it is important to balance such judgements about critical reviewing, to avoid excessive cynicism and the inappropriate rejection of valuable clinical evidence.

However well individual clinicians are trained in appraisal skills, significant difficulties remain in practising them. Amongst these are securing prompt and convenient access to publications near to where patients are treated, managing to sift through the huge volume of work currently being published, and finding adequate dedicated time. There are also significant tensions in the field of critical appraisal, because expert methodological review of existing and older published work across evidence-based medicine (and EBD) shows that much of it falls short of modern standards of objectively measured methodological quality. It is important to establish a culture in which constructive critique of previous research is acceptable but, equally, in which useful research conducted to the standards of the day is integrated appropriately into the evidence base. Recent examples of this difficulty in EBD are seen within the systematic reviews related to the US National Institutes of Health Consensus Development Conference on the Diagnosis and Management of Dental Caries.3

It is imperative that all clinicians are trained to adopt a constructively and professionally critical mindset in making their own assessments of published work. On a national level, the size and complexity of this training task are somewhat daunting. The critical appraisal process considers not only the quality of a particular study but also its relevance to the clinical question at hand. Thus, a range of questions will be asked to establish whether the population under study, the setting or service environment in which it was carried out, and the particular procedure or intervention assessed, can be related to the clinical case or problem under consideration. This test of the generalisability of research findings is important in deciding the degree to which the findings should influence individual clinical practice.

Systematic reviews

Systematic reviews are the cornerstone of EBD and should consist of a thorough, unbiased, explicit, transparent and systematic process whereby all the evidence pertaining to specific, well-defined review questions is sought and appraised. Again, this should be in terms of both quality and relevance. The care with which the review question is specified is key to the utility and quality of the subsequent review. Until quite recently, literature reviews (secondary research) were regarded in many ways as being inferior to new (or primary) research. The balance has now shifted, and it is appreciated that the value of each new study should be established by comparison with an unbiased and comprehensive review of the previous work in that specific area.
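By way of illustration only (neither the framework nor the clinical details below are taken from the text above), review questions of this kind are commonly framed in "PICO" terms: population, intervention, comparison and outcome. A minimal sketch, assuming one simply wished to record such a question in a structured form, might look like this; the clinical example is invented.

```python
from dataclasses import dataclass

@dataclass
class ReviewQuestion:
    """A review question framed in PICO terms (hypothetical example)."""
    population: str    # who the evidence must relate to
    intervention: str  # the treatment or procedure under review
    comparison: str    # the alternative against which it is judged
    outcome: str       # the clinically relevant end point

# Invented example, for illustration only.
question = ReviewQuestion(
    population="adults with early occlusal caries in permanent molars",
    intervention="resin fissure sealant placed over the lesion",
    comparison="monitoring without a sealant",
    outcome="progression of the lesion into dentine over 24 months",
)
print(question)
```

The more precisely each of these four elements is pinned down, the easier it is to decide which studies belong in the review and which do not.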

Having identified as much of the evidence as possible, the next step involves the systematic synthesis of findings using robust and comparable analyses. This will establish the strength of evidence across a number of studies, research teams, designs or countries. Ideally, one would want to review a number of randomised controlled trials (RCT) and conduct a meta-analysis: a rigorous, composite overview of the degree of success of the particular intervention in question. Given the amount of effort and time taken to undertake such reviews, it is important not to duplicate work unnecessarily; the international EBD/evidence-based medicine community is usually extremely generous in sharing search strategies and results to minimise such duplication. This type of review is often led by the Cochrane Collaboration, which has a specific Oral Health Group that maintains a register of randomised trials in dentistry. The Cochrane Collaboration is an international organisation whose purpose is to help people make informed decisions about healthcare by preparing, maintaining and promoting the accessibility of systematic reviews of the effects of healthcare interventions. Systematic reviews of other types of study design are also undertaken; in the UK this is done most notably by the Centre for Reviews and Dissemination, based at the University of York.
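To make the idea of quantitative synthesis concrete, the sketch below shows a fixed-effect, inverse-variance meta-analysis, one standard way of pooling study-level results. It is a minimal illustration only: the function name and the effect estimates are invented for the example, and real Cochrane-style reviews use dedicated statistical software and richer models (random-effects analyses, tests of heterogeneity, and so on).

```python
import math

def pool_fixed_effect(estimates, standard_errors):
    """Inverse-variance (fixed-effect) pooling of study-level effect estimates.

    `estimates` might be mean differences or log odds ratios from individual
    trials; each study is weighted by the inverse of its variance.
    """
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval for the pooled estimate
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

# Invented numbers, for illustration only: mean caries-increment differences
# (intervention minus control) from three hypothetical trials.
effects = [-0.40, -0.25, -0.55]
ses = [0.15, 0.20, 0.25]
pooled, ci = pool_fixed_effect(effects, ses)
print(f"Pooled effect {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```

The point of the example is simply that larger, more precise trials carry more weight, and that the pooled estimate comes with its own measure of uncertainty across the included studies.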

Systematic reviews are also undertaken as a necessary prelude to drawing up evidence-based clinical guidelines. Where there is plentiful evidence and a reasonable number of high-quality RCT, this is straightforward and there are well-identified systematic processes for undertaking them.4 Where evidence is in different forms (for example, diagnostic studies) and/or of very variable quality, these reviews become more difficult to conduct and interpret. One example is the evidence on radiographic diagnostic yield and safety, which was reviewed in an evidence-based guideline on selection criteria in dental radiography.5 Another increasingly difficult area, given the large gaps in the evidence covering clinical dentistry, is where reviews are clearly indicated but there is little or no high-quality scientific evidence directly applicable to the review question. A recent example is clinical examination and record keeping, for which the Faculty of General Dental Practitioners in the UK sought to produce an evidence-based guideline but, with insufficient evidence, had to publish a good-practice guideline instead.6

It is not always appreciated that the outcomes of systematic reviews are inherently unpredictable: they depend on objective analyses of the studies that meet specific inclusion criteria, and some of those studies may not be well known to researchers or policy makers in advance of the review. Equally, some popular and oft-cited studies may unexpectedly fail to pass pre-set thresholds of methodological quality.

It is hoped that this consideration of the components of EBD will be helpful in understanding the subject as a whole, and how the various pieces of the jigsaw can and should fit together.