1 Office of Education Research, Mayo Medical School, Rochester, Minnesota, USA
2 Division of General Internal Medicine, Mayo Clinic, Rochester, Minnesota, USA
3 Division of Biomedical Statistics and Informatics, Mayo Clinic, Rochester, Minnesota, USA
INTRODUCTION
As medical education research continues to proliferate, syntheses of this evidence will become increasingly important. Systematic reviews play a critical role
in this process of synthesis by identifying and
summarising published research on focused topics
and highlighting strengths and weaknesses in that
field. Although some have criticised systematic
reviews as engendering false confidence in their
objectivity and freedom from bias,1 others have
argued for a more balanced role.2 Of the 108
applications for funding to conduct a literature
review received in the past 2 years by the Society of
Directors of Research in Medical Education, over half
have proposed systematic reviews. As chair of the
review committee, author DAC has observed that
many applicants fail to anticipate the key actions
required in a rigorous review. It appears that guidance on how to conduct a high-quality systematic
review in medical education is needed. Although a
number of books have been published on systematic
reviews, and previous articles in the medical education literature have highlighted challenges and
provided brief tips,3–5 we are not aware of any concise
guidelines offering a structured approach to planning, conducting and reporting a systematic review in
medical education.
The purpose of this article is to provide a concise
and practical guide to the conduct and reporting of
systematic reviews, with particular attention to issues
affecting medical education research. Table 1
summarises the key steps. An annotated bibliography (Appendix S1, online) lists resources that
elaborate on each of the methods that will be
described. At each step, we will illustrate the
principles involved using text drawn from a recent
review of simulation-based education published by
the first author.6
THE PROCESS
Assemble a team
Table 2

Tool                             Example
Teleconferencing                 Various options
Videoconferencing                Skype
Wikis
Spreadsheet
Bibliographic software23         EndNote, Reference Manager
Translation software             Google Translate
Purpose-built review software    DistillerSR, EPPI-Reviewer 4
Meta-analysis software           SAS and SPSS
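Table 2 lists meta-analysis software among the reviewer's tools. The core computation such packages perform for a fixed-effect synthesis, inverse-variance pooling of per-study effect sizes, can be sketched in a few lines of Python. The effect sizes and variances below are invented for illustration and are not drawn from any study cited in this article.

```python
import math

def pooled_effect(effects, variances):
    """Inverse-variance (fixed-effect) pooled estimate with a 95% CI.

    Each study is weighted by the inverse of its variance, so more
    precise studies contribute more to the pooled estimate.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical standardised mean differences and variances from five trials
effects = [0.45, 0.60, 0.30, 0.52, 0.10]
variances = [0.04, 0.09, 0.05, 0.06, 0.08]
est, (lo, hi) = pooled_effect(effects, variances)
print(f"Pooled SMD = {est:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A random-effects model, often more defensible given the heterogeneity typical of education research, additionally estimates between-study variance (tau-squared) and adds it to each study's variance before weighting.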
Table 3
Methods
Divide the Methods into five sections, with headings pertaining to: a focused question; search strategy; inclusion
criteria; data abstraction, and analysis
Most of this information can be pulled from your written protocol
Results
Divide the Results section into at least four sections, with headings pertaining to: trial flow; study characteristics;
study quality, and synthesis
Trial flow: describe the number of studies identified in the search, the number included and excluded, the number
remaining for full review, and any special issues that arose during this process
Study characteristics: describe a few (three to five) of the most salient study features
Study quality: describe the most important aspects of study quality
Synthesis: present the results of the narrative or quantitative data analysis. The synthesis narrative should distil the
evidence into a clear message for readers. Articles that support a similar point (either favourable or unfavourable
to the overall conclusion) should be grouped together rather than listed individually. For example, a reviewer
might write: 'It appears that intervention X improves outcomes. Five randomised trials addressed this issue; four
found favourable results and one found no significant difference (see Table 2 for details).' The reviewer might
then proceed to discuss salient between-study differences, such as in design, participants, interventions or
instruments, that might have influenced results
Include at least one specific figure and two specific tables: (i) a trial flow diagram as specified in the QUOROM
and PRISMA guidelines; (ii) a table that contains information on the key features of each study, and (iii) a table
with details of each study's quality features
Discussion
Divide the Discussion into at least four sections, with headings for the last three, pertaining to: summary;
limitations; integration with previous reviews, and implications for practice and future research
Summary (no separate heading): recap (but do not restate) the most important results, including key
uncertainties if any
Limitations: acknowledge the review's limitations and unique strengths
Integration: discuss how the present review supports, contradicts or extends the findings of previous relevant
reviews. In addition to considering reviews in the present field of study, it is often helpful to draw parallels with
findings in other fields (e.g. clinical medicine, other education topics, or non-medical education)
Implications: outline two to four main points that can be immediately applied in practice or will provide a starting
point for future research. Note that there is no need for a separate conclusions section; the implications are,
in reality, your conclusions
Abstract
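Where meta-analysis is not feasible, the synthesis guidance above suggests grouping studies by the direction of their findings; Bushman and Wang's vote-counting chapter formalises this with a sign test on the direction of effects. A minimal sketch, using a hypothetical count of four positive trials out of five:

```python
from math import comb

def sign_test_p(n_positive, n_total):
    """Two-sided sign test for vote counts: probability of a split at
    least this lopsided if positive and negative findings were equally
    likely under the null hypothesis of no effect."""
    k = max(n_positive, n_total - n_positive)
    tail = sum(comb(n_total, i) for i in range(k, n_total + 1)) / 2 ** n_total
    return min(1.0, 2 * tail)

# Hypothetical vote count: 4 of 5 trials favour the intervention
print(sign_test_p(4, 5))  # → 0.375
```

As this example shows, vote counting has low power with few studies: even a 4-to-1 split is compatible with chance, which is one reason the chapter recommends it only when effect sizes cannot be extracted.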
REVIEWING TOOLS
REPORTING
CONCLUSIONS
REFERENCES
1 Eva KW. On the limits of systematicity. Med Educ
2008;42:852–3.
programs/clinical_epidemiology/oxford.htm.
[Accessed 29 February 2012.]
18 Whiting PF, Rutjes AWS, Westwood ME, Mallett S, Deeks JJ, Reitsma JB, Leeflang MMG, Sterne JAC, Bossuyt PMM, and the QUADAS-2 Group. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med 2011;155:529–36.
19 Cook DA. Randomised controlled trials and meta-analysis in medical education: what role do they play? Med Teach 2012;34:468–73.
20 Bland CJ, Meurer LN, Maldonado G. A systematic approach to conducting a non-statistical meta-analysis of research literature. Acad Med 1995;70:642–53.
21 Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review – a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy 2005;10 (Suppl 1):21–34.
22 Bushman BJ, Wang MC. Vote-counting procedures in meta-analysis. In: Cooper H, Hedges LV, Valentine JC, eds. The Handbook of Research Synthesis, 2nd edn. New York, NY: Russell Sage Foundation 2009;207–20.
23 King R, Hooper B, Wood W. Using bibliographic software to appraise and code data in educational systematic review research. Med Teach 2011;33:719–23.
24 Moher D, Cook DJ, Eastwood S, Olkin I, Rennie D, Stroup DF. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Quality of reporting of meta-analyses. Lancet 1999;354:1896–900.
25 Stroup DF, Berlin JA, Morton SC et al. Meta-analysis of observational studies in epidemiology: a proposal for reporting. JAMA 2000;283:2008–12.
26 Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med 2009;151:264–9.
SUPPORTING INFORMATION