QUOROM
David Moher, Deborah J Cook, Susan Eastwood, Ingram Olkin, Drummond Rennie, Donna F Stroup, for the QUOROM Group*
Summary

Background The Quality of Reporting of Meta-analyses (QUOROM) conference was convened to address standards for improving the quality of reporting of meta-analyses of clinical randomised controlled trials (RCTs).

Methods The QUOROM group consisted of 30 clinical epidemiologists, clinicians, statisticians, editors, and researchers. In conference, the group was asked to identify items they thought should be included in a checklist of standards. Whenever possible, checklist items were guided by research evidence suggesting that failure to adhere to the item proposed could lead to biased results. A modified Delphi technique was used in assessing candidate items.

Findings The conference resulted in the QUOROM statement, a checklist, and a flow diagram. The checklist describes our preferred way to present the abstract, introduction, methods, results, and discussion sections of a report of a meta-analysis. It is organised into 21 headings and subheadings regarding searches, selection, validity assessment, data abstraction, study characteristics, and quantitative data synthesis, and in the results with trial flow, study characteristics, and quantitative data synthesis; research documentation was identified for eight of the 18 items. The flow diagram provides information about both the numbers of RCTs identified, included, and excluded and the reasons for exclusion of trials.

Interpretation We hope this report will generate further thought about ways to improve the quality of reports of meta-analyses of RCTs and that interested readers, reviewers, researchers, and editors will use the QUOROM statement and generate ideas for its improvement.

Lancet 1999; 354: 1896-900
See Commentary page ????????

*Other members listed at end of paper

University of Ottawa, Thomas C Chalmers Centre for Systematic Reviews, Ottawa (D Moher MSc); McMaster University, Hamilton (D J Cook MD), Ontario, Canada; University of California, San Francisco (S Eastwood ELS(D)); Stanford University, Stanford, CA (I Olkin PhD); JAMA, Chicago, IL (D Rennie PhD); and Centers for Disease Control and Prevention, Atlanta, GA, USA (D F Stroup PhD)

Correspondence to: Dr David Moher, Thomas C Chalmers Centre for Systematic Reviews, Children's Hospital of Eastern Ontario Research Institute, 401 Smyth Road, Ottawa, Ontario K1H 8L1, Canada (e-mail [email protected])

Introduction

Health-care providers and other decision-makers now have, among their information resources, a form of clinical report called the meta-analysis,1-4 a review in which bias has been reduced by the systematic identification, appraisal, synthesis, and, if relevant, statistical aggregation of all relevant studies on a specific topic according to a predetermined and explicit method.3 The number of published meta-analyses has increased substantially in the past decade.5 These integrative articles can be helpful for clinical decisions, and they may also serve as the policy foundation for evidence-based practice guidelines, economic evaluations, and future research agendas. The value of meta-analysis is evident in the work of the international Cochrane Collaboration,6,7 the primary purpose of which is to generate and disseminate high-quality systematic reviews of health-care interventions.

Like any research enterprise, particularly one that is observational, the meta-analysis of evidence can be flawed. Accordingly, the process by which meta-analyses are carried out has undergone scrutiny. A 1987 survey of 86 English-language meta-analyses8 assessed each publication on 23 items from six content areas judged important in the conduct and reporting of a meta-analysis of randomised trials: study design, combinability, control of bias, statistical analysis, sensitivity analysis, and problems of applicability. The survey results showed that only 24 (28%) of the 86 meta-analyses reported that all six content areas had been addressed. The updated survey, which included more recently published meta-analyses, showed little improvement in the rigour with which they were reported.9

Several publications have described the science of reviewing research,1 differences among narrative reviews, systematic reviews, and meta-analyses,2 and how to carry out,3,4,10 critically appraise,11-15 and apply16 meta-analyses in practice. The increase in the number of meta-analyses published has highlighted such issues as discordant meta-analyses on the same topic17 and discordant meta-analyses and randomised-trial results on the same question.18

An important consideration in interpretation and use of meta-analyses is to ascertain that the investigators who did the meta-analysis not only report explicitly the methods they used to analyse the articles they reviewed, but also report the methods used in the research articles they analysed. The meta-analytical review methods used may not be provided when a paper is initially submitted: even when they are, other factors such as page limitations, peer review, and editorial decisions may change the content and format of the report before publication.

Several investigators have suggested guidelines for reporting of meta-analyses.3,19 However, a consensus across disciplines has not developed. After the initiative to
QUOROM checklist: headings and subheadings, each with a descriptor of what to describe or report

Abstract
  Objectives: The clinical question explicitly
  Data sources: The databases (ie, list) and other information sources
  Review methods: The selection criteria (ie, population, intervention, outcome, and study design); methods for validity assessment, data abstraction, and study characteristics, and quantitative data synthesis in sufficient detail to permit replication
  Results: Characteristics of the RCTs included and excluded; qualitative and quantitative findings (ie, point estimates and confidence intervals); and subgroup analyses
  Conclusion: The main results

Introduction: The explicit clinical problem, biological rationale for the intervention, and rationale for review

Methods
  Searching: The information sources, in detail28 (eg, databases, registers, personal files, expert informants, agencies, hand-searching), and any restrictions (years considered, publication status,29 language of publication30,31)
  Selection: The inclusion and exclusion criteria (defining population, intervention, principal outcomes, and study design)32
  Validity assessment: The criteria and process used (eg, masked conditions, quality assessment, and their findings33-36)
  Data abstraction: The process or processes used (eg, completed independently, in duplicate)35,36
  Study characteristics: The type of study design, participants' characteristics, details of intervention, outcome definitions, &c,37 and how clinical heterogeneity was assessed
  Quantitative data synthesis: The principal measures of effect (eg, relative risk), method of combining results (statistical testing and confidence intervals), handling of missing data; how statistical heterogeneity was assessed;38 a rationale for any a-priori sensitivity and subgroup analyses; and any assessment of publication bias39

Results
  Trial flow: Provide a meta-analysis profile summarising trial flow (see figure)
  Study characteristics: Present descriptive data for each trial (eg, age, sample size, intervention, dose, duration, follow-up period)
  Quantitative data synthesis: Report agreement on the selection and validity assessment; present simple summary results (for each treatment group in each trial, for each primary outcome); present data needed to calculate effect sizes and confidence intervals in intention-to-treat analyses (eg, 2×2 tables of counts, means and SDs, proportions)

Discussion: Summarise key findings; discuss clinical inferences based on internal and external validity; interpret the results in light of the totality of available evidence; describe potential biases in the review process (eg, publication bias); and suggest a future research agenda
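To make the quantitative data synthesis items concrete, the sketch below (not part of the QUOROM statement) pools hypothetical 2×2 counts into a summary relative risk with a fixed-effect, inverse-variance model, reports a 95% confidence interval, and assesses statistical heterogeneity with Cochran's Q and I². The trial counts, the choice of a fixed-effect model, and the particular heterogeneity statistics are assumptions of the example, not requirements of the checklist.

# A minimal sketch (illustration only) of one common approach to quantitative
# data synthesis: pooling per-trial relative risks from 2x2 tables with a
# fixed-effect (inverse-variance) model, and assessing statistical
# heterogeneity with Cochran's Q and I^2. Trial counts are hypothetical.
import math

# Hypothetical 2x2 counts per trial: (events_treat, n_treat, events_ctrl, n_ctrl)
trials = [
    (12, 100, 20, 100),
    (30, 250, 45, 240),
    (8,  80,  15, 85),
]

log_rrs, weights = [], []
for e_t, n_t, e_c, n_c in trials:
    rr = (e_t / n_t) / (e_c / n_c)            # relative risk for this trial
    var = 1/e_t - 1/n_t + 1/e_c - 1/n_c       # variance of log relative risk
    log_rrs.append(math.log(rr))
    weights.append(1 / var)                   # inverse-variance weight

pooled_log_rr = sum(w * y for w, y in zip(weights, log_rrs)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))
ci_low = math.exp(pooled_log_rr - 1.96 * se_pooled)
ci_high = math.exp(pooled_log_rr + 1.96 * se_pooled)

# Cochran's Q and I^2 for statistical heterogeneity across trials
q = sum(w * (y - pooled_log_rr) ** 2 for w, y in zip(weights, log_rrs))
df = len(trials) - 1
i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"Pooled RR {math.exp(pooled_log_rr):.2f} "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f}); "
      f"Q = {q:.2f} on {df} df, I^2 = {i_squared:.0f}%")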
Pretesting
After development of the checklist and flow diagram, two
members of the steering committee (DM, DJC) undertook
pretesting with epidemiology graduate students studying
meta-analysis, residents in general internal medicine,
participants at a Canadian Cochrane Center workshop,
and faculty members of departments of medicine and of
epidemiology and biostatistics. One group of candidates for
a master's degree in epidemiology used the checklist and
flow diagram to report their meta-analyses as if their work
were being submitted for publication. Feedback from these
four groups was positive, most users stating that the checklist and flow diagram would be likely to improve reporting standards. Modifications of the checklist (eg, inclusion of a statement about major findings) and changes to the flow diagram (eg, more detail) were incorporated.

[Figure: Progress through the stages of a meta-analysis for RCTs]
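As a minimal illustration of the bookkeeping behind such a trial-flow figure, the sketch below tallies hypothetical numbers of RCTs identified, excluded (with reasons), and included, and checks that the counts reconcile. The stage names and numbers are invented for the example; the flow diagram itself defines what should be reported.

# A minimal, hypothetical sketch of the accounting behind a trial-flow
# ("meta-analysis profile") figure: potentially relevant RCTs identified,
# RCTs excluded at successive stages (with reasons), and RCTs included in
# the final analysis. All names and counts here are invented.

identified = 120  # potentially relevant RCTs identified and screened

exclusions = {    # reasons for exclusion, with numbers of RCTs
    "not randomised or topic not relevant": 60,
    "retrieved but did not meet inclusion criteria": 25,
    "usable data not reported": 10,
}

included = identified - sum(exclusions.values())

print(f"RCTs identified and screened: {identified}")
for reason, n in exclusions.items():
    print(f"  excluded ({reason}): {n}")
print(f"RCTs included in the meta-analysis: {included}")

# Every identified trial must be accounted for somewhere in the flow.
assert included + sum(exclusions.values()) == identified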
Discussion

In developing the checklist, we identified supporting scientific evidence for only eight of 18 items to guide the reporting of meta-analyses of RCTs.26-39 Some of this evidence is indirect. For example, we ask authors to use a structured abstract format. The supporting evidence for this item was collected by examining abstracts of original reports of individual studies27 and may not pertain specifically to the reporting of meta-analyses. However, the QUOROM group judged this a reasonable approach by analogy with other types of research reports and pending further evidence about the merits of structured abstracts for meta-analyses.

We have asked authors to be explicit in reporting the criteria used when assessing the quality of trials included in meta-analyses and the outcome of the quality assessment. There is direct and compelling evidence to support recommendations about reporting on the quality of RCTs included in a meta-analysis. A meta-analytic database of 255 obstetric RCTs provided evidence that trials with inadequate reporting of allocation concealment (ie, keeping the intervention assignments hidden from all participants in the trial until the point of allocation) overestimated the intervention effect by 30% compared with trials in which this information was adequately reported.33 Similar results for several disease categories and methods of quality assessment have been reported.34 These findings suggest that inclusion of reports of low-quality RCTs in meta-analyses is likely to alter the summary measures of the intervention effect.

We also ask authors to be explicit in reporting assessment of publication bias, and we recommend that the discussion should include comments about whether the results obtained may have been influenced by such bias. Publication bias derives from the selective publishing of studies with statistically significant or directionally positive results,40-42 and it can lead to inflated estimates of efficacy in meta-analyses. For example, trials of single alkylating agents versus multiple-agent cytotoxic chemotherapy in the treatment of ovarian cancer have been analysed.39 Published trials yielded significant results in favour of the multiple-agent therapy, but that finding was not supported when the results of all trials, both those published and those registered but not published, were analysed.

The statement asks authors to be explicit about the publication status of reports included in a meta-analysis. Only about a third of published meta-analyses report the inclusion of unpublished data.29,43 Although one study found that there were no substantial differences in the dimensions of study quality between published and unpublished clinical research,42 another suggested that intervention effects reported in journals were 33% greater than those reported in doctoral dissertations.44 The role of the grey literature (difficult to locate or retrieve) was examined in 39 meta-analyses that included 467 RCTs, 102 of which were grey literature.29 Meta-analyses limited to published trials, compared with those that included both published and grey literature, overestimated the treatment effect by an average of 12%. There is still debate between editors and investigators about the importance of including unpublished data in a meta-analysis.43

We have asked authors to be explicit in reporting whether they have used any restrictions on language of publication. Roughly a third of published meta-analyses have some language restrictions as part of the eligibility criteria for including individual trials.30 The reason for such restrictions is not clear, since there is no evidence to support differences in study quality, and there is evidence that language restrictions may result in a biased summary.
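The checklist asks only that any assessment of publication bias be reported; it does not prescribe a method. As one illustration, the sketch below applies Egger's regression test for funnel-plot asymmetry, a commonly used approach that is not mandated by the QUOROM statement, to invented effect estimates and standard errors.

# A hedged illustration of one common way to assess publication bias:
# Egger's regression test for funnel-plot asymmetry. The effect estimates
# and standard errors below are invented, and this particular test is an
# assumption of the example rather than a QUOROM requirement.
import math

# Hypothetical per-trial log relative risks and their standard errors.
log_rr = [-0.45, -0.30, -0.60, -0.10, -0.75, -0.05]
se     = [ 0.15,  0.20,  0.30,  0.12,  0.35,  0.10]

# Regress the standardised effect (theta/SE) on precision (1/SE); an
# intercept well away from zero suggests small-study (publication) bias.
y = [t / s for t, s in zip(log_rr, se)]   # standard normal deviates
x = [1 / s for s in se]                   # precision

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = ybar - slope * xbar

# Standard error of the intercept from ordinary least squares.
sse = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
s2 = sse / (n - 2)
se_intercept = math.sqrt(s2 * (1 / n + xbar ** 2 / sxx))

print(f"Egger intercept = {intercept:.2f} (SE {se_intercept:.2f}), "
      f"t = {intercept / se_intercept:.2f} on {n - 2} df")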