Article ID: 346824
Journal: Children and Youth Services Review
Published Year: 2008
Pages: 19
File Type: PDF
Abstract

Objective

To assess methods used to identify, analyze, and synthesize results of empirical research on intervention effects, and to determine whether published reviews are vulnerable to various sources and types of bias.

Methods

Study 1 examined the methods, sources, and conclusions of 37 published reviews of research on the effects of a model program. Study 2 compared the findings of one published trial with summaries of that trial's results that appeared in published reviews.

Results

Study 1: Published reviews varied in the transparency of their inclusion criteria, strategies for locating relevant published and unpublished data, standards used to evaluate evidence, and methods used to synthesize results across studies. Most reviews relied solely on narrative analysis of a convenience sample of published studies. None of the reviews used systematic methods to identify, analyze, and synthesize results.

Study 2: When the results of a single study were traced from the original report to summaries in published reviews, three patterns emerged: a complex set of results was simplified, non-significant results were ignored, and positive results were over-emphasized. Most reviews used a single positive statement to characterize results of a study that were decidedly mixed. This suggests that the reviews were influenced by confirmation bias, the tendency to emphasize evidence that supports a hypothesis and to ignore evidence to the contrary.

Conclusions

Published reviews may be vulnerable to biases that scientific methods of research synthesis were designed to address. This raises important questions about the validity of traditional sources of knowledge about “what works,” and suggests the need for a renewed commitment to using scientific methods to produce valid evidence for practice.

Related Topics
Health Sciences Medicine and Dentistry Perinatology, Pediatrics and Child Health
Authors