Article ID: 3250369
Journal: The Journal of Emergency Medicine
Published Year: 2009
Pages: 6
File Type: PDF
Abstract

The objective of this study was to assess whether residents have the essential tools and a sense of competency when evaluating published studies, especially the statistics. Questionnaires were mailed to emergency medicine (EM) residency programs in the United States querying residents' demographics and training in statistics, as well as their impressions and use of statistics in the current literature; a five-question statistical quiz was also included. Possible responses ("almost always," "more than ½ the time," "½ the time," "less than ½ the time," and "almost never") were tallied individually and also compared as two polarized groups: over ½ the time (almost always + more than ½ the time) vs. under ½ the time (less than ½ the time + almost never). There were 495 questionnaires returned from 42 centers. No significant difference was found when comparing quiz performance with participants' self-reported statistical knowledge. There were considerable differences between the polarized answers (Over vs. Under, respectively) on whether statistics: were used appropriately (40% vs. 15%); were used to enhance weak data (54% vs. 13%); enhanced their understanding of the information (38% vs. 24%); simplified complex data (26% vs. 41%); were understood by them (23% vs. 38%); confused them (37% vs. 24%); or were skipped (52% vs. 23%). Participants felt there should be more statistical training (49% vs. 22%, Over vs. Under, respectively). There was no difference between respondents who did or did not read the statistics (39% vs. 34%, Over vs. Under, respectively). Many EM residents surveyed do not trust, read, or understand the statistics presented in current journal articles. Residency programs may want to consider enhanced training in statistics.

Related Topics
Health Sciences › Medicine and Dentistry › Emergency Medicine