Article ID: 516090
Journal: International Journal of Medical Informatics
Published Year: 2016
Pages: 9 Pages
File Type: PDF
Abstract

•We designed and evaluated a clinical decision support tool (Clinical Knowledge Summary) that automatically summarizes patient-specific clinical evidence to support clinicians' decision making.
•We followed an iterative, user-centered design, guided by information foraging theory and information visualization principles and informed by observations of target users interacting with high-fidelity prototypes.
•The resulting tool consists of an interactive clinical evidence browser that presents users with a set of clinically actionable recommendations extracted from relevant documents in UpToDate and high-quality clinical studies in PubMed.
•Physicians successfully completed most of the usability tasks in a short period of time.
•Physicians' perceived decision quality was significantly higher with the Clinical Knowledge Summary than with manual search, but no difference was found in information-seeking time.

Objective
To iteratively design a prototype of a computerized clinical knowledge summarization (CKS) tool aimed at helping clinicians find answers to their clinical questions; and to conduct a formative assessment of the usability, usefulness, efficiency, and impact of the CKS prototype on physicians' perceived decision quality compared with standard search of UpToDate and PubMed.

Materials and methods
Mixed-methods observations of the interactions of 10 physicians with the CKS prototype vs. standard search while solving clinical problems posed as case vignettes.

Results
The CKS tool automatically summarizes patient-specific and actionable clinical recommendations from PubMed (high-quality randomized controlled trials and systematic reviews) and UpToDate. Two thirds of the study participants completed 15 out of 17 usability tasks. The median time to task completion was less than 10 s for 12 of the 17 tasks. The difference in search time between the CKS and standard search was not significant (median = 4.9 vs. 4.5 min). Physicians' perceived decision quality was significantly higher with the CKS than with manual search (mean = 16.6 vs. 14.4; p = 0.036).

Conclusions
The CKS prototype was well accepted by physicians in terms of both usability and usefulness. Physicians perceived better decision quality with the CKS prototype than with standard search of PubMed and UpToDate within a similar search time. Due to the formative nature of this study and the small sample size, conclusions regarding efficiency and efficacy are exploratory.

Related Topics: Physical Sciences and Engineering; Computer Science; Computer Science Applications