Article ID: 516703
Journal: International Journal of Medical Informatics
Published Year: 2016
Pages: 12
File Type: PDF
Abstract

Introduction
Cumbersome electronic patient record (EPR) interfaces may complicate data entry in clinical practice. The completeness of data entered in the EPR determines, among other things, the value of computerized clinical decision support (CCDS). Quantitative usability evaluations can provide insight into mismatches between the system design model of data entry and users' data entry behavior, but not into the underlying causes of these mismatches. Mixed-method usability evaluation studies may provide these insights and thus support generating redesign recommendations for improving an EPR system's data entry interface.

Aim
To improve the usability of the data entry interface of an EPR system with CCDS in the field of cardiac rehabilitation (CR), and additionally, to assess the value of a mixed-method usability approach in this context.

Methods
Seven CR professionals performed a think-aloud usability evaluation both before (beta version) and after the redesign of the system. Observed usability problems from both evaluations were analyzed and categorized using Zhang et al.'s heuristic principles of good interface design. We combined the think-aloud usability evaluation of the system's beta version with the measurement of a new usability construct: users' deviations in action sequence from the system's predefined data entry order. Recommendations for redesign were implemented. We assessed whether the redesign improved CR professionals' (1) task efficacy (the completeness of the data they collected), and (2) task efficiency (the average number of mouse clicks they needed to complete data entry subtasks).

Results
With the system's beta version, 40% of health care professionals' navigation actions through the system deviated from the predefined next system action. The causes of these deviations, as revealed by the think-aloud method, mostly concerned mismatches between the system design model for data entry action sequences and users' expectations of these sequences, which were based on their paper-based daily routines. This caused non-completion of data entry tasks (31% of main tasks completed) and more navigation actions than minimally required (146% of the minimum). In the redesigned system, the data entry navigational structure was organized flexibly around an overview screen to better mimic users' paper-based daily routines of collecting patient data. This redesign increased the number of completed main tasks (70%) and decreased navigation actions (133% of the minimum required). The think-aloud evaluation of the redesigned system showed that remaining problems concerned flexibility (e.g., lack of customization options) and consistency (mainly with the layout and position of items on the screen).

Conclusion
The mixed-method usability evaluation was supportive in revealing the magnitude and causes of mismatches between the system design model of data entry and users' data entry behavior. However, as both task efficacy and efficiency were still not optimal with the redesigned EPR, we advise performing a cognitive analysis of end users' mental processes and behavior patterns in daily work processes, specifically during the requirements analysis phase of development of interactive healthcare information systems.
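To illustrate the quantitative constructs reported above, the sketch below (Python) shows one way a deviation rate, a task completion rate, and navigation overhead relative to the minimum required actions could be computed from a logged sequence of screen visits. The log format, screen names, and example numbers are hypothetical assumptions for illustration only; they are not data or code from the study.

# Minimal sketch (not from the paper) of the reported usability metrics.
# All field names and example values below are hypothetical assumptions.

def deviation_rate(actions, predefined_order):
    """Fraction of navigation transitions that deviate from the predefined next step."""
    expected = dict(zip(predefined_order, predefined_order[1:]))
    deviations = sum(
        1 for cur, nxt in zip(actions, actions[1:])
        if cur in expected and nxt != expected[cur]
    )
    transitions = max(len(actions) - 1, 1)
    return deviations / transitions

def completion_rate(completed_tasks, total_tasks):
    """Share of main data entry tasks completed (task efficacy)."""
    return completed_tasks / total_tasks

def navigation_overhead(actions_taken, minimum_required):
    """Navigation actions as a percentage of the minimum required (task efficiency)."""
    return 100 * actions_taken / minimum_required

# Hypothetical example: one user's session compared with a predefined screen order.
predefined = ["anamnesis", "exercise_test", "risk_profile", "goals", "summary"]
observed = ["anamnesis", "risk_profile", "anamnesis", "exercise_test",
            "risk_profile", "goals", "summary"]

print(f"deviation rate: {deviation_rate(observed, predefined):.0%}")
print(f"tasks completed: {completion_rate(5, 16):.0%}")            # e.g., 5 of 16 main tasks
print(f"navigation actions: {navigation_overhead(19, 13):.0f}% of minimum")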

Related Topics
Physical Sciences and Engineering > Computer Science > Computer Science Applications
Authors