Article ID: 6177479
Journal: European Urology
Published Year: 2016
Pages: 4
File Type: PDF
Abstract

Because surgical skill may be a key determinant of patient outcomes, there is growing interest in skill assessment. In the Michigan Urological Surgery Improvement Collaborative (MUSIC), we assessed whether peer and crowd-sourced (ie, layperson) video review of robot-assisted radical prostatectomy (RARP) could distinguish technical skill among practicing surgeons. A total of 76 video clips from 12 MUSIC surgeons, each depicting one of four parts of RARP, underwent blinded review by MUSIC peer surgeons and by prequalified crowd-sourced reviewers. Videos were rated for global skill (Global Evaluative Assessment of Robotic Skills) and procedure-specific skill (Robotic Anastomosis Competency Evaluation). We fit linear mixed-effects models to estimate mean peer and crowd ratings for each video, and aggregated the individual video ratings to calculate surgeon skill scores. Peers (n = 25) completed 351 video ratings over 15 d, whereas crowd-sourced reviewers (n = 680) completed 2990 video ratings in 38 h. Surgeon global skill scores ranged from 15.8 to 21.7 (peer) and from 19.2 to 20.9 (crowd). Peer and crowd ratings demonstrated strong correlation for both global (r = 0.78) and anastomosis (r = 0.74) skills. The two groups consistently agreed on the rank order of lower-scoring surgeons, suggesting a potential role for crowd-sourced methodology in the assessment of surgical performance. The lack of linked patient outcomes is a limitation and forms the basis of future study.

Patient summary
We demonstrated the large-scale feasibility of assessing the technical skill of robotic surgeons and found that online crowd-sourced reviewers agreed with experts on the rank order of surgeons with the lowest technical skill scores.
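As an illustrative sketch only (not code from the article): the analysis described above could be approximated in Python with statsmodels, fitting a linear mixed-effects model with a random intercept per video so that repeated ratings of the same clip are pooled, then averaging video-level estimates to surgeon-level scores. The input file and all column names (video_id, surgeon_id, reviewer_group, gears_score) are hypothetical.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Assumed layout: one row per individual review with columns
    # video_id, surgeon_id, reviewer_group ("peer" or "crowd"), gears_score.
    ratings = pd.read_csv("ratings.csv")

    # Fit a separate random-intercept model for each reviewer group,
    # grouping by video so each clip gets its own estimated mean rating.
    estimates = {}
    for group, df in ratings.groupby("reviewer_group"):
        model = smf.mixedlm("gears_score ~ 1", data=df, groups=df["video_id"])
        result = model.fit()
        # Estimated mean rating per video = fixed intercept + video random effect.
        estimates[group] = {
            vid: result.params["Intercept"] + reff.iloc[0]
            for vid, reff in result.random_effects.items()
        }

    # Aggregate video-level estimates to surgeon-level skill scores.
    video_to_surgeon = (
        ratings.drop_duplicates("video_id").set_index("video_id")["surgeon_id"]
    )
    peer_scores = pd.Series(estimates["peer"]).groupby(video_to_surgeon).mean()
    crowd_scores = pd.Series(estimates["crowd"]).groupby(video_to_surgeon).mean()

    # Correlate surgeon-level peer and crowd scores (the study reports
    # r = 0.78 for global skill).
    print(peer_scores.corr(crowd_scores))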
