English Title of the ISI Article
Pitfalls in the use of kappa when interpreting agreement between multiple raters in reliability studies
Related Subjects
Medical and Health Sciences; Medicine and Dentistry; Complementary and Alternative Medicine
English Abstract

Objective: To compare different reliability coefficients (exact agreement and variations of kappa: generalised kappa, Cohen's kappa, and the Prevalence-Adjusted and Bias-Adjusted Kappa (PABAK)) for four physiotherapists conducting visual assessments of scapulae.
Design: Inter-therapist reliability study.
Setting: Research laboratory.
Participants: 30 individuals with no history of neck or shoulder pain and no obvious significant postural abnormalities were recruited.
Main outcome measures: Ratings of scapular posture were recorded in multiple biomechanical planes under four test conditions (at rest and under three isometric conditions) by four physiotherapists.
Results: The magnitude of discrepancy between the two therapist pairs ranged from 0.04 to 0.76 for Cohen's kappa and from 0.00 to 0.86 for PABAK. In comparison, the generalised kappa provided a score between the two paired kappa coefficients. The differences between the mean generalised kappa and the mean Cohen's kappa (0.02), and between the mean generalised kappa and the mean PABAK (0.02), were negligible, but the difference between the generalised kappa and the paired kappas within each plane and condition was substantial: 0.02 to 0.57 for Cohen's kappa and 0.02 to 0.63 for PABAK.
Conclusions: Calculating coefficients for therapist pairs alone may result in inconsistent findings. In contrast, the generalised kappa provided a coefficient close to the mean of the paired kappa coefficients. These findings support the assertion that generalised kappa may give a better representation of reliability among three or more raters, and that reliability studies calculating agreement between only two raters should be interpreted with caution. However, generalised kappa may mask more extreme cases of agreement (or disagreement) that paired comparisons would reveal.
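For context, the coefficients compared in the study have simple closed forms; the sketch below uses the standard textbook definitions rather than the paper's own computational details. With \(p_o\) the observed proportion of agreement between two raters, \(p_e\) the agreement expected by chance from the raters' marginal distributions, and \(k\) the number of rating categories:

\[
\kappa_{\text{Cohen}} = \frac{p_o - p_e}{1 - p_e},
\qquad
\text{PABAK} = \frac{k\,p_o - 1}{k - 1}.
\]

PABAK replaces the empirical chance term \(p_e\) with the fixed value \(1/k\) (so it reduces to \(2p_o - 1\) for binary ratings), which is what makes it insensitive to category prevalence and rater bias. The generalised kappa for multiple raters (commonly Fleiss' kappa) keeps the same observed-versus-chance structure but pools agreement across all rater pairs, consistent with the finding above that it lands close to the mean of the paired coefficients.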

Publisher
Database: Elsevier - ScienceDirect
Journal: Physiotherapy - Volume 100, Issue 1, March 2014, Pages 27–35