Article code: 1099188
Journal code: 953182
Publication year: 2015
Full text: 7 pages, PDF
English title of the ISI article
When are LibQUAL+® and LibQUAL+® Lite scores psychometrically comparable?
Related topics
Humanities and Social Sciences › Social Sciences › Library and Information Science
English abstract


• Loss of information for LibQUAL Lite administration was studied using simulation.
• Means, SDs, adequacy and superiority gaps, r, polychoric correlations, and CIs were considered.
• If interested only in means, libraries may administer Lite to < 80% of users.
• All other statistics were compromised when data contained > 20% Lite scores.
• Loss of information will be compounded in most statistical analyses for > 20% Lite data.

Planned missingness in commonly administered proportions of the LibQUAL+® and LibQUAL+® Lite instruments may lead to a loss of information. Data from three previous administrations of the LibQUAL+® protocol were used to simulate data representing five proportions of administration. Statistics of interest (i.e., means, adequacy and superiority gaps, standard deviations, and Pearson and polychoric correlations) and their confidence intervals (CIs) from the simulated and real data were compared. All CIs for the statistics of interest for the simulated data contained the original values. Root mean squared errors and absolute and relative biases showed that the accuracy of the estimates decreased as the Lite proportion increased. The recommendation is to administer the Lite version to no more than 20% of respondents if the purpose of the data collection is to conduct any inferential analysis. If researchers are interested in calculating means alone, the Lite version may be administered to up to 80% of respondents and still capture the true values adequately. However, standard deviations need to be interpreted to gauge the quality of the means. Loss of accuracy in estimates may be compounded in analyses that use two or more of the statistics of interest.
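The procedure the abstract describes (imposing planned missingness on complete survey responses, then measuring how estimates degrade as the Lite proportion grows) can be sketched as follows. This is an illustrative sketch only, not the authors' code: the 1–9 rating scale matches LibQUAL+®, but the sample size, item count, Lite subset size, and `simulate_lite` helper are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "complete" data: 1,000 respondents rating 22 items on a 1-9 scale.
full = rng.integers(1, 10, size=(1000, 22)).astype(float)

def simulate_lite(data, lite_prop, items_per_lite=8):
    """Impose planned missingness: a `lite_prop` fraction of respondents
    answer only a random subset of items (the Lite form); everyone else
    answers all items. Unasked items become NaN."""
    out = data.copy()
    n, k = data.shape
    lite_idx = rng.choice(n, size=int(lite_prop * n), replace=False)
    for i in lite_idx:
        asked = rng.choice(k, size=items_per_lite, replace=False)
        mask = np.ones(k, dtype=bool)
        mask[asked] = False
        out[i, mask] = np.nan
    return out

def rmse_and_bias(true_vals, est_vals):
    """Accuracy measures comparing estimates against the full-data values."""
    err = est_vals - true_vals
    return np.sqrt(np.mean(err ** 2)), np.mean(err)

# Compare item means at increasing Lite proportions.
true_means = full.mean(axis=0)
for p in (0.2, 0.4, 0.6, 0.8):
    est_means = np.nanmean(simulate_lite(full, p), axis=0)
    rmse, bias = rmse_and_bias(true_means, est_means)
    print(f"Lite proportion {p:.0%}: RMSE={rmse:.3f}, bias={bias:+.3f}")
```

The same pattern extends to the other statistics of interest (standard deviations, gap scores, correlations) by swapping the estimator inside the loop; RMSE is expected to grow as the Lite proportion increases, mirroring the study's finding.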

Publisher
Database: Elsevier - ScienceDirect
Journal: Library & Information Science Research - Volume 37, Issue 1, January 2015, Pages 21–27