Article ID: 896588
Journal: Technological Forecasting and Social Change
Published Year: 2013
Pages: 8
File Type: PDF
Abstract

Delphi studies are often conducted with the aim of achieving consensus or agreement among experts. However, many Delphi studies fail to offer a concise interpretation of what consensus or agreement means. Although several statistical operationalizations of agreement exist, hardly any of these indices are used in Delphi studies. In this study, computer simulations were used to examine different indices of agreement under different Delphi scenarios. A distinction was made between consensus indices (the Demoivre index), agreement indices (e.g., Cohen's kappa and generalizations thereof), and association indices (e.g., Cronbach's alpha and the intraclass correlation coefficient). Delphi scenarios were created by varying the number of objects, the number of experts, the distribution of object ratings, and the degree to which agreement increased between subsequent rounds. Each scenario consisted of three rounds and was replicated 1000 times. The simulation study showed that, in the same data, different indices suggest different levels of agreement and also different levels of change in agreement between rounds. In applied Delphi studies, researchers should be more transparent about their choice of agreement index and report its value in every round, so as to provide insight into how the suggested level of agreement developed across rounds.
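The abstract does not reproduce the full simulation design, but the kind of comparison it describes can be sketched. The Python snippet below generates a single hypothetical Delphi scenario (experts rating objects on a five-point scale over three rounds with decreasing rating noise) and computes one index from each family: a simple consensus-style share, mean pairwise Cohen's kappa as a stand-in for the generalized agreement statistics, and Cronbach's alpha as an association index. All parameter values, the rating model, and the consensus band are illustrative assumptions, not the authors' specification.

import numpy as np

rng = np.random.default_rng(0)

def simulate_round(n_experts=15, n_objects=10, n_categories=5, spread=1.5):
    """Draw Likert-type ratings: each object has a 'true' location and
    experts rate it with noise controlled by `spread` (smaller = more agreement)."""
    true_loc = rng.uniform(1, n_categories, size=n_objects)
    noisy = true_loc + rng.normal(0, spread, size=(n_experts, n_objects))
    return np.clip(np.rint(noisy), 1, n_categories).astype(int)

def cohen_kappa(r1, r2, n_categories=5):
    """Cohen's kappa for two raters' ratings of the same objects."""
    cats = np.arange(1, n_categories + 1)
    # proportion table of category pairs (rater 1 rows, rater 2 columns)
    table = np.array([[np.sum((r1 == a) & (r2 == b)) for b in cats] for a in cats],
                     dtype=float)
    table /= table.sum()
    p_obs = np.trace(table)                               # observed agreement
    p_exp = table.sum(axis=1) @ table.sum(axis=0)         # chance agreement
    return (p_obs - p_exp) / (1 - p_exp) if p_exp < 1 else 1.0

def mean_pairwise_kappa(ratings):
    """Average Cohen's kappa over all pairs of experts."""
    n_experts = ratings.shape[0]
    kappas = [cohen_kappa(ratings[i], ratings[j])
              for i in range(n_experts) for j in range(i + 1, n_experts)]
    return float(np.mean(kappas))

def cronbach_alpha(ratings):
    """Cronbach's alpha treating experts as 'items' and objects as 'cases'."""
    k = ratings.shape[0]
    item_var = ratings.var(axis=1, ddof=1).sum()
    total_var = ratings.sum(axis=0).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def consensus_share(ratings, band=1):
    """Share of objects on which every rating lies within +/- `band` of the
    median rating -- a simple consensus-style criterion (assumed, not the paper's)."""
    med = np.median(ratings, axis=0)
    return (np.abs(ratings - med) <= band).all(axis=0).mean()

# Three rounds with decreasing rating spread, mimicking growing agreement.
for round_no, spread in enumerate([1.5, 1.0, 0.5], start=1):
    r = simulate_round(spread=spread)
    print(f"round {round_no}: "
          f"consensus share={consensus_share(r):.2f}, "
          f"mean pairwise kappa={mean_pairwise_kappa(r):.2f}, "
          f"Cronbach alpha={cronbach_alpha(r):.2f}")

Running such a sketch typically shows the three indices rising at different rates and reaching different values across rounds, which mirrors the abstract's point that, in the same data, different indices can suggest different levels of agreement and of change in agreement.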

► A distinction has to be made between consensus, agreement, and association indices.
► Different indices suggest different levels of agreement in the same data.
► The consensus index does not always suggest the lowest level of agreement.
► Researchers need to explain their interest in consensus, agreement, or association.
► It is important to report which index is used within each Delphi study.

Related Topics
Social Sciences and Humanities; Business, Management and Accounting; Business and International Management
Authors