
Saturday, 17 March 2018

The Problems with the Kappa Statistic as a Metric of Interobserver Agreement on Lesion Detection Using a Third-reader Approach When Locations Are Not Prespecified


Publication date: Available online 16 March 2018
Source: Academic Radiology
Author(s): Joanna H. Shih, Matthew D. Greer, Baris Turkbey
Rationale and Objectives: To point out the problems with the Cohen kappa statistic and to explore alternative metrics for determining interobserver agreement on lesion detection when locations are not prespecified.

Materials and Methods: The use of kappa and two alternative methods, namely the index of specific agreement (ISA) and a modified kappa, for measuring interobserver agreement on the location of detected lesions is presented. These indices of agreement are illustrated by application to a retrospective multireader study in which nine readers detected and scored prostate cancer lesions in 163 consecutive patients (n = 110 cases, n = 53 controls) using the Prostate Imaging Reporting and Data System version 2 guideline on multiparametric magnetic resonance imaging.

Results: The proposed modified kappa, which properly corrects for the amount of agreement by chance, is shown to be approximately equivalent to the ISA. In the prostate cancer data, average kappa, modified kappa, and ISA equaled 30%, 55%, and 57%, respectively, for all lesions, and 20%, 87%, and 87%, respectively, for index lesions.

Conclusions: The application of kappa could result in a substantial downward bias in reader agreement on lesion detection when locations are not prespecified. ISA is recommended for the assessment of reader agreement on lesion detection.
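The core issue the abstract raises can be sketched numerically. A minimal illustration (with hypothetical counts, not the study's data): Cohen's kappa depends on the "neither reader detects" cell of the 2x2 agreement table, which is ill-defined when lesion locations are not prespecified, whereas the ISA (positive specific agreement) ignores that cell entirely.

```python
def cohen_kappa(a, b, c, d):
    """Cohen's kappa from a 2x2 agreement table:
    a = both readers detect, b / c = only one reader detects,
    d = neither reader detects (ill-defined without prespecified locations)."""
    n = a + b + c + d
    po = (a + d) / n                        # observed agreement
    p_pos = (a + b) / n * (a + c) / n       # chance agreement on "detect"
    p_neg = (c + d) / n * (b + d) / n       # chance agreement on "no detect"
    pe = p_pos + p_neg
    return (po - pe) / (1 - pe)

def isa(a, b, c):
    """Index of specific agreement: does not use the 'neither detects' cell."""
    return 2 * a / (2 * a + b + c)

# Hypothetical counts: 40 concordant detections, 10 + 10 discordant.
# ISA is fixed at 0.8, but kappa swings with the arbitrary choice of d.
print(isa(40, 10, 10))                 # 0.8
print(cohen_kappa(40, 10, 10, 10))     # 0.3
print(cohen_kappa(40, 10, 10, 200))    # ~0.75
```

This is only a schematic of the chance-correction problem; the paper's modified kappa and the multireader analysis are more involved than this two-reader sketch.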



http://ift.tt/2Iz13lZ
