
My data consists of tests that are either positive or negative, run on a number of specimens. Each specimen is tested by 5 different lab techs, and I now need to analyze the agreement among their results.

Cohen's kappa can't help here since it only handles two raters, while Fleiss' kappa does not have a detailed, universally accepted method of interpreting its results.

 

What is the best tool for analyzing rater agreement?
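If it helps, here is a minimal sketch of computing Fleiss' kappa in Python with statsmodels, assuming the results are arranged as a specimens-by-raters array of 0/1 values (the array contents and variable names below are made-up illustrations, not your data):

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical data: rows = specimens, columns = the 5 lab techs,
# entries = 0 (negative) or 1 (positive).
ratings = np.array([
    [1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
    [1, 1, 0, 1, 1],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1],
])

# aggregate_raters converts raw ratings into the subjects-by-categories
# count table that fleiss_kappa expects as input.
table, categories = aggregate_raters(ratings)

kappa = fleiss_kappa(table, method='fleiss')
print(f"Fleiss' kappa: {kappa:.3f}")
```

Benchmarks such as Landis and Koch's (values above roughly 0.6 read as substantial agreement) are commonly quoted for interpreting the result, but as you note they are rules of thumb rather than a universal standard.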


Replies to This Discussion

Perhaps I could also hear more on latent class models.
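On latent class models: the idea is to treat each specimen's true status as an unobserved class and estimate, per tech, the probability of a positive call given that class. Below is a minimal from-scratch EM sketch for a two-class model on binary ratings; the function and variable names are my own illustrations, not taken from any particular package:

```python
import numpy as np

def latent_class_em(y, n_iter=200, seed=0):
    """y: (n_specimens, n_raters) array of 0/1 ratings.
    Fits P(latent positive) and, per rater, P(positive call | latent class)."""
    rng = np.random.default_rng(seed)
    n, r = y.shape
    pi = 0.5                                 # P(latent class = positive)
    p = rng.uniform(0.3, 0.7, size=(2, r))   # p[c, j] = P(rater j says 1 | class c)

    for _ in range(n_iter):
        # E-step: posterior probability that each specimen is latently positive.
        ll1 = np.log(pi) + (y * np.log(p[1]) + (1 - y) * np.log(1 - p[1])).sum(axis=1)
        ll0 = np.log(1 - pi) + (y * np.log(p[0]) + (1 - y) * np.log(1 - p[0])).sum(axis=1)
        m = np.maximum(ll1, ll0)
        gamma = np.exp(ll1 - m) / (np.exp(ll1 - m) + np.exp(ll0 - m))

        # M-step: update class prevalence and per-rater response probabilities.
        pi = np.clip(gamma.mean(), 1e-6, 1 - 1e-6)
        p[1] = (gamma[:, None] * y).sum(axis=0) / gamma.sum()
        p[0] = ((1 - gamma)[:, None] * y).sum(axis=0) / (1 - gamma).sum()
        p = np.clip(p, 1e-6, 1 - 1e-6)

    return pi, p, gamma
```

With five binary raters there are enough distinct response patterns to estimate the 11 parameters, and if the positive latent class corresponds to truly positive specimens, p[1] and 1 - p[0] can be read as each tech's estimated sensitivity and specificity.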
