
Quantifying Agreement Among Assessors

How well do your inspectors agree with one another? How well do they agree with a standard?

What about people who review documents? How would several reviewers rate the same document? Do they agree with one another?

These are just a few of the situations where an Attribute Agreement Analysis can answer such questions.

In this simple video, Carmelo Cordaro, Business Excellence Advisory Sr. Manager at Accenture, addresses how to assess whether more than two assessors agree on reviewing the quality of work using an ordinal scale.

NOTE: The analysis uses Minitab software.

The example focuses on reproducibility among assessors rather than repeatability, since each rating is performed only once. In addition, no standard is included, so agreement against a standard is not assessed. Minitab does, however, provide the capability to analyze both.

Mr. Cordaro discusses the Fleiss' Kappa statistic, a measure of agreement among the assessors. He then mentions Kendall's Coefficient of Concordance, which takes the ordering of the rating levels into account. The QA Analysts in the example could have agreed at low levels of the scale or at high levels; the Fleiss' Kappa statistic treats the categories as nominal and does not take that ordering into account.
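To make the distinction concrete, here is a minimal Python sketch of the Fleiss' Kappa calculation. This is not the Minitab implementation used in the video, and the ratings data below is invented for illustration; note how the formula squares the category counts without regard to how far apart the categories are, which is exactly why Kendall's Coefficient is the better complement for an ordinal scale.

```python
# Fleiss' kappa for N subjects rated by n raters into k categories.
# Minimal sketch with made-up data; the video itself uses Minitab.

def fleiss_kappa(counts):
    """counts[i][j] = number of raters who put subject i in category j.
    Every row must sum to the same number of raters n."""
    N = len(counts)        # number of subjects
    n = sum(counts[0])     # raters per subject
    total = N * n

    # Mean observed agreement P_bar, averaged over subjects.
    p_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Chance agreement P_e from the marginal category proportions.
    k = len(counts[0])
    p_e = sum((sum(row[j] for row in counts) / total) ** 2 for j in range(k))

    return (p_bar - p_e) / (1 - p_e)

# Four documents, three reviewers, ordinal scale Low/Medium/High.
ratings = [
    [3, 0, 0],  # all three reviewers said Low
    [0, 3, 0],  # all three said Medium
    [1, 2, 0],  # split between Low and Medium
    [0, 1, 2],  # split between Medium and High
]
print(round(fleiss_kappa(ratings), 4))  # -> 0.4545
```

Because the categories are treated as nominal, the Medium/High split on the last document is penalized exactly as much as a Low/High split would be.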

You can view Carmelo’s video below!

Mr. Cordaro has many other beneficial YouTube videos devoted to the use of Minitab in performing basic data analysis for Six Sigma.

Published 2019-05-05 by Endrea Kosven
