
How to Calculate Fleiss’ Kappa in Excel?

Last Updated : 23 Sep, 2022

Fleiss’ Kappa is a way to measure the degree of agreement between three or more raters when the raters are assigning categorical ratings to a set of items (the general formula is given after the list below). Fleiss’ Kappa ranges from 0 to 1 where:

  • 0 indicates no agreement at all among the raters.
  • 1 indicates perfect inter-rater agreement.
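
More formally, Fleiss’ Kappa compares the mean observed agreement across the rated items (P̄) with the agreement that would be expected purely by chance (Pe):

Fleiss’ Kappa = (P̄ – Pe) / (1 – Pe)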

This tutorial provides an example of how to calculate Fleiss’ Kappa in Excel.

Fleiss’ Kappa in Excel

Suppose 14 individuals rate 10 different products on a scale of Poor to Excellent. The following screenshot shows the total ratings that each product received:

[Screenshot: the dataset of ratings]

 

  • In column I, calculate the sum of each row: sum C3 to G3 in cell I3, and continue down through C12 to G12, as shown in the example below (a sample formula follows the screenshot).
[Screenshot: calculating the row sums in column I]
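
A minimal version of this formula, assuming the layout shown above, would be the following in cell I3:

=SUM(C3:G3)

Each row should sum to 14, since every one of the 14 raters assigns exactly one rating to each product.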

 

  • The trickiest calculations are in column J. Apply the formula below in cells J3 to J12 (an explanation of what it computes follows the screenshot).

=(C3^2-C3)+(D3^2-D3)+(E3^2-E3)+(F3^2-F3)+(G3^2-G3)

[Screenshot: applying the formula in column J]
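
Each term in this formula has the form x^2 – x = x(x – 1), so column J counts, for every product, the number of ordered pairs of raters who placed it in the same category. For example, if 5 raters chose Poor for a product, the Poor column alone contributes 5 × 4 = 20 agreeing pairs.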

 

  • You will get the result shown below.
[Screenshot: the column J results]

 

  • Calculate column K using the formula below (a note on what it represents follows the screenshot).

=J3/(14*13)

[Screenshot: calculating column K]
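
Here 14 is the number of raters, so 14*13 = 14 × (14 – 1) is the total number of ordered rater pairs per product, and column K therefore holds the proportion of rater pairs that agree on each product. Copied down one row in the same layout, the formula would simply become:

=J4/(14*13)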

 

  • Calculate the column sums: sum C3 to C12 in cell C14, and so on for the remaining columns (a sample formula follows the screenshot).
[Screenshot: calculating the column sums]
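
Assuming the column totals sit in row 14 as described above, cell C14 could contain the following formula, filled across through column G:

=SUM(C3:C12)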

 

  • Divide each column sum by 140 (14 raters × 10 products) to get the proportion of all ratings that fall in each category, then square each of these proportions in the row below, as shown in the screenshot (sample formulas follow).
[Screenshot: dividing the column sums by 140 and squaring the result]
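
As a sketch, assuming the column totals are in row 14 and the proportions are placed in the row directly beneath them, the two cells under the first category could contain:

=C14/140
=C15^2

The divisor 140 is the total number of ratings handed out (14 raters × 10 products), so the unsquared proportions sum to 1.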

 

  • Sum the squared proportions in cell H16; this is the expected agreement by chance (0.2128). Then sum column K in cell K14 and divide that sum by 10, the number of products, in cell K15; this is the mean observed agreement (0.37802), as shown below (a sketch of the formulas follows the screenshot).
[Screenshot: calculating the overall sums and the mean agreement]
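
As a rough sketch (the exact cell references depend on the layout in the screenshot), the expected agreement in H16 could be computed as the sum of the squared-proportion row, and the mean observed agreement in K15 as the column K total divided by the number of products:

=SUM(C16:G16)
=SUM(K3:K12)/10

These two cells supply the values 0.2128 and 0.37802 used in the final formula.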

 

  • Note that Fleiss’ Kappa in this example turns out to be 0.2099. The actual formula used to calculate this value in cell C18 is:
[Screenshot: the Fleiss’ Kappa value obtained]

 

Fleiss’ Kappa = (0.37802 – 0.2128) / (1 – 0.2128) = 0.2099
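
In Excel terms, assuming the expected agreement is in H16 and the mean observed agreement is in K15 as sketched above, the formula in cell C18 could be written as:

=(K15-H16)/(1-H16)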

Although there is no formal way to interpret Fleiss’ Kappa, the following values show how Cohen’s Kappa, which is used to assess the level of agreement between just two raters, is typically interpreted:

  • < 0.20 | Poor
  • .21 – .40 | Fair
  • .41 – .60 | Moderate
  • .61 – .80 | Good
  • .81 – 1 | Very Good

Based on these values, the Fleiss’ Kappa of 0.2099 in our example would be interpreted as a “fair” level of inter-rater agreement.

