
How to Calculate Cohen’s Kappa in Excel


Cohen’s Kappa is used to measure the level of agreement between two raters or judges who each classify items into mutually exclusive categories.

The formula for Cohen’s kappa is calculated as:

k = (po – pe) / (1 – pe)

where:

  • po: Relative observed agreement among raters
  • pe: Hypothetical probability of chance agreement

Rather than just calculating the percentage of items that the raters agree on, Cohen’s Kappa attempts to account for the fact that the raters may happen to agree on some items purely by chance.
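
For example, if two raters each flipped a fair coin to classify every item, each would say “Yes” about half the time, and the two would still be expected to agree on about (0.5)(0.5) + (0.5)(0.5) = 50% of the items purely by chance, even though neither rater looked at the items at all.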

The value for Cohen’s Kappa ranges from -1 to 1, with 0 indicating no agreement beyond what would be expected by chance, 1 indicating perfect agreement between the two raters, and negative values indicating agreement worse than chance.

The following table summarizes how to interpret different values for Cohen’s Kappa:

  Value of k       Level of agreement
  Less than 0      No agreement
  0.00 – 0.20      Slight agreement
  0.21 – 0.40      Fair agreement
  0.41 – 0.60      Moderate agreement
  0.61 – 0.80      Substantial agreement
  0.81 – 1.00      Almost perfect agreement

The following example shows how to calculate Cohen’s Kappa in Excel.

Example: Calculating Cohen’s Kappa in Excel

Suppose two art museum curators are asked to rate 70 paintings on whether they’re good enough to be shown in a new exhibit.

The following 2×2 table shows the results of the ratings:

                    Rater 2: Yes    Rater 2: No
  Rater 1: Yes          25              10
  Rater 1: No           15              20

The following steps show how to calculate Cohen’s Kappa for the two raters in Excel, including the formulas used.
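One possible layout (the exact cell references here are an assumption of this sketch, not a requirement) is to enter the four counts in cells B2:C3, with Rater 1’s “Yes” and “No” in rows 2 and 3 and Rater 2’s “Yes” and “No” in columns B and C. The row, column, and grand totals can then be computed with:

  • Cell D2: =SUM(B2:C2) (Rater 1 “Yes” total: 35)
  • Cell D3: =SUM(B3:C3) (Rater 1 “No” total: 35)
  • Cell B4: =SUM(B2:B3) (Rater 2 “Yes” total: 40)
  • Cell C4: =SUM(C2:C3) (Rater 2 “No” total: 30)
  • Cell D4: =SUM(B2:C3) (grand total: 70)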

The po value represents the relative observed agreement between the raters. This is the proportion of all ratings on which the raters either both said “Yes” or both said “No.”

This turns out to be (25 + 20) / 70 = 0.6429.
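
With the layout above, po can be computed with a single formula that divides the two agreement cells by the grand total:

  =(B2+C3)/D4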

The pe value represents the hypothetical probability that the raters would agree purely by chance. For a 2×2 table, this is the product of the two raters’ “Yes” proportions plus the product of their “No” proportions.

This turns out to be (35/70)(40/70) + (35/70)(30/70) = 0.5.
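
With the layout above, pe can be computed with:

  =(D2/D4)*(B4/D4)+(D3/D4)*(C4/D4)

Each term is the probability that both raters would land on the same category by chance, given how often each rater uses that category.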

The k value represents Cohen’s Kappa, which is calculated as:

  • k = (po – pe) / (1 – pe)
  • k = (0.6429 – 0.5) / (1 – 0.5)
  • k = 0.2857

Cohen’s Kappa turns out to be 0.2857.
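
With the layout above, if po is in cell F2 and pe is in cell F3 (again, an assumed placement), Cohen’s Kappa can be computed with:

  =(F2-F3)/(1-F3)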

Based on the table from earlier, we would say that the two raters only had a “fair” level of agreement.

Additional Resources

The following tutorials offer additional resources on Cohen’s Kappa:

Introduction to Cohen’s Kappa
Cohen’s Kappa Calculator
