Clinicians Often Disagree

Clinicians often disagree in their assessment of patients. When 2 clinicians reach different conclusions regarding the presence of a particular physical sign, either different approaches to the examination or different interpretations of the findings may be responsible for the disagreement. Similarly, disagreement between repeated applications of a diagnostic test may result from different application of the test or different interpretation of the results.

Researchers also may face difficulties in agreeing on issues such as whether patients meet the eligibility requirements for a randomized trial, whether patients in a trial have experienced the outcome of interest (eg, they may disagree about whether a patient has had a transient ischemic attack or a stroke or about whether a death should be classified as a cardiovascular death), or whether a study meets the eligibility criteria for a systematic review.

Chance Will Always Be Responsible for Some of the Apparent Agreement Between Observers

Any 2 people judging the presence or absence of an attribute will agree some of the time simply by chance. Similarly, even inexperienced and uninformed clinicians may agree on a physical finding on occasion purely as a result of chance. This chance agreement is more likely to occur when the prevalence of a target finding (a physical finding, a disease, an eligibility criterion) is high—occurring, for instance, in more than 80% of a population. When investigators present agreement as raw agreement (or crude agreement)—that is, by simply counting the number of times agreement has occurred—this chance agreement gives a misleading impression.
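
To see how chance alone can inflate raw agreement, consider a minimal simulation sketch. The 90% prevalence and the assumption that both raters judge independently of the patient (and of each other) are illustrative choices, not data from this chapter.

```python
# Illustrative sketch: two raters who judge "finding present" independently,
# each with probability equal to an assumed 90% prevalence. Even with no real
# skill or shared information, their raw agreement is high purely by chance.
import random

random.seed(1)
prevalence = 0.9        # assumed prevalence of the target finding
n_patients = 10_000

rater_a = [random.random() < prevalence for _ in range(n_patients)]
rater_b = [random.random() < prevalence for _ in range(n_patients)]

raw_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / n_patients
expected_by_chance = prevalence ** 2 + (1 - prevalence) ** 2  # ≈ 0.82

print(f"raw agreement:                {raw_agreement:.2f}")
print(f"agreement expected by chance: {expected_by_chance:.2f}")
```

Both figures come out near 0.82 even though the raters conveyed no information at all, which is why raw (crude) agreement on its own can mislead.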

Alternatives for Dealing with the Problem of Agreement by Chance

This chapter describes approaches to addressing the problem of misleading results that arise from chance agreement. When we are dealing with categorical data (ie, placing patients in discrete categories, such as mild, moderate, or severe, or stage 1, 2, 3, or 4), the most popular approach is chance-corrected agreement, which is statistically determined with kappa (κ) or weighted κ. Another option is chance-independent agreement, measured with phi (φ). One can use these 3 statistics to measure nonrandom agreement among observers, investigators, or measurements.
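
As a brief illustration of chance-corrected agreement, the sketch below applies scikit-learn's cohen_kappa_score to invented ratings from 2 hypothetical raters; the chapter does not prescribe any particular software, and the data are made up purely for illustration.

```python
# Chance-corrected agreement (kappa and weighted kappa) on hypothetical ratings.
from sklearn.metrics import cohen_kappa_score

# Two raters grading the same 10 patients: 1 = mild, 2 = moderate, 3 = severe
rater_1 = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2]
rater_2 = [1, 2, 3, 3, 1, 1, 3, 2, 1, 2]

# Unweighted kappa treats every disagreement as equally serious
print("kappa:", cohen_kappa_score(rater_1, rater_2))

# Weighted kappa gives partial credit for near misses on ordered categories
print("weighted kappa:", cohen_kappa_score(rater_1, rater_2, weights="quadratic"))
```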

One Solution to Agreement by Chance: Chance-Corrected Agreement or κ

The application of κ removes most of the agreement by chance and informs clinicians of the extent of the possible agreement over and above chance. The total possible agreement on any judgment is always 100%. Figure 19.3-1 depicts a situation in which agreement by chance is 50%, leaving possible agreement above and beyond chance of 50%. As depicted in the figure, the raters have achieved an agreement of 75%. Of this 75%, 50% was achieved by chance alone. Of the remaining possible 50% agreement, the raters have achieved half, resulting in a κ value of 0.25/0.50, or 0.50.
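
The arithmetic in this example follows the usual definition of κ as the agreement achieved beyond chance divided by the agreement possible beyond chance. A minimal sketch using the numbers from Figure 19.3-1:

```python
# kappa = (observed agreement - chance agreement) / (1 - chance agreement)
observed_agreement = 0.75   # the raters agreed on 75% of judgments
chance_agreement = 0.50     # agreement expected by chance in this example

kappa = (observed_agreement - chance_agreement) / (1 - chance_agreement)
print(kappa)  # 0.5: the raters achieved half of the possible agreement beyond chance
```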

FIGURE 19.3-1. κ ...
