Watch Out: A perfect example of why real-world results are often different from study results

In a recent study published in the journal JAMA Internal Medicine, researchers did something that way more researchers should do.

They set out to compare what happens in the real world (IRL) when a medical recommendation is followed versus what happened in the study that gave birth to that recommendation.

In this case, they decided to look at complication rates following biopsy procedures for lung abnormalities that had been found on CT screening scans for lung cancer, a screening test that many experts strongly recommend for smokers (and many ex-smokers).

So, in the major study that found screening for lung cancer is worthwhile (that study reported a high rate of finding true cancers and a low rate of complications from the biopsies that are an essential follow-up for a definitive diagnosis of lung cancer), the overall complication rate was said to be 9.8%. That is still pretty substantial, but perhaps acceptable if the survival rate from finding those cancers is very high as a result of the screening.

In the real-world comparison study, however, the complication rate from these invasive procedures was over 22%. Even more troubling, since complications generally rise with age, the real-world complication rate for screened patients over the age of 65 was 23.8%, a whopping 143% higher than the rate in the study, or nearly two and a half times as high.
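For readers who want to check that comparison, here is a quick back-of-the-envelope calculation using only the two rates quoted above (the variable names are just for illustration):

```python
# Complication rates quoted in the text, in percent
study_rate = 9.8      # overall rate reported in the original screening study
over_65_rate = 23.8   # real-world rate for screened patients over age 65

# Relative increase of the real-world over-65 rate versus the study rate
relative_increase = (over_65_rate - study_rate) / study_rate * 100
print(f"{relative_increase:.0f}% higher")  # prints "143% higher"

# Equivalently, as a ratio of the two rates
print(f"{over_65_rate / study_rate:.1f}x the study rate")  # prints "2.4x the study rate"
```

In other words, the over-65 real-world rate is about 2.4 times the rate the original study reported.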

Most frustrating of all, a very substantial proportion of patients who were screened IRL never had a pro-con discussion about screening with their doctors; they simply took it on faith that when the doc says you need a screening test, you get a screening test.

Don’t ever be one of those people: anything that’s done to you in this business requires a thorough discussion of benefits versus costs, because everything – even the most seemingly benign procedure or drug – has a potential cost.