Why Smart Trainers Believe Stupid Things: (Part 3) Regression to the Mean

In each installment of this Why Smart Trainers Believe Stupid Things series, as I did in Part 1: The Bias Towards Positive Evidence and in Part 2: The Dr. Fox Effect, I address one (or a few) of the many cognitive illusions, failings of intuition, and inherent biases in the data upon which we base our beliefs. The goal is to help you recognize these psychological realities and overcome them, so you can avoid erroneous beliefs and arrive at sound judgments and valid conclusions about your training and treatment practices.

The focus of this article, the third installment in the series, is on how ineffective pain interventions can appear to “work” in one’s “in-practice experience.”

Anytime I talk to passionate and well-educated fitness and rehabilitation professionals about why they should be highly skeptical of the many claims commonly associated with corrective exercise and alternative medicine – I consider corrective exercise to be the alternative medicine of the fitness field – those who believe in these practices always come back with statements like, "I don't care what the science says, it works for me and it helps my clients/patients," or "I know it works because I see it all the time. That's all the evidence I need."

The problem is this: although these claims may be true, we can't rely on the in-practice anecdotes of practitioners (or the stories of their clients/patients) to prove them, because we can't be sure they haven't misinterpreted the evidence of their own experience. And one of the ways we all misinterpret the evidence of our own experience, in addition to the bias towards positive evidence, is by committing the regression fallacy, which arises from regression to the mean.

What is Regression to the Mean and the Regression Fallacy?

According to this paper published in the International Journal of Epidemiology, regression to the mean (RTM) is a statistical phenomenon that can make natural variation in repeated data look like real change.

In other words, regression to the mean is a natural phenomenon whereby things that are at their extremes (such as one's sporting success or pain levels) are likely to settle back toward normal (i.e., back toward the middle), or regress to the mean.

So, the regression fallacy is the failure to take into account the natural and inevitable fluctuations of things when ascribing causes to them (Gilovich 1993: 26).
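If you want to see what this looks like in numbers, here's a minimal simulation sketch in Python (my own illustration with made-up numbers, not something taken from the paper above). It assumes every "person" has the same true average score and that each measurement is just that average plus random day-to-day noise; when you select only the extreme first measurements, the follow-up measurements land closer to the mean even though nothing about anyone actually changed.

```python
import random
import statistics

# A minimal illustration of regression to the mean (my own made-up numbers):
# every "person" has the same true average score, and each measurement is
# that average plus random day-to-day noise. Nobody actually changes.
random.seed(42)

TRUE_MEAN = 50   # everyone's underlying average
NOISE = 15       # day-to-day fluctuation
N = 10_000       # number of simulated people

first = [TRUE_MEAN + random.gauss(0, NOISE) for _ in range(N)]
second = [TRUE_MEAN + random.gauss(0, NOISE) for _ in range(N)]

# Select only the people whose FIRST measurement was extreme (top 5%).
cutoff = sorted(first)[int(0.95 * N)]
extreme = [i for i in range(N) if first[i] >= cutoff]

print("Extreme group, first measurement: ",
      round(statistics.mean(first[i] for i in extreme), 1))
print("Extreme group, second measurement:",
      round(statistics.mean(second[i] for i in extreme), 1))
print("Overall mean:                     ", TRUE_MEAN)
```

Run it and the extreme group's second measurement drifts back toward the overall mean of 50, even though no real change happened anywhere in the simulation. That drift is the regression effect.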

What the Sports Illustrated Jinx and the Madden Curse have in common with “I’ve Seen it Work”

As I alluded to above, examples of erroneous beliefs produced by falling prey to the regression fallacy can be found in many walks of life, from sports performance to questionable pain-relief treatments. And we're going to look at an example of each, starting with the professional sports world: the "Sports Illustrated jinx" and the "Madden curse."

Basically, both refer to the belief that being featured on the cover of Sports Illustrated magazine or on the Madden Football video game spells doom for whatever success was responsible for getting an athlete or a team on the cover in the first place.

According to Thomas Gilovich, author of How We Know What Isn’t So, “It does not take much statistical sophistication to see how regression effects may be responsible for the belief in the Sports Illustrated jinx (or in the Madden curse). Athletes’ performances at different times are imperfectly correlated. Thus, due to regression alone, we can expect an extraordinarily good performance to be followed, on the average, by somewhat less extraordinary performance. Athletes appear on the cover of Sports Illustrated when they are newsworthy (i.e., when their performance is extraordinary). Thus, an athlete’s superior performance in the weeks preceding a cover story is very likely to be followed by somewhat poor performance in the weeks after.” And, when an athlete appears on the cover of Madden Football, it’s usually because they had an extraordinary season. Thus, an athlete’s superior performance in the year preceding the cover is very likely to be followed by a more ordinary season, which could include getting injured, as injuries are extremely common in football.

Gilovich goes on to say, “those who believe in the jinx (or the curse) are mistaken, not in what they observe, but in how they interpret what they see. Many athletes do suffer a deterioration in their performance after being pictured on the cover, and the mistake lies in citing a jinx or a curse, rather than citing regression as the proper interpretation of this phenomenon.”
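To put rough numbers on Gilovich's point, here's a small hypothetical simulation (mine, not his, and the figures are arbitrary). Each athlete gets a stable "true ability," and each season's performance is that ability plus luck, so consecutive seasons are imperfectly correlated. Selecting the "cover athletes" (the very best season-one performers) then guarantees that, on average, their next season looks less extraordinary, no jinx or curse required.

```python
import random
import statistics

# Hypothetical illustration (not from Gilovich's book; the numbers are arbitrary):
# each athlete has a stable "true ability," and each season's performance is
# that ability plus luck, so consecutive seasons are imperfectly correlated.
random.seed(1)

N_ATHLETES = 5_000
ability = [random.gauss(100, 10) for _ in range(N_ATHLETES)]
season1 = [a + random.gauss(0, 10) for a in ability]
season2 = [a + random.gauss(0, 10) for a in ability]

# "Cover athletes": the top 1% of season-one performers, chosen because
# they were newsworthy, i.e., extreme.
cutoff = sorted(season1)[int(0.99 * N_ATHLETES)]
covered = [i for i in range(N_ATHLETES) if season1[i] >= cutoff]

print("Cover athletes, season one:", round(statistics.mean(season1[i] for i in covered), 1))
print("Cover athletes, season two:", round(statistics.mean(season2[i] for i in covered), 1))
print("League-wide average:       ", 100)
```

Their second season is still better than average (their ability is real), just noticeably less extraordinary than the season that put them on the cover, which is exactly what regression predicts.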

Now, let's apply this principle to, for example, non-specific low back pain, which is a common issue that fitness, rehabilitation and alternative medicine practitioners regularly deal with.

It’s quite natural that when someone’s non-specific low back pain symptoms are at their worst, they’ll do things to try to get better. They may seek some type of alternative medicine treatment, or maybe go see a physical therapist. Or, if they’re already working with a fitness trainer, the fitness professional may try to use some sort of “corrective exercise” method(s) to address the issue.

With the above reality in mind, when we apply the principle of regression to the mean to this issue, we know that when low back pain is at its worst, it's going to get better regardless, because that's the reality of things like low back pain and all other trends characterized by considerable fluctuation between improvement and deterioration. Low points will tend to be followed by periods of improvement even if the treatment is completely ineffective. Statistical regression guarantees it.
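Here's one more hypothetical sketch (again, my own illustrative numbers) showing why that is. Pain is simulated as random fluctuation around a stable baseline, and a completely inert "treatment" is applied only on the bad days, when pain crosses a threshold, just as people tend to seek help when they're at their worst.

```python
import random
import statistics

# Hypothetical illustration (my own numbers): pain fluctuates randomly around a
# stable baseline, and a completely inert "treatment" is applied only on the
# worst days, when pain crosses a threshold and the person seeks help.
random.seed(7)

BASELINE = 4.0     # average pain on a 0-10 scale
DAYS = 5_000
THRESHOLD = 7.0    # pain level at which help is sought

pain = [min(10.0, max(0.0, random.gauss(BASELINE, 1.8))) for _ in range(DAYS)]

after_treatment = []   # change in pain one week after an inert "treatment"
any_week = []          # change in pain over any week, for comparison

for day in range(DAYS - 7):
    change = pain[day + 7] - pain[day]
    any_week.append(change)
    if pain[day] >= THRESHOLD:   # "treated" on a bad day
        after_treatment.append(change)

print("Average change one week after the inert 'treatment':",
      round(statistics.mean(after_treatment), 2))
print("Average change over any week:                        ",
      round(statistics.mean(any_week), 2))
```

The "treated" days show a large average drop in pain, while a randomly chosen week shows essentially no change, yet the treatment in this simulation does literally nothing.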

Regression to the Mean can make Ineffective Pain Treatments Appear as if they “Work”

As we established above, regression to the mean is basically a statistical phrase for the natural cycle of things. Even when the body cannot heal itself of certain afflictions or ailments – like non-specific low back pain – these issues tend to come and go; you have good days and bad days, good weeks and bad weeks. In other words, things like non-specific low back pain unfold in a non-uniform manner, with episodes of higher pain and periods of improvement. And it's these periods of relief that give rise to erroneous perceptions of a treatment's effectiveness, whether the treatment actually is effective or not.

That said, when some sort of alternative medicine treatment or corrective exercise intervention is introduced very soon after a flare-up in a person's symptoms, and you then see improvement – as you surely will with something like non-specific low back pain and many other ailments – you'll naturally assume that whatever you did when your symptoms were at their worst must be the reason for your recovery. So, every time you get that same issue from now on, you'll be back to using that same treatment.

With the above reality now in focus, you can see how many otherwise smart people who seek treatments, along with the well-meaning practitioners who deliver various interventions, will experience a positive outcome even if the intervention does nothing objectively beneficial, especially when you combine this with our natural bias towards positive evidence. Thus, without a general appreciation of the principle of regression to the mean, and without knowledge of realities like the bias towards positive evidence and post hoc thinking, even a worthless treatment can appear effective, and any improvement is likely to be attributed to the special treatment by both the individual receiving it and the practitioner administering it.

Post Hoc Thinking is Sloppy Thinking

The undeniable reality of things like sports performance, sickness and pain is that they inevitably fluctuate. The periods of high sporting success, or high levels of pain or sickness are eventually followed by periods of less success, and lower levels of pain or sickness. And, as the Skeptic’s Dictionary says, “to ignore these natural fluctuations and tendencies leads to self-deception regarding their causes and to post hoc reasoning.”

The term post hoc comes from the Latin phrase "post hoc ergo propter hoc," which means "after this, therefore because of this." Post hoc thinking is the natural assumption we tend to make that simply because one thing happens after another, the first event was a cause of, or at least influenced, the second event.

In other words, to put this in terms of the claims practitioners make to justify their questionable practices: "My clients/patients got better after using the corrective exercise method or alternative medicine treatment, therefore my clients/patients got better because of the corrective exercise method or alternative medicine treatment."

According to David McRaney, author of You Are Not So Smart and You Are Now Less Dumb, "The post hoc fallacy is the kingpin of irrational thought. Post hoc rationalization is the fairy godmother of all things inaccurate, nonscientific, mystical, mythological, and superstitious. It makes sense that this sort of thinking would lead you into dark waters because recognizing patterns, especially "if this, then that" situations, is crucially important for navigating life. It's just that you aren't very good at noticing when that way of thinking is dumb, and it often is. For instance, most colds last only seven days, so whatever you take often treats only the symptoms. Still, a slew of home remedies and over-the-counter medications are probably close to your heart because you believe that getting better depends on those things even though you would have gotten better just as quickly without them. Your civilization may dance at the same time every year to bring the rains so that your harvest grows tall and bountiful, but that doesn't mean your dancing has anything to do with the growth of crops. Your team may gather and pray super hard before every game, but that doesn't mean you won because you persuaded an all-knowing deity to provide your team with strength against your pagan kickball rivals. Despite the usefulness of automatically coming to such conclusions, that way of thinking is still fallacious."

In short, post hoc thinking highlights the reality that your brain doesn’t like randomness, and so it tries to connect a cause to every effect. And, when it can’t, you make one up.

The Problems With Arguing Anecdotes As Proof of Fringe Health Practices

As discussed above, a pain episode will cause a sufferer to seek the help of some sort of alternative medicine treatment, or possibly motivate a fitness trainer to use some sort of "corrective exercise" method(s). And, as the pain begins to naturally subside (i.e., regress to the mean), the relief is often wrongly attributed to the therapy by both the individual receiving the intervention and the practitioner administering it. Therefore, those individuals receiving the intervention, who are unaware of the principle of regression to the mean and their own bias toward positive evidence, will surely see the intervention "working" for them. And they'll most certainly provide a steady stream of confirmation to the practitioners, who are also commonly without a general knowledge of the principle of regression to the mean, post hoc thinking, etc. Thus, the practitioner will regularly see what they're doing "working" in their "in-practice experience," and feel the science is wrong when it collides with that experience.

However, with the realities provided above, we know this is a classic example of post hoc thinking and a lack of a general appreciation of the principle of regression to the mean. When people and practitioners ascribe the improvements they've experienced in pain levels to some questionable intervention by saying "it works for me," I say, "I have no doubt you saw improvement, but how do you know the results you've seen in practice aren't due to the natural history of pain regressing to the mean?" These practitioners cannot answer with any meaningful reply, because with anecdotal testimony alone, they have no reliable means of distinguishing whether the improvement they've seen came from regression to the mean or not.

Regression to the mean might very well be the true explanation for one's return to health, but the practitioners who present their in-practice anecdotes as "proof" simply cannot reliably tell whether their clients/patients were going to get better anyway. Therefore, they cannot honestly answer "no, it wasn't due to regression to the mean," nor can they honestly say it wasn't due to other things like the placebo effect or their bias toward positive evidence, which often cause all of us to misperceive, misinterpret and even misremember the evidence of our own experiences.

All these practitioners can do when faced with this question is restate their original statement: "All I know is, I feel as if it works. I see people get better after it," which is an epistemologically vapid response. Not to mention, it means basing their beliefs about their practices on "evidence" that is purely subjective and arbitrary. I don't know about you, but I feel clients and patients deserve much better for their valuable time and hard-earned money than to receive interventions that are based on arbitrary assumptions.

Now, as long as this is as far as a practitioner wants to go – simply to share that they had an experience – that's fine. But when someone goes further and cites their anecdotal experiences (e.g., "it works for me") in an attempt to 1) discredit science and 2) convince others their beliefs about the method are true, then that's a BIG problem. We cannot simply decide such things on the basis of one individual's experience, for the realities described above in this article, along with those I've described here, here, here, here and here.

This brings us to another undeniable reality: without objective, corroborative evidence from other sources, or physical proof of some sort, 10 anecdotes are no better than one, and 100 anecdotes are no better than 10. Anecdotes are told by fallible human storytellers, which is why we have sayings like "the plural of anecdote is not data." The most reliable means of determining which health interventions actually do work, and which ones are ineffective but may still appear to work, is scientific testing.

Science gives us precise, objective specifications of what constitutes success and failure, because, as we’ve seen, without that our hopes and expectations can, and often do, lead us to detect more support for a given treatment than is actually warranted.

Conclusion

The regression fallacy and post hoc thinking show how smart people can perceive causal relationships where there are none, often leading us to develop elaborate explanations for phenomena that are the predictable result of statistical regression. As you've seen, these inescapable realities cause us to misidentify normal fluctuations as meaningful patterns, and to ascribe causality where in fact there is none.

Finally, what I find usually separates skeptics from those who believe in fringe health practices, more than anything else, is a thorough knowledge of the mechanisms of self-deception: the basic flaws in our reasoning apparatus that lead us to see patterns and connections in the world around us that, upon closer inspection, turn out not to be there. And that's precisely the reason I was inspired to put together this Why Smart Trainers Believe Stupid Things series in the first place: to help you become a better skeptic, and to empower you with the know-how to think more clearly by recognizing and working to overcome the variety of innate cognitive mechanisms that cause our thinking to go wrong.

One of the keys to being a good skeptic is humility. That said, I realize it's likely that the only reason so many smart, well-meaning practitioners present their anecdotes as proof, even in the face of contradictory scientific evidence, is that they're unaware of the innate fallibility of human reason and judgment in everyday life. When one doesn't know about these realities, it's very easy to overvalue the conclusions one and one's colleagues have drawn from "in-practice" experiences. However, when one does become aware of these realities, and if one values truth and reality, one embraces the fact that it's not safe to let our intuitions and observations run unchecked and unexamined. It's in our best interest to test our experiences against the scientific evidence. And when they are tested and fail, it becomes even more likely that the science is correct and that our cognitive processes have caused us to misjudge the evidence of our own experience – the very cognitive processes that skepticism, and the methods of science and statistics, grew up specifically in opposition to.

Muscle Building and Program Design Seminar in Montreal, Canada!

On September 20th, 2014 Brad Schoenfeld and I will be in Montreal, CA teaching a one-day workshop on Muscle Building and Practical Program Design for Personal Trainers.

Go here to reserve your spot now for this can't-miss event before the early-bird discount ends on Aug 15th.

