Who’s the boss of you? My first response to the recent “news about meat” was that it wasn’t “news.” Let’s escalate that now: it wasn’t “about meat,” either.
In the sad realm of tribal diet wars where a predilection for scapegoats prevails, there are none more diametrically opposed than added sugar and processed meat. Only those inclined to overlook the rather flagrant crimes against human health of one of these can manage to blame it all on the other.
As an aside, I think both views are silly; there is more than one way to eat badly. I think the tribalization is sad, as well, and even tragic. Had we bickered so about the location of the moon, we would certainly never have gotten there. The fundamental, evidence-based, sensible truths about eating well have long hidden in plain sight, and readily accommodate more than one way to get it wrong, along with many variants on a common theme to get it right, and thus… have it your way.
But given that “meat is the enemy” and “sugar is the enemy” represent polar opposites, and given that the same statisticians who just disputed the harms of even the most dubious varieties of meat previously disputed the harms of the most dubious additions of sugar in like manner- we should all be in agreement about the topic under assault. It’s not meat, important though that is for human health, and the fate of the planet. This recent imbroglio isn’t, and never really was, about meat.
It's not about foods, or diet, or nutrition at all.
It’s about how we humans understand what’s true, and who is authorized to tell us we are wrong when we know we are right. It’s about who is the boss of your understanding: you, or a select group of self-appointed statisticians.
Let me show you what I mean.
Imagine that experts in the synthesis of data, and in grading the strength of whatever evidence they synthesize, decide it’s important to look into some seemingly settled matter. Perhaps it’s the putative harms of carbon monoxide, or the alleged advantages of looking both ways before crossing a street. But let us say, for argument’s sake, that they decide to look into the evidence that sticking your hand into a fire will result in a burn.
These experts would find no randomized controlled trials on the topic. They would find no formal observational studies either. They would mostly find a gaping crater in the biomedical literature. Accordingly, using GRADE (the Grading of Recommendations Assessment, Development and Evaluation system for rating the certainty of evidence), they would score the evidence as very weak indeed, if it exists at all.
On that basis, they might presume to publish guidelines on fire handling, encouraging people to go ahead and stick their hands into the flames if so inclined.
Would that alter your current practice? Do you think it should?
One might argue that the association between a hand in the flames and burns is just that- association. Famously, we are told, association is not causation. Maybe the fire, and the burn, are true, true, but unrelated. Maybe every time a burn seemed to result from handling fire- it was just coincidence. Or, maybe we have the temporal sequence wrong. Maybe the initial, invisible phase of an incipient burn is a kind of itch that makes people want to touch a fire.
Do you believe any of that? Are you willing to put your own skin at risk based on it? The skin of someone you love?
I trust a chorus of “NOs” to all of the above, and right you are. We know- beyond the last shadow of a doubt- that fire burns human flesh. Stick your hand into a fire at your peril.
But how do we know it?
Observation, pure and simple. Meager and mundane as the data experts make observation out to be, it is how humans know most of what matters most to surviving and thriving as humans. The observation of consistent patterns in the world around us is the primary means of learning, understanding, and knowing what is true. No method of data synthesis will ever change that.
We know the risks of drowning from observation and patterns. We know the benefits of breathing air from the same. We know the advantages of basic hygiene, and the disadvantages of being shot through the chest- from humble, homely observation.
We know what sinks, what floats, and what falls down when tossed up- from pure observation. We know to put food into our mouths rather than our eyeballs…from watching other humans successful at surviving do the same. (I don’t know if there ever were genes prompting humans to put food into their ears or eyeballs; we only know such genes were not passed along.)
We even know a great deal about what to eat from humble observation and mere experience. Imagine if we did not. Imagine if, as some seem blinkered enough to believe, we needed a randomized trial to know anything. Well, follow that to bedrock and you wind up with no basis to say what is “food” in the first place. After all, food is what we can ingest and digest so that it nourishes us, and that varies by species. But how do we know what’s suitable for our own? How do we know, for instance, that paper clips are not good food for Homo sapiens? Shoelaces? Shaving cream?
Absent RCTs and meta-analyses to tell us whether these are food or not food, and absent any robust clinical trial evidence that they are harmful, should we issue guidelines in favor of paper clip salads with shaving cream dressing?
However much it may have seemed to be, the great health news imbroglio of the past weeks was not about meat. It was about how human beings learn and know what’s true, and who gets to tell us.
Human minds go to extremes; that’s why there are terrorists, and religious fanatics. Perhaps it’s also why there are meta-analysts inclined to issue guidelines at odds with their own data. Once you fall in love with the idea that you own the one, best way to decide what is and isn’t valid evidence, you can apparently fall into some very odd propositions as well.
Where does the tendency come from to take our views beyond the reach of reason and into the realm of extremes that part company with sense? There was likely a survival advantage for members of one small tribe confronting the potential hostilities of any other small tribe to rally around their shared convictions. The case has been made that the success of Homo sapiens as social animals was much driven by our passion to conform. Conformity begets more conformity just as the mass aggregated by gravity generates more gravity.
There is nothing to suggest scientists are immune to this. Gather only with those of like mind and shared opinion, and your world view -right or wrong- will amplify in echoes around you, until it’s the only view you see, the only truth you can hear.
That doesn’t mean it isn’t wrong. We humans are far better at adopting the perspective our “tribe” imparts to us than we are at judging its objective merits. We must all, accordingly, beware the gravitational allure of any given echo chamber.
When a given metric produces absurdities, we are not obligated to embrace the absurd. Rather, we are obligated to devise and apply more suitable metrics.
The body mass index (BMI)- weight in kilograms divided by the square of height in meters- is used to track trends in the long-standing problem of epidemic obesity. But the BMI is a quite crude measure, and blind to the source of elevated body mass. It does nothing to differentiate fat from muscle, for instance.
Accordingly, the world’s most supremely “shredded” body builders would register as obese by this metric. In reality, their extremely low levels of body fat and extremely high lean body mass make them exactly the opposite.
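For readers who want the arithmetic behind that claim, a few lines make it concrete. This is a minimal sketch; the athlete’s weight and height below are hypothetical numbers chosen only to illustrate how high lean body mass alone pushes the BMI past the standard adult threshold for obesity (30 or above):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

def who_category(value: float) -> str:
    """Standard WHO adult BMI categories."""
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

# A heavily muscled, very lean competitor (hypothetical figures):
b = bmi(104, 1.75)
print(round(b, 1), who_category(b))  # → 34.0 obese
```

The formula sees only total mass over height; a 104 kg bodybuilder and a 104 kg sedentary person of the same height score identically, which is exactly the blindness at issue.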
This by no means makes the BMI useless; it is of real value at the population level, where we do have epidemic obesity, and do not have epidemic “shreddedness.” But it does make abundantly clear that the tool used for measurement must match the task, or nonsense ensues. This applies equally to measures of body mass and to the mass of evidence on which understanding depends.
I have a final and rather grave concern about this recent meaty debacle, and all the more so because it reached us on the pages of the Annals of Internal Medicine, flagship journal of the American College of Physicians (the ACP), in which I am a fellow. The ACP has long been in the very vanguard of medical ethics, identifying and promoting the highest standards of propriety.
What does it mean, then, when “guidelines” are published on such rarefied pages, at odds with… the Hippocratic Oath? The prime directive of medical ethics, first do no harm (primum non nocere), though somewhat erroneously ascribed to that oath, is embraced as a professional standard just the same.
Of course, none of us can avoid harm entirely, because even doing nothing can result in harm. So there is a practical translation of this aspiration known as “the precautionary principle.” Honored in the houses of Medicine and Public Health alike, this principle asserts that what appears to be a potential threat to the well-being of people must be treated as such until proven otherwise.
In other words, people are innocent until proven guilty. Potential, apparent threats to the health of people are guilty until proven innocent. The burden of proof is reversed.
By way of reminder, the recent series of systematic reviews on meat ingestion all reported statistically significant evidence of harm. The application of a particular scoring metric that graded those findings as “uncertain” certainly did not establish any evidence for the opposite: reliable harmlessness. The precautionary principle and Hippocratic Oath make abundantly clear how such “apparent but arguably uncertain” risk should be handled. But guidelines were published advocating just the opposite.
In a journal and from a College historically so devoted to the highest standards of medical ethics, I find this both bewildering, and sad.
So this was never really about meat, any more than it is about your need to master the minutiae of meta-analysis, strength of evidence metrics, and confidence intervals. It is all and only about how you know- truly know- that fire is too hot to handle, and whether you would let a statistician “guide” you out of that conviction by weighing and measuring your evidence, and finding it wanting.
This was never about meat. If you think otherwise, no need to worry about fire. You’ve been burned already.
David L. Katz, MD, MPH, FACPM, FACP, FACLM, is the Founding Director (1998) of Yale University’s Yale-Griffin Prevention Research Center, and current President of the American College of Lifestyle Medicine. He has published roughly 200 scientific articles and textbook chapters, and 15 books to date, including multiple editions of leading textbooks in both preventive medicine, and nutrition. He has made important contributions in the areas of lifestyle interventions for health promotion; nutrient profiling; behavior modification; holistic care; and evidence-based medicine. David earned his BA degree from Dartmouth College (1984); his MD from the Albert Einstein College of Medicine (1988); and his MPH from the Yale University School of Public Health (1993). He completed sequential residency training in Internal Medicine, and Preventive Medicine/Public Health. He is a two-time diplomate of the American Board of Internal Medicine, and a board-certified specialist in Preventive Medicine/Public Health. He has received two Honorary Doctorates.