Imagine, just for the sake of argument, that you are open-minded about the question of whether the weed-killer Roundup (long produced by Monsanto, which was recently acquired by Bayer AG) causes cancer. You want to make a decision based on scientific evidence. However, you aren't a scientist yourself, and you don't feel competent at trying to read scientific studies.
Geoffrey Kabat asks "Who’s Afraid of Roundup?" in the Fall 2019 issue of Issues in Science and Technology. More broadly, he uses the controversy over Roundup as a way to ask about the role of science in decision-making.
When it comes to Roundup and its active ingredient glyphosate, the Environmental Protection Agency has consistently said that "there are no risks to public health when glyphosate is used in accordance with its current label and that glyphosate is not a carcinogen." As Kabat points out:
The US Environmental Protection Agency’s recent assessment is only the latest in a succession of reports from national regulatory agencies, as well as international bodies, that support the safety of glyphosate. These include Health Canada, the European Food Safety Authority (EFSA), the European Chemicals Agency, Germany’s Federal Institute for Risk Assessment, and the Food and Agriculture Organization of the United Nations, as well as health and regulatory agencies of France, Australia, New Zealand, Japan, and Brazil.
But just when you find yourself deeply relieved that the experts have reached a consensus, you find that one agency disagrees. In 2015, the International Agency for Research on Cancer (IARC) listed glyphosate as a "probable carcinogen." There are lots of reasons to be dubious about the IARC decision, and to believe the consensus of all the other agencies around the world, and Kabat runs through quite a list of them.
In a bigger-picture sense, the actual science of Roundup and glyphosate becomes almost irrelevant to the public disputes. The scientific question of whether glyphosate is a carcinogen is treated as identical to the question of whether one is anti-pesticide, anti-genetic modification, and anti-Big Agriculture.
The result is what the head of the European Food Safety Authority called "the Facebook age of science." As background, the European agencies are well-known for their willingness to invoke the "precautionary principle"--basically, if we aren't sure about something and it might cause a problem, we should prohibit it. In this spirit, a group of almost 100 scientists wrote to EFSA to complain about its decision allowing glyphosate. Here's how Bernhard Url, the head of EFSA, responded:
You have a scientific assessment, you put it on Facebook, and you count how many people ‘like’ it. For [EFSA], this is no way forward. We produce a scientific opinion, we stand for it, but we cannot take into account whether it will be liked or not. ... People that have not contributed to the work, that have not seen the evidence most likely, that have not had the time to go into the detail, that are not in the process, have signed a letter of support [for a ban on glyphosate]. Sorry to say that, for me, with this you leave the domain of science, you enter into the domain of lobbying and campaigning. And this is not the way EFSA goes.
Roundup is of course just one product, but the issue of how science will be used in public policy is much broader. For example, if a lawsuit alleges that Roundup causes cancer, the truth of that accusation presumably matters. As Kabat points out, it "should come as no surprise that the same factors that are at work here are at work in many other areas, whether electromagnetic fields, cell phone 'radiation,' so-called endocrine disrupting chemicals, numerous aspects of diet, cosmetic talc, GMOs, vaccines, nuclear power, or climate change."
In my own contentious way, I find it especially interesting when people make strong appeals to a scientific consensus in one area, but then dismiss it in other areas. For example, those who believe that action should be taken to reduce greenhouse gas emissions sometimes accuse their opponents of denying "the science." But on occasion, it then turns out that those who wrap themselves in the mantle of "the science" when it comes to climate change oppose vaccinations or Roundup. The question of whether to build the Keystone XL oil pipeline from Canada into the United States went through multiple environmental reviews during the Obama administration, each one finding that it would not have a significant negative environmental effect. For those protesting the pipeline, as for those writing group letters to the European regulators about glyphosate, the "science" was only acceptable if it supported their prior beliefs.
One of my favorite examples about the "science" and popular beliefs involves the irradiation of food. For a quick overview, Tara McHugh describes "Realizing the Benefits of Food Irradiation" in the September 2019 issue of Food Technology Magazine. As she notes, the Food and Drug Administration recently approved irradiation for fresh fruits and vegetables, and it had already been approved for a range of other food products. McHugh writes:
The global food irradiation market was valued at $200 million in 2017 and was projected by Coherent Market Insights to grow at a 4.9% compound annual growth rate from 2018 to 2026. This projects the market size to rise to $284 million by 2026. This high growth rate was envisioned due to increased consumer acceptance since the U.S. Food and Drug Administration (FDA) approved phytosanitary treatment of fresh fruits and vegetables by irradiation. The food irradiation market in Asia is also growing very rapidly owing to approval of government agencies in India and other countries. Presently over 40 countries have approved applications to irradiate over 40 different foods. More than half a million tons of food is irradiated around the globe each year. About a third of the spices and seasonings used in the United States are irradiated.
It would be interesting to see a Venn diagram showing how many of those who believe in "the science" when it comes to climate change also believe in "the science" when it comes to the safety of Roundup, vaccinations, or irradiating food. Or perhaps there is a human cognitive bias that makes us more prone to believe "the science" when it warns of danger, but less likely to believe "the science" when it tells us that something we believe to be dangerous (or something that we oppose on other grounds) is actually safe.
A version of this article first appeared on Conversable Economist.
Timothy Taylor is an American economist. He is managing editor of the Journal of Economic Perspectives, a quarterly academic journal produced at Macalester College and published by the American Economic Association. Taylor received his Bachelor of Arts degree from Haverford College and a master's degree in economics from Stanford University. At Stanford, he was winner of the award for excellent teaching in a large class (more than 30 students) given by the Associated Students of Stanford University. At Minnesota, he was named a Distinguished Lecturer by the Department of Economics and voted Teacher of the Year by the master's degree students at the Hubert H. Humphrey Institute of Public Affairs. Taylor has been a guest speaker for groups of teachers of high school economics, visiting diplomats from eastern Europe, talk-radio shows, and community groups. From 1989 to 1997, Professor Taylor wrote an economics opinion column for the San Jose Mercury-News. He has published multiple lectures on economics through The Teaching Company. With Rudolph Penner and Isabel Sawhill, he is co-author of Updating America's Social Contract (2000), whose first chapter provided an early radical centrist perspective, "An Agenda for the Radical Middle". Taylor is also the author of The Instant Economist: Everything You Need to Know About How the Economy Works, published by the Penguin Group in 2012. The fourth edition of Taylor's Principles of Economics textbook was published by Textbook Media in 2017.