Gary Taubes is an investigative science and health journalist. He is the author of six books, including the best sellers Good Calories, Bad Calories and Why We Get Fat. He is a former staff writer for Discover and correspondent for Science, and his writing has appeared on the cover of The New York Times Magazine and in The Atlantic, Esquire, and numerous “best of” anthologies, including The Best American Science Writing. He has received three Science in Society awards from the National Association of Science Writers and the Robert Wood Johnson Foundation Investigator Award in Health Policy Research.
Below, Gary shares five key insights from his new book, Rethinking Diabetes: What Science Reveals About Diet, Insulin, and Successful Treatments. Listen to the audio version—read by Gary himself—in the Next Big Idea App.
1. Keep in mind all that we don’t know.
The Nobel Laureate physicist Richard Feynman famously described the first principle of science: "You must not fool yourself—and you are the easiest person to fool." This is why many of my insights relate to remaining skeptical of your own ideas for as long as possible, because the odds are good that they're wrong.
Medical doctors don’t have this luxury because their patients’ lives might depend on a diagnosis and treatment. They can accept that they may be fooling themselves but must act confident and hope for the best. This only becomes a problem after the fact because doctors want to assume that they did the right thing. They’ll want to justify their choices even if evidence comes in suggesting they made an error. This is human nature, but it gets in the way of medical science.
One way around this is to keep in mind all that you don’t know. Not just the known unknowns, as former Secretary of Defense Donald Rumsfeld famously put it, but the unknown unknowns: all the things we can’t even imagine yet but might turn out to be important.
In diabetes, physicians of the past could have treated the disease better had they kept in mind what they couldn’t yet know. Until 1921, the only thing physicians could do for diabetic patients was convince them to avoid carbohydrate-rich foods. These seemed responsible for diabetes symptoms: the extreme hunger and thirst, the constant need to urinate, the weight loss and emaciation that eventually led to death. If patients didn’t eat these foods, they could live much longer. Patients with what we now call type 2 diabetes (typically older, heavier patients) could live indefinitely by eating diets that today we would call keto or Atkins (high in fat and protein, virtually free of carbohydrates).
“It made perfect sense, except for what they didn’t know and couldn’t know.”
In 1921, University of Toronto researchers discovered the hormone insulin and pioneered insulin therapy. It seemed to be a miracle cure for diabetes. Young patients with the disease we now call type 1 diabetes could be brought back from the very brink of death with insulin therapy. So, doctors decided that the best way to treat the disease, whether type 1 or type 2, was to let patients eat whatever they wanted and then “cover it” with insulin.
It made perfect sense, except for what they didn’t know and couldn’t know: the long-term complications and effects of this approach. By the time they started seeing those a decade later, with the tragic deaths of young patients from clogged arteries, kidney failure, strokes, and a host of other problems, they assumed it was because their patients weren’t using enough insulin. They couldn’t imagine that their patients might be dying because insulin didn’t keep them healthy regardless of dietary choices. Had they kept in mind all that they didn’t know, they might have been more open to other approaches and found the right one sooner.
2. The first idea can’t take precedence.
A common problem in science is that the first reasonable explanation proposed for some phenomenon fills a vacuum in our understanding. We go from having no idea why something is happening to having a possible explanation. The more the phenomenon relates to human health, the more desperate physicians are to explain it and to adopt whatever treatments that explanation implies.
Replacing an idea that has filled a vacuum with a better idea is harder than filling the vacuum in the first place. Once people decide that the first idea might be right, they start assuming it is. They base their lives, research, and decisions on it. This causes resistance to the possibility that the idea is wrong, which would mean that they themselves have been wrong.
This problem played out repeatedly in the history of diabetes. In 1889, a German scientist named Oskar Minkowski famously realized that an organ called the pancreas plays a major role in diabetes. When the Toronto researchers purified insulin from the pancreas and demonstrated that it lowers blood sugar, the diabetes community assumed that diabetes was a disease of insulin deficiency, and the problem organ was the pancreas.
But by the 1950s, it was clear that the liver plays the critical role in type 2 diabetes, which accounts for 90 to 95 percent of all cases. By the 1960s, it was clear that type 2 diabetes isn’t a disease of insulin deficiency but of insulin resistance. Cells in the body become resistant to the action of insulin, and the pancreas responds by secreting too much insulin—not too little, as physicians had been assuming. Also clear by the 1960s was that the hormone glucagon plays a critical role. Our understanding of the cause and mechanisms in most diabetes cases changed dramatically. Still, this had virtually no effect on diabetes therapy because doctors kept thinking of the disease in old ways. Had doctors held their ideas tentatively, particularly those first ideas, they might have let go of incorrect ideas earlier and provided better, less harmful treatments sooner.
3. Pay attention to all the evidence.
This insight dates back 400 years to the British lawyer/philosopher Francis Bacon, who inaugurated the scientific method. Bacon pointed out that the key step in science is testing ideas and paying attention to all the evidence they generate—not just the evidence that supports our opinions. In the lingo of science, we must pay attention to negative evidence even more than the positive.
In the history of diabetes research, this is a recurring problem. In the early 1970s, a man named Richard Bernstein became the first person with type 1 diabetes to test his blood sugar multiple times a day. Bernstein had been diagnosed in 1946, when he was 12 years old, and he had been religiously following his doctor’s orders. But he was also suffering increasingly from complications and knew his diabetes was killing him prematurely. He had three children and wanted to survive as long as he could, so he got a device for testing his blood sugar to see how it responded to the insulin he was taking, the foods he was eating, and the exercise he was doing.
“In the lingo of science, we must pay attention to negative evidence even more than the positive.”
He realized he could keep his blood sugar at healthy levels if he didn’t eat carbohydrate-rich foods, and then he could get by on low doses of insulin. Bernstein, who prided himself on his networking skills, almost single-handedly convinced the diabetes community that patients should monitor their blood sugar throughout the day.
In the early 1980s, Bernstein’s findings motivated the National Institutes of Health to fund a huge clinical trial, but researchers only tested the self-monitoring aspect as it related to insulin dosage. They didn’t want to get into the diet issue, so they told participants to measure their blood sugar and then use as much insulin as necessary in response to those readings. It was a test of intensive insulin therapy—not Bernstein’s full program. When the trial ended and the researchers realized that intensive insulin therapy delayed microvascular complications, they hailed it as the greatest breakthrough in the field since insulin.
However, diabetes specialists paid little attention to the ways that intensive insulin therapy failed: patients were more likely to become obese, which in turn worsened their diabetes. The intensive insulin therapy caused more frequent episodes of dangerously low blood sugar. This approach didn’t bring blood sugar down to healthy levels; it just made the readings less unhealthy. In many ways, the trial demonstrated all the shortcomings of intensive insulin therapy, but the diabetes community embraced the one way that they could see it as a success. They never fully tested Bernstein’s ideas, even as Bernstein went to medical school in his 40s so he could demonstrate how well his program worked with his own patients. Bernstein is still practicing medicine in his 80s, having lived with his diabetes now for more than 70 years.
Had the diabetes community been as open to the negative evidence as the positive, and less anxious to tell the world how successful their trial was, they might have come around to the flaws in their treatment philosophy much earlier.
4. Disagreement is good.
From the 1950s onward, diabetes specialists were aware of two ways their therapies were failing. They knew that insulin therapy made patients fatter and that telling their patients to eat less and exercise more didn’t help them lose weight. Doctors assumed it was the patients’ fault that they weren’t willing to make the necessary healthy lifestyle commitments. So doctors continued prescribing ever higher doses of insulin, even as they knew it would fatten patients.
“Doctors who wrote books about what they had learned were dismissed as fad diet book authors.”
However, some doctors did seek to question conventional assumptions. Occasionally, a patient would get frustrated and try something different, and if it worked, the physician would pay attention, even if it went against what medical associations were advising. But if these doctors then embraced something different, they were written off as quacks. Doctors who wrote books about what they had learned were dismissed as fad diet book authors. This thinking allowed medical associations and other physicians to ignore what these doctors had to say.
A lot of people who argue with medical dogma are wrong, but by assuming that they all are, we never get a chance to learn otherwise. In the diabetes story, the physicians and the health associations knew that their therapies were not doing a good enough job, yet they discouraged alternative approaches. Disagreement is a good thing. If we pay no attention to those who disagree with us, we lose the opportunity to fix what might be broken.
5. Embrace uncertainty.
People often think expertise and authority are established by stating opinions as if they are facts. Medical doctors seem particularly prone to this tendency. Scientists are taught to avoid it at all costs, knowing that they’re likely fooling themselves. Physicians and public health authorities know that if they’re not forceful enough in their advice, people won’t listen. But if their beliefs are wrong, far more harm can be done than good. One beneficial revolution in medicine from the last 40 years has been teaching physicians to communicate uncertainty and alternatives in open dialogue with patients.
The best scientists I’ve interviewed (and I’ve interviewed thousands) are those who use the most caveats when talking about their research. They don’t tell me what’s true, as if it were black and white. They walk me through the evidence without embellishment and discuss its shortcomings. They explain what it doesn’t tell them as much as what it seemingly does. They acknowledge what research is needed to fill the remaining gaps in knowledge. They know better than to communicate certainty when the evidence doesn’t justify it. They don’t want to fool others or themselves. We must not fool ourselves, and we are the easiest people to fool.