These days, it is common to manage one’s diet in scientific terms: how many grams of protein one is consuming per day, the ratio of LDL to HDL cholesterol in one’s blood, average calories in versus average calories out, and so on. However, this way of thinking about food is remarkably new; a century ago, most people didn’t know the meaning of the word “calorie,” let alone how it pertained to their health and wellness.
This change in how our culture thinks about food is thanks to the rise of nutrition science. One of the newest medical sciences — and one of the least understood — nutrition is nonetheless critically important to healthcare. Learning about the brief but impactful history of nutrition science can help patients and providers alike gain more control over their health in the years to come.
Though the term “nutrition science” wouldn’t be coined until the 1950s, chemists around the world were working to uncover the secrets of food as early as the beginning of the 19th century. Some disagreement lingers regarding who discovered what and when; fats, carbohydrates and proteins were known to chemists by 1800, but the link between these compounds and health had not yet been recognized. In 1827, an English physician named William Prout formally identified what we now call macronutrients as the main building blocks of nutrition.
In the following decades, scientists in various fields raced to purify and test these compounds. Using widely available foods like sugar, oil and milk, researchers were able to strip away everything but the carbs, fats and proteins, which they could then use in nutritional experiments. When fed a diet of pure macronutrients, animals like dogs and mice were unable to survive longer than a few days or weeks — indicating to physicians that proper nutrition involves more than macronutrients alone.
During the first decades of the 20th century, continued research into macronutrients yielded the discovery of “vitamines”: organic compounds present in some foods that animals, including humans, cannot synthesize themselves. This was the beginning of the Vitamin Era, when scientists focused on identifying single nutrients and physicians worked to link vitamin deficiencies to disease. Between 1910 and 1950, all major vitamins were identified and synthesized. What’s more, doctors finally found the solution to devastating diseases like rickets, scurvy and beriberi: a vitamin-rich diet.
Toward the end of the Vitamin Era, nutrition science was becoming more distinct as a field of study within medicine thanks to an abundance of research on macro- and micronutrients. To help reduce the occurrence of deficiency diseases, including starvation, physicians worked to create guidelines to ensure their patients were getting the baseline nutrients they needed to survive, especially during times of hardship like the Great Depression and World War II. Typically, these guidelines recommended a certain number of servings of milk, cereals and meat per day or week, to help address dietary deficiencies among the lowest socioeconomic groups.
In the 1970s, governments began involving themselves in nutrition guidelines with the intention of improving national health and increasing productivity. In the U.S., Congress created the Senate Select Committee on Nutrition and Human Needs, which in 1977 released dietary goals for the country, including reducing fat consumption, increasing carbohydrate consumption and limiting sodium. Over the next few decades, these goals would shift — usually alongside lobbying efforts from different agricultural groups.
Today, dietary guidelines are remarkably diverse, even to the point of conflict. People can seek dietary advice from healthcare providers trained in nutrition science; they can find information about macro- and micronutrients from government agencies like the USDA; or they can absorb dietary beliefs and adopt habits from celebrities and cultural figures. Some dietary recommendations advocate for limiting carbohydrates, others for limiting fats, and still others for tracking calories instead of nutrients.
The disagreement in modern guidelines stems largely from a continued lack of consensus within the field of nutrition science. Nutrition is difficult to study; beyond deficiency diseases, the impact of nutrition on health is affected by countless environmental variables, many of them impossible for researchers to identify and account for. The future of nutrition science lies in helping patients understand their unique nutritional needs — which healthcare providers can do by staying up to date on discoveries in the field.