Many disabling and often fatal diseases are caused by the lack of an essential vitamin or mineral in the diet, such as:
- Scurvy (vitamin C)
- Beriberi (thiamine/B1)
- Pellagra (niacin/B3)
- Rickets (vitamin D)
- Cretinism and goitre (iodine)
- Pernicious anaemia (a failure to absorb vitamin B12)
In the first half of the twentieth century, most of the essential nutrients were identified and their ability to cure these diseases was confirmed. These diseases have had devastating consequences, for example:
- Between 1900 and 1950 there were 3 million cases of pellagra in the USA and 100,000 deaths
- In the early 1900s there were 150,000 cases and 30,000 deaths per year from beriberi in British Malaya, and 12,000–20,000 deaths in the American-occupied Philippines
- In 1900, around 80% of children in Boston had rickets, with similar prevalence in many British and northern European cities
- Prior to the 1920s, a diagnosis of pernicious anaemia meant certain death, until it was found that eating raw liver (a very rich source of B12) could alleviate the condition.
These major breakthroughs meant that between 1929 and 1943, no fewer than 12 individuals shared 7 Nobel prizes for vitamin-related work during this golden age for nutrition researchers.
During the second half of the 20th century, the focus of nutrition research shifted from prevention of deficiency diseases to prevention of the chronic diseases that now cause most of the death and disability amongst affluent sedentary populations: cancer, cardiovascular disease, tooth decay, osteoporosis and type 2 diabetes. These diseases were shown to be linked to a sedentary lifestyle and to diets that are high in (saturated) fat, sugar and salt but relatively low in starch, dietary fibre, fruits and vegetables.
For more than 40 years we have had a pretty good idea of which nutrients are essential, how much of each is needed to prevent any signs of deficiency, and the general characteristics of a diet that would reduce or delay the toll taken by the so-called diseases of affluence. Yet our abundant knowledge and understanding of nutrition has not always been fully translated into health improvement.
- In 1915, David Marine said that “endemic goitre is the easiest known disease to cure”, yet hundreds of millions still suffer from iodine deficiency, which remains the most common cause of mental retardation in the world’s children.
- Hundreds of thousands of children in the world still die or go blind each year due to vitamin A deficiency (“factor A”, extracted from butter fat c. 1914).
- Many British adults and children are still classified as having inadequate intakes of one or more essential vitamins or minerals; a third of adults take supplements, but these make no impact on the prevalence of inadequacy because they are only taken by those who do not need them. There has been a resurgence of rickets in the UK in recent years, with close to a thousand children hospitalised each year.
- By 1991, it was clear that supplements of folic acid (vitamin B9) taken just before conception and in early pregnancy could prevent around three quarters of neural tube defects like spina bifida and anencephaly, yet advice to women to take supplements when planning a pregnancy made no measurable difference to the incidence of these defects in Britain and Europe.
Major improvements in nutritional health are now held back not primarily by lack of knowledge and understanding but by economic and political factors and by a lack of compliance with nutritional advice and guidelines. Major and relatively costly programmes aiming to eliminate or minimise smallpox, rinderpest (an African cattle disease), polio and measles have been undertaken with a high level of success. However, deficiency diseases that could be cured or prevented by a simple dietary supplement or by food fortification, like the iodization of salt, have not been eradicated. These deficiency diseases still exact an enormous toll of death and disability even though cheap and effective cures have been known for around a century.
In more recent times, it seems to me that much nutrition research is lacking in real purpose and direction. Much of the research that generates headlines in the media is focused on looking for improbable or tenuous links between individual foods or food components and diseases.
Some of this research linking dietary components and diseases is improbably presented as having potential for drug discovery. Many important drugs have their origins in plants or other natural products, but I can think of no important drug that has come from a common food. The nature of drug actions means that they are likely to have side effects and to be toxic in excess, so perhaps this failure of foods to be a useful source of drugs is to be expected: we have learnt to avoid eating plants that contain such compounds, and many of the potential drugs in plants also have an unpleasant taste.
Many papers report that a high or low intake of a food or component is associated with an increased or decreased risk of developing a particular disease, or that studies with isolated cells or animal models give some preliminary evidence for such links. These associations or effects are usually weak and inconsistent, and even where statistically significant, the effect size is usually small. In many cases there may be a steady trickle of papers, some of which support the association and some of which do not. In most of these cases there seems little prospect that evidence will accumulate in the foreseeable future that is strong enough to justify encouraging dietary change based on any one of these claimed links. More likely this will become another research blind alley which will soak up researchers’ time and resources and generate papers for many years, but with no serious prospect of producing any practically useful conclusion.
I spent a few minutes searching the BBC news website and found many headlines making such claims over the last 20 years or so. Some of them make a fleeting appearance and then disappear, whereas others crop up several times. I personally do not believe that any of them will turn out to make a significant contribution to the prevention or treatment of disease; their only value will be in marketing some products and in boosting the publication lists of the researchers involved. For example:
- Blackberries and dementia
- Olive oil and cancer/inflammation
- Green tea and cancer/Alzheimer’s/heart disease/arthritis/HIV/obesity
- Garlic and cancer/heart disease/MRSA/malaria
- Turmeric and cancer/arthritis/Alzheimer’s disease/cystic fibrosis
- Fish oil and depression/anti-social behaviour/exam performance
- Pomegranates and cancer/heart disease
- Watercress and cancer
- Vitamin C and infections/cancer/blood pressure/gout
- Broccoli and cancer/arthritis/heart disease
- Galactose (from milk sugar) as a causal factor in ovarian cancer.
Over the coming weeks I will give a brief account of the rationale and evidence for a few of these claims in short summary pieces; I will group these under the general heading “Food Claims”.
As well as these relatively minor blind alleys, there have been several major topics that have soaked up large amounts of research and other resources and yet have turned out to be mistakes based upon false assumptions or misinterpretation of early research. I have written in detail about three of these in my books and papers, and in two cases I have discussed them on this blog:
- The belief that protein deficiency was the most serious nutritional problem in the world. This false belief not only produced a huge output of scientific papers over several decades but also led to a huge investment of money in measures to combat this illusory world protein crisis.
- The belief that extra antioxidants in the form of supplements would extend life expectancy in generally well-nourished people has been responsible for a vast literature of research papers. Antioxidant supplements have actually been found to be more likely to do harm than good, and thus one must question whether there is any justification for boosting antioxidant intakes by consuming specific foods chosen for their high antioxidant content.
- The belief that defective heat generation in a tissue called brown fat was an important cause of human obesity, and thus that obesity might be successfully treated by so-called thermogenic drugs that switched on brown fat to burn off surplus calories. This false belief was encouraged by misinterpretation of studies made with genetically obese mice.
There is a growing body of scientific literature making an increasingly convincing case that a high proportion of published scientific research output is not reproducible and/or wrong. An article published in The Lancet in 2014 suggested that 85% of the US$240 billion spent globally on biomedical and public health research was wasted. I initially found this figure of 85% shocking, but on reflection I would have found it difficult to make a case that this figure was exaggerated in my own area of expertise, nutrition. Of course, the very nature of scientific research means that a relatively high wastage rate is inevitable as people’s bright ideas and exciting theories turn out to be incorrect. Some of the money classified as wasted may have resulted in the training of new researchers, which is a useful output in its own right if they are properly trained.
The corollary of the 85% wastage rate is that all of the undoubted major advances in scientific understanding and more effective treatments can be attributed to the remaining 15% of expenditure. If this 15% figure could be raised, the effect would be equivalent to a free boost in productive research spending. Increasing this 15% to 30% would have an effect equivalent to a doubling of research expenditure.
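The arithmetic behind this claim can be sketched in a few lines, using the Lancet figures quoted above (the US$240 billion figure and the 85% wastage rate are from the source; everything else here is simple multiplication):

```python
# Back-of-the-envelope sketch of the wastage arithmetic above.
total_spend = 240.0      # US$ billions spent globally (2014 Lancet estimate)
useful_fraction = 0.15   # 1 - 0.85 wastage rate

# Productive spend at the current wastage rate.
useful_now = total_spend * useful_fraction        # US$36 billion

# Productive spend if the useful fraction were doubled to 30%.
useful_improved = total_spend * (useful_fraction * 2)  # US$72 billion

# Budget needed to get that same productive output at the CURRENT
# wastage rate -- i.e. the improvement is worth a doubled budget.
equivalent_budget = useful_improved / useful_fraction  # US$480 billion

print(useful_now, useful_improved, equivalent_budget)
```

Running this shows that halving the wastage rate yields the same productive output (US$72 billion) as doubling the total budget to US$480 billion while leaving wastage unchanged.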