July 08, 2020 3 min read

First, a quick history lesson. The concept of standardised nutrient requirements was first established to help ward off starvation during the economic depression of 1862. Later, during World War Two, rationing began and the next set of Reference Nutrient Intakes (RNIs) was established. The emphasis shifted from avoiding starvation to making sure the rationed food provided all the essential nutrients known at the time, reducing the risk of deficiencies and chronic disease.

The RNIs as we know them today were launched in 1991, and even after nearly 30 years most remain unchanged. Although we no longer have a public nutrition crisis or rationing, the emphasis is still on avoiding deficiency rather than optimising health (although thankfully deficiency diseases in the UK are now quite rare).

RNIs are population-monitoring tools that are essential for public health. They represent the amount of a nutrient sufficient to meet the needs of nearly all (97.5%) of the population, so that deficiency diseases don’t occur. The UK RNI for vitamin C, for example, is 40mg (1), lower than the US and EU figures of 60mg (2) and 80mg (3) respectively. You need at least 10mg per day to ward off scurvy, so 40mg still feels a bit low, especially as the body doesn’t store vitamin C, making daily intake all the more important.

 

Are modern RNI values high enough?

What is concerning is that even with these modest RNIs, we’re often not meeting them. The most recent National Diet and Nutrition Survey shows that women in particular average vitamin C levels around 6% below the required plasma threshold (6). Vitamin D is often low in the winter months, as are iron levels in teenage girls (6).

Because RNI figures are designed for population analysis, they don’t address individual lifestyle requirements, such as the increased need for vitamin C when the immune system is fighting off disease (4, 5), or high stress levels, which may raise vitamin C requirements because of its role in synthesising the stress hormone cortisol. Digestive function is also a factor (we are what we digest, not what we eat). Cooking practices play a role too: vitamin C, an unstable water-soluble vitamin, is easily lost in cooking. Socioeconomic factors can come into play as well.


All age groups still below 5 A Day

We all know that fruit and veg are among the best sources of vitamins and minerals in our diet. Yet recent research shows that, staggeringly, ALL age groups across the UK population have an average fruit and vegetable intake below the 5 A Day recommendation (6).

So although deficiency diseases are thankfully now quite rare in the UK, subclinical deficiencies that aren’t so noticeable are likely more common than we think – particularly in light of low intakes of nutrient-dense fruit and veg. We’re certainly not at risk of starvation anymore, but for many of us, even with an abundance of food, we’re starved of nutrition.


  1. https://academic.oup.com/ajcn/article/69/6/1086/4714888
  2. http://www.efsa.europa.eu/en/efsajournal/pub/3418
  3. Brinkevich et al. Radical-regulating and antiviral properties of ascorbic acid and its derivatives. Bioorg Med Chem Lett. 2012;22(7):2424-7.
  4. Mora JR, Iwata M, von Andrian UH. Vitamin effects on the immune system: vitamins A and D take centre stage. Nature Reviews Immunology. 2008;8(9):685-698.
  5. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/772674/NDNS_Y1-9_Appendices.zip
  6. https://www.gov.uk/government/statistics/ndns-time-trend-and-income-analyses-for-years-1-to-9