Contact me for feedback or questions! I reply to everyone.

Nutrition tidbits

This is extremely WIP and will be for a long time.

- Vitamin D -
  - How essential is it? -
  - Do we get enough? -
  - How much do we need? -
  - Is Vitamin D safe? -
  - Is food a good source? -
  - Is Sun a good source? -
  - What about supplements? -
  - Other issues -
- Vitamin C -
- Selenium -
- Magnesium and calcium -
- Manganese -
- Iodine -
- Coconut oil -
- Fermented cod liver oil -
- Obesity -
- Ketogenic diets -
- Beans -
- Whole vs refined grains -
- Cholesterol -
- The Paleo movement -
- Plants DO contain Vitamin B12! -
- Is animal protein superior to plant? -
- Non-restrictive eating -

Vitamin D

Hey, I want to get this writeup out quickly, before I lose my excitement about this topic. That would be really bad, because this is the most important nutrient for human biology.

How essential is it?

Look at this chart:

Disease incidence by serum Vitamin D level

The more Vitamin D a person has in their blood, the lower their probability of contracting a disease such as cancer or diabetes, or of suffering a fracture or a heart attack. Though many nutrients have such associations, there is no other that affects the human body in such a fundamental way. Vitamin D is special for one more reason, too, but we'll come to that later. You're probably now asking yourself: why should we care?

Do we get enough?

I mean, we surely get enough Vitamin D through our regular lifestyles, right? Actually, nothing could be further from the truth. Everyone is deficient! Let's jump right into the proof (local):

Blood vitamin D levels in various European countries

The 95th percentile column is the one that interests us the most; it lists the blood level of Vitamin D that 95% of people in a given country do not reach (values will be rounded, to make it easier to read). The reason we're looking at it and not the average is that it shows us the best-case scenario: if even the "best" is terrible, then everyone else will be even worse. And so - in 15 studies and 16 different countries, the highest "best" result was the Netherlands' 104 nmol/L (or about 42 ng/ml). Meaning everyone there fails to maximally prevent various cancers and diabetes. But - since the average in even that "best" study is only 65.3 nmol/L (or 26 ng/ml) - many of the participants fail to prevent even heart attacks or fractures - not just diabetes and cancers. All other countries fare worse - some much worse. Germany or the UK - for example - don't even pretend to be healthy, getting absolutely ravaged by all the listed diseases, and sometimes even rickets, with their sub-20 ng/ml averages.

As the disease incidence chart showed us, the blood level of Vitamin D that prevents all of the listed diseases is 54 ng/ml or 135 nmol/L. Something none of the European countries even come close to reaching. As for the remaining 5% of people with higher Vitamin D levels than listed in the 95th percentile column, it is doubtful they gain the 13 ng/ml required to reach sufficiency. And that's considering only the best case scenario - the Netherlands study. In other countries, those outliers would need something like 20 ng/ml more. Therefore, we can safely say that everyone living in Europe is deficient. Note: some evidence actually shows that other diseases might require even higher Vit D levels, but we will stick to these for now - since they alone are enough to prove our point. What about other regions? It turns out that it's not just Europe that's deficient, but the entire world. I suspected it all along, but at first I did not have good data to prove it - now I do (local). Look at the data from North America:
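A note on units: the nmol/L and ng/ml numbers above are the same measurement expressed two ways - serum 25(OH)D in nmol/L divided by 2.5 gives ng/ml. A quick sketch of the conversion, using the thresholds from this section:

```python
# Serum 25(OH)D unit conversion: 2.5 nmol/L = 1 ng/ml.
def nmol_to_ngml(nmol_per_l):
    return nmol_per_l / 2.5

print(nmol_to_ngml(104))            # -> 41.6 (the Netherlands' ~42 ng/ml 95th percentile)
print(round(nmol_to_ngml(65.3), 2)) # -> 26.12 (its ~26 ng/ml average)
print(nmol_to_ngml(135))            # -> 54.0 (the level said to prevent all listed diseases)
```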

Blood vitamin D levels in USA and Canada

There is one study - Overton and Basu (1999) - with a 122 nmol/L (49 ng/ml) Vit D average, which is really good. The sample size is kind of low, and it was taken in the summer - meaning the level will be falling later - but okay, we'll accept it as a study showing that at least some people in Canada are almost at the optimal level (though still failing to completely prevent multiple sclerosis and type 1 diabetes). The rest of the studies fare terribly though, as expected. Looking at the ES (95% CI) column again, we can see that the second best study is Stein et al. (2006), in which 95% of people have a Vitamin D value lower than 98 nmol/L (39 ng/ml). Notice that there are way more participants there, and it's based on levels from the entire year - so it's more reliable. To the diseases already mentioned, we can add many cancers to the list it fails to cover. And that's the second best! The third best is Dawson-Hughes et al. (1997), where 95% of people have Vitamin D levels lower than 82 nmol/L (33 ng/ml) - adding worse performance against fractures and many cancers. After that, we reach joke territory, with most studies showing values of <72 nmol/L (29 ng/ml) and some even under 60 - where we can add, for example, heart attacks to the list of vulnerabilities. The worst cases - those 49 nmol/L average ones - even fail to completely prevent rickets. Clearly the North American region won't be our savior. Let's try Asia + New Zealand then:

Blood vitamin D levels in Asia and New Zealand

You should know how to read these by now. The strongest study is New Zealand's Bolland et al. (2007), where 95% of people have Vitamin D values of less than 102 nmol/L (41 ng/ml). This still isn't optimal (again succumbing to cancers, diabetes, MS, fractures), and it's the best this region has got. The average is higher in Vietnam's Ho-Pham et al. (2011), but the peak is a lot lower. Most of the rest of the studies do terribly - even worse than the USA ones. There are three Indian studies, and all come out lower than 50 nmol/L (20 ng/ml - corpse level!). China doesn't do much better; though one study does have an average of 77 nmol/L (31 ng/ml), the others are 33 and 25 nmol/L (13 and 10 ng/ml - the corpses are barely twitching there). Japan way outclasses China, with one study showing a 78 nmol/L average (31 ng/ml), two about 70 (28) and the others about 60 (24). All that means is that it's not ripped apart by rickets - but still dies from a heart attack. Is that all Asia has to offer? A third league competition, where the winner earns the prize of a bunch of diseases, while the runners-up just die from heart attacks. While the USA / Canada has two studies that just barely get affected by rickets, this region has eight - and some very heavily so! Hey, maybe we'll find more fighting spirit in Africa, so let's fly there:

Blood vitamin D levels in Africa

One big outlier in Rabbani et al. (2009), Iran; decent sample size - and in winter, too! Wow, so it's possible to be non-deficient all year after all, with a 116 nmol/L (46 ng/ml - very close to optimal!) average. But this still doesn't fully cover diabetes or MS. And, as expected, all the others fail - even in Iran itself, with one study having a 93 nmol/L (37 ng/ml - now succumbing to all cancers and fractures) average, one 75 (30 - adding heart attacks), one 73 (29), one 60 (24 - adding falls), one 41 (16), one 31 (12) and one 29 (12; wow, the zombie apocalypse is approaching!). Other countries are even worse; Lebanon, Jordan - can those zombies even still crawl, with values of 10 ng/ml or less?! Out of 16 studies, only five manage to completely prevent rickets. Wow! If Asia was a third league competition, Africa must be the children's one. Keep in mind this analysis divides by sex, so sometimes a single study can get very different results for male and female participants (Rabbani et al. (2009) and Moussavi et al. (2005) show huge differences!). And that happens because these are Muslim countries, where women cover themselves up, preventing Vitamin D creation from the Sun. But even the males do terribly here, overall.

Please go back to the disease incidence chart and check what all those results mean. Cancers, e.g., need 40+ ng/ml (100 nmol/L) to be prevented, and only those two massive Canada and Iran outliers even reach that level! And that's not even the end of the chart! Yes, even our best result - the Canada one - fails to beat the final boss of multiple sclerosis, which requires 54 ng/ml or 135 nmol/L. How sad is it that we have to be happy about those still-suboptimal results? And those are extreme outliers, remember; most others are barely hanging on there, being ripped apart by cancers and fractures (even the "second best" results!). And we even have a bunch of corpses with <20 ng/ml that must surely have had their bones devoured already.

By the way, I did not re-analyze Europe, though this study has data on it too - because I did that already, and nothing has changed; they're just as deficient here. Also, for transparency's sake: there are more studies that weren't included in the images, because the authors decided to omit those lacking the 95% CI or the separation by sex. Some of those Thailand ones, with their 120+ averages, are impressive indeed; but they have two values listed, and the second ones are a lot lower, so I don't know what to think. If we take the first ones at face value, one of the Thailand studies wins the game with a 168 nmol/L average. Hooray! But the additional studies also add way more <30 ng/ml corpses into the mix, so the overall situation becomes even worse. And we have no idea whether those values are even reliable; at least, the study doesn't say why there are two listed for those particular entries.

So let's recap. Piece of the puzzle #1: the blood Vit D levels Earth's residents actually have. Piece of the puzzle #2: the blood Vit D levels necessary to prevent disease. #2 is (much) higher than #1; therefore, people do not have enough Vitamin D in their blood to prevent disease.

How much do we need?

If we accept we don't have enough, how much do we actually need to ingest to reach the necessary blood amounts? Fortunately, we have a study (local) that tested just that:

Graph of Vitamin D levels after a year of supplementation
For Figure 2 we restricted our analyses to those subjects that had both their baseline visit and a follow up visit between January 2009 and June 2013 and had reported not to supplement with vitamin D at their baseline visit (1205 subjects, 2410 assessments). This analysis mimics a pre-post comparison of an intervention: a comparison of observations prior to introduction to vitamin D supplementation with observations, on average, 0.98 years after the baseline visit. As such, the blue bubbles in figure 2 represent the expected 25(OH)D level of participants who have been taken oral doses of vitamin D for an average of 0.98 year since baseline.

So they took 1205 people that hadn't been taking a Vitamin D supplement before, and checked their blood levels of Vitamin D after a year of supplementation. Simple and beautiful; and it tells us that if you want the optimal level of Vit D (50 ng/ml or 125 nmol/L in this graph), then you need to take at least 8000 IU per day for a year. Going higher than 10000 doesn't appear to make much difference. The full study tested many more people, and there, some weaker results start appearing:

Graph showing vitamin D dosages required to reach certain blood levels

But maybe those weaker results just didn't take it for long enough, since there is no duration mentioned here - unlike the earlier graph, where it was specifically one year. Either way, 10000 IU per day reaches optimal levels (50 ng/ml or 125 nmol/L in this graph) in the vast majority of cases even here. There's also this caveat: These recommendations appeared 2 to 3 times higher for obese participants relative to normal weight subjects, depending on the 25(OH)D target level. So, maybe those weaker results were just from obese participants. You need to take your weight into account before undertaking a supplement regimen; you might easily not be getting the results you think you should, if you are taking an amount meant for someone with less "baggage". E.g., your ingested 6000 IU might effectively become 2000 - a puny dosage. Anyway, let's try to confirm these results with another study (local):
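As a toy illustration of that caveat (the 2-3x multiplier is from the study quote; assuming it applies as a simple linear division is my own simplification, not something the study states):

```python
# If an obese person needs 2-3x the dose for the same blood level,
# their ingested dose is effectively divided by that multiplier:
def effective_dose_iu(ingested_iu, requirement_multiplier):
    return ingested_iu / requirement_multiplier

print(effective_dose_iu(6000, 3))  # -> 2000.0, the "puny dosage" from the text
```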

Graph showing vitamin D dosages required to reach certain blood levels, from another study

Slightly weaker results, it seems. Keep in mind that this study differed in a few ways, such as:

There were no exclusion criteria, and participants included both genders and a wide range of ages, nationalities and levels of health status.

The previous study recruited only healthy adult volunteers. And so, the slightly weaker results here might simply be explained by a weaker response to Vitamin D3 due to the bad health (chronic disease - maybe digestive, etc.) of some of the participants. However, even in this study, 10000 IU per day achieved great results:

The supplemental dose ensuring that 97.5% of this population achieved a serum 25(OH)D of at least 40 ng/ml was 9,600 IU/d.

12-13000 IU appears to hit 50 ng/ml (125 nmol/L) more reliably, but note the smaller sample size there. Maybe we should consider 10k to be the right dose for healthy volunteers, and 12-13k for the more unhealthy ones. Either way, we do have a pretty tight range, so I consider both studies to work in concert. And so, we have just added puzzle piece #3: the amount of Vitamin D we need to ingest per day for optimal blood levels.

Is Vitamin D safe?

But is that amount safe? Briefly: yes, absolutely zero danger. Quoting from the studies cited above:

Participants reported vitamin D supplementation ranging from 0 to 55,000 IU per day and had serum 25(OH)D levels ranging from 10.1 to 394 nmol/L.
We did not observe any increase in the risk for hypercalcemia with increasing vitamin D supplementation.
Universal intake of up to 40,000 IU vitamin D per day is unlikely to result in vitamin D toxicity.
Although this data set provides no information with respect to serum or urine calcium values in these individuals, at the same time it is clear that there were no clinical evidences of toxicity

No toxicity detected at 40000 or even 55000 (!) IU per day - while we're taking only 10000 or a little more. We can include safety as our puzzle piece #4. I've also explored the topic of Vitamin D safety in much more depth in my Wikipedia report (start reading from The entire "Excess" section is a sham.).

Is food a good source?

Where do we get that amount from, though? Surely, we've all heard our beloved governments pretend food is a source. Tl;dr version: animal products have too little, while the mushrooms' version is different and doesn't work as well. Look here (local):

List of foods containing Vitamin D and their amounts

Even if you formulated your entire diet out of the above-mentioned foods, I doubt you'd be able to get an amount needed for therapy (e.g. 10k). In realistic circumstances, I bet you couldn't manage to get even 2000 IU per day. You'd need to shove 600g of sardines down your mouth without getting tired of it... every day! What about CLO (cod liver oil)? One teaspoon gives 700 IU on average, so just take more, right? Well, first of all, that would quickly get expensive; Moller's CLO costs 22 euro (archive) (MozArchive) per 250ml, and to get 2000 IU of Vit D you'd need to take 25ml. Not everyone has 66 euro to blow every month - and this is for the bare minimum amount of Vit D that's useful to take. 5000 - for example - would cost you 165 euro per month. And second - even if you were willing to spend that much money - CLO is full of PUFA, and taking that much is very, very dangerous. You'd give yourself more serious disease with that much CLO than with a lack of Vitamin D. Therefore, either way, you would just keep degrading health-wise if you relied on animal products as your source of Vitamin D - since the 2000 IU per day that you could realistically get from fish can only sustain a low blood level, and CLO is toxic in the amounts required. However, the mushrooms actually have a lot more than it seems. Though the above image lists only 1600 IU in 100 grams of sun-dried mushrooms, which is not that impressive (remember that the dried ones weigh less, so that 100g is equivalent to something like 700g of fresh) - I was just reading this paper (local), which turned the whole issue around for me. The first thing that caught my attention was this:
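To check the CLO cost arithmetic above (prices and amounts as quoted; a sketch, not a price survey):

```python
# Monthly cost of CLO as a Vitamin D source, using the figures quoted above:
bottle_price_eur = 22   # Moller's CLO, per 250 ml bottle
bottle_ml = 250
ml_per_day = 25         # the amount said to provide ~2000 IU of Vit D

monthly_eur = bottle_price_eur * ml_per_day * 30 / bottle_ml
print(monthly_eur)        # -> 66.0 euro/month for 2000 IU/day
print(monthly_eur * 2.5)  # -> 165.0 euro/month for 5000 IU/day
```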

At midday in mid-summer in Germany, the vitamin D2 content of sliced mushrooms was as high as 17.5 μg/100 g FW after 15 min of sun exposure and reached 32.5 μg/100 g FW after 60 min of sun exposure

So all I need to do is put a few of the widely available button mushrooms outside for an hour, and I get an additional 1300 IU of Vitamin D per day. Still little, but at least I can use that as a supplement to my supplement - and maybe skip the supplement on the days I'm outside more. The really interesting part came right after the above:

An unpublished Australian study on whole button mushrooms determined the vitamin D2 content after exposure to the midday winter sun in July in Sydney (personal communication, J. Ekman, Applied Horticultural Research, 12 August 2013). Sun exposure to a single layer of small button mushrooms was sufficient to generate 10 μg D2 / 100 g FW after 1 h, while large button mushrooms took 2 h to generate the same amount of vitamin D2.

Remember the Vitamin D winter (a time when humans cannot generate Vitamin D from the sun)? Well, mushrooms say "fuck that, we're superior to you pathetic humans" and keep producing regardless. There's still one little problem, namely that the amount is just not satisfying. Assuming you were able to eat 150g of solar activated mushrooms per day, that's still only about 2000 IU of Vit D - which might keep you from going seriously deficient, but isn't enough to sustain a healthy level. However, the mushrooms have one more trick up their caps yet:

Sun-drying is one method used for drying mushrooms in Asian countries. Analysis of vitamin D2 and ergosterol content of 35 species of dried mushrooms sold in China revealed they contained significant amounts of vitamin D2, with an average of 16.9 μg/g DM (range of 7–25 μg/g DM [48]). No details were provided on the method of drying, nor the time since the initial drying. The moisture content of the commercial dried mushrooms varied, although the majority contained 3–7% moisture.

Wow! An average of 680 IU of Vit D per GRAM! of dry mushrooms is fucking insane. Since they tested 35 species, presumably all mushrooms are capable of this - including the common white button. So, the earlier low results were just because they didn't let them sit in the sun long enough. If you do (which is required for them to dry), then you get an absolutely insane amount of Vitamin D. Just 10 grams of the dried mushrooms (equivalent to about 70 grams fresh, or two button mushrooms - Commercial dried mushrooms [...] have about 15% of the original weight of fresh mushrooms) gives you almost 7K IU of Vit D per day on average. And since they are dried, they preserve well:
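The IU numbers in this section follow from the standard conversion of 1 μg of vitamin D = 40 IU; sketching the arithmetic from the quotes above:

```python
IU_PER_UG = 40  # standard conversion: 1 microgram of vitamin D = 40 IU

# Sun-exposed sliced mushrooms (Germany, 60 min): 32.5 ug per 100 g fresh weight
print(32.5 * IU_PER_UG)       # -> 1300.0 IU per 100 g

# Chinese sun-dried mushrooms: average 16.9 ug per gram of dry matter
print(16.9 * IU_PER_UG)       # -> 676.0 IU per gram (the ~680 above)

# 10 g dried, at ~15% of the fresh weight:
print(10 * 16.9 * IU_PER_UG)  # -> 6760.0 IU (the "almost 7K")
print(round(10 / 0.15))       # -> 67 g fresh-weight equivalent (~70 g)
```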

Three types of mushroom (button, shiitake, and oyster) exposed to a UV-B lamp and then hot air-dried, had relatively good retention of vitamin D2 up to eight months when stored in dry, dark conditions at 20°C in closed plastic containers

Though this quote speaks about hot air drying, I don't see why that would make a difference. Here's one more source (archive) (MozArchive) with similar results, for good measure:

The third set of mushrooms was dried outdoors in the sunlight with their gills facing upwards for full sun exposure. The most vitamin D was found in shiitake dried with gills up that were exposed to sunlight for two days, six hours per day. The vitamin D levels in these mushrooms soared from 100 IU/100 grams to nearly 46,000 IU/100 grams

Again preserving well:

Most interesting to me is that when we tested our mushrooms nearly a year after exposure, they preserved significant amounts of vitamin D2.

Is this the cheap, natural, DIY Vitamin D supplement we've all wanted? Well, there's one catch: it's Vitamin D2 and not D3 - though how much this actually matters is debated (I'll research it soon). UPDATE: I did research it, and found some beneficial effects of Vit D mushroom ingestion (animal studies, because sadly, I cannot find a single human one that bothered to test relevant dosages):

Bone building (archive) (MozArchive):

Femur BMD [bone mineral density] of the experimental group was significantly elevated compared to initial femur BMD of the study group.

Scaled to weight, the Vit D dosage used in this study would be equivalent to about 10000 IU in a human.

Immune system activation (archive) (MozArchive):

Plasma 25-hydroxyvitamin D (25OHD) levels from UVB-exposed mushroom fed rats were significantly elevated and associated with higher natural killer cell activity and reduced plasma inflammatory response to LPS compared to control diet fed rats.

This study used 600 IU, but still somehow had positive effects.

Improvement in metabolic markers (archive) (MozArchive):

A significant decrease in serum triglycerides (from 103 to 75, 69 and 72 mg/dL), total cholesterol (from 267 to 160, 157 and 184 mg/dL), and LDL cholesterol (from 193 mg/dL to 133, 115 and 124 mg/dL) along with an increase in the HDL/LDL ratio, and improved glucose levels were documented.

Prevention of liver damage (local):

that the proportion of severe liver injury (defined as ALT >2000 U/L) was 100% at the first three groups (untreated, vitamin D, and nonenriched mushroom extract), but was dramatically decreased to 0% at the enriched mushroom extract-treated group, demonstrating the synergistic effect of the two different treatments, from the second experiment.

If I understood it right, the mice were fed the mushroom meal 3 times per day, with 1,125 IU of Vit D per meal - meaning 3,375 IU overall. The mushrooms were also shown to work better than synthetic supplements.

So, those are some of the positive effects I've found. However, I cannot in good conscience fail to mention the fact that the mushroom version of Vitamin D is absorbed and utilized worse (archive) (MozArchive) in humans:

After standardizing to 100,000 units of drug, increases after cholecalciferol (2.7 ± 0.3 ng/ml) were more than twice as great as those from ergocalciferol (1.1 ± 0.3 ng/ml)

This has been shown over (archive) (MozArchive):

Both produced similar initial rises in serum 25OHD over the first 3 d, but 25OHD continued to rise in the D3-treated subjects, peaking at 14 d, whereas serum 25OHD fell rapidly in the D2-treated subjects and was not different from baseline at 14 d.

And over (archive) (MozArchive):

Subcutaneous fat content of D2 rose by 50 μg/kg in the D2-treated group, and D3 content rose by 104 μg/kg in the D3-treated group. Total calciferol in fat rose by only 33 ng/kg in the D2-treated, whereas it rose by 104 μg/kg in the D3-treated group.

Meaning, the D2 you ingest will increase your blood levels by less, and also get used up faster, than the equivalent amount of D3. There are also indications (archive) (MozArchive) that D2 cannot substitute for all the things Vitamin D3 does:

Surprisingly, gene expression associated with type I and type II interferon activity, critical to the innate response to bacterial and viral infections, differed following supplementation with either vitamin D2 or vitamin D3, with only vitamin D3 having a stimulatory effect.

This study is really the one that made me reconsider the reliance on mushrooms for Vitamin D intake. Because we can kind of deal with the lesser increases or the faster disappearance of Vit D by just taking more of it every day; but if it doesn't actually fulfill all the needed jobs, that's a tougher obstacle to clear. And for me, immune support was the main reason for recommending Vitamin D in the first place. Unfortunately, it appears that the mushroom version doesn't do as well on that front as the Sun-derived D3.

I really wanted the mushrooms to be viable, but they might not be that great after all. They do contain a lot more Vitamin D than the animal products, in addition to their wide availability, DIY nature and good preservation capability. But so what, if they cannot perform all the needed jobs? And such a shortfall would make sense, since D3 is the form we naturally make when exposed to the Sun's rays, and we have adapted to it through millions of years of wild living. Very little has been as constant during the existence of life on Earth as the Sun, and we should expect our biology to be heavily reliant on it. Now, it's still theoretically possible that the mushroom version will eventually reveal itself as an adequate replacement for the Sun one, but it's going to take some really convincing research to show that - research that doesn't exist at the moment. Though it has been shown to fulfill some of the relevant jobs, we can't be sure that it can do all of them - and there are already indications that it can't.

To be quite honest, almost all the Vitamin D2 research that is currently done is on the synthetic version, which has already been shown to matter in that liver study I mentioned. And though this is a mistake that should be remedied, we can't assume that's going to resolve all the problems. That would be shooting in the dark and going against millions of years of evolution. In the end - though supplementing with the mushroom powder should be safe - I can't in good conscience recommend it as a total replacement for the Sun. Maybe it's time to realize that the problem is our unnatural living conditions, and that the solution is just going outside.

If you want to experiment with this anyway, and use the white button mushrooms for it, then you need to cut them up first - as it's the brown insides that generate Vitamin D; and by default, the cap almost fully covers them. After you're done preparing the mushrooms, I recommend grinding them with an electric herb grinder; this will heavily reduce the space taken. You can then just take a teaspoon of powder every day you need additional Vit D. If you do not leave the house enough, and don't have access or don't want to take supplements, then this is still going to be much better than a complete Vitamin D starvation (a reality for many people these days).

Is Sun a good source?

Fortunately, there is another source - the Sun - which is where Vitamin D3 originally comes from, anyway. Getting enough from it is trickier than it seems, though. For a start, there is a concept called the Vitamin D winter (archive) (MozArchive), meaning the part of the year during which the Sun doesn't generate any Vitamin D:

Showing the concept of Vitamin D winter, in terms of how many days during the year the Sun doesn't generate any Vitamin D in different European capitals

And so, if you just happen to be deficient during the Vitamin D winter (half a year or more for some regions), the only option that remains is supplements. But what if it's summer? Well, you still need to actually get outside, and today's office-bound or homebound people might have a problem doing that every day during the proper hours (from 10 to 15, usually). Even if you do end up doing so, you need to expose enough skin to generate sufficient Vitamin D:

Chart showing the percent of Vitamin D generation with different body parts exposed

Though you can theoretically generate Vitamin D in autumn or spring, people already cover themselves up too much during those seasons. So, if you want to rely on the Sun to become Vitamin D replete, the short (for most regions in Europe or Asia, e.g.) period of summer is when you need to store enough to survive the Vitamin D winter, during which your levels will be dropping every day. This is doable in principle, but not with modern lifestyles (try telling the hikikomori he has to be naked in the Sun for an hour every day). And again - if you're reading this during the other seasons - you can't afford to wait, since the lack of Vitamin D makes you susceptible to pretty much every health problem that exists.

It would be great if we could rely on the Sun as our only Vitamin D source (hey, it's free; and cannot be affected by false advertising, or laced with who knows what additives) - and it would probably be healthier for various reasons - but the vast majority of us will not be able to manage it in this day and age. Hell, if you look at the disease incidence chart - even Outdoor workers in late summer and Tribal East Africans do not reach 50 ng/ml. Do you think you can, with the Sun alone? However - with my newfound appreciation of mushrooms as a Vitamin D source - the "weaknesses" of the Sun-only approach become much more manageable. On the days you don't go outside enough (and during the Vit D winter), just eat more of the dried mushrooms you've stored. On the days you do, you can take less or skip them altogether. This mix should make it viable to replace supplements, especially since the Sun generates Vitamin D3, which should defuse the worries about the mushroom D2 being inadequate.

What about supplements?

I do not like using those, and wouldn't even consider it for any other nutrient, as they are all relatively easily gained from a well-prepared diet. With this one, it's pretty much impossible to get enough from food, regardless of how hard you try. So the Sun remains - except modern lifestyles make that option pretty much impossible as well; and even if they didn't, it just doesn't shine hard enough for half the year in many countries. As you can see, this situation is quite special.

You have to be careful with the supplements, though, because the companies are not your friends and are in it just for profit. So you can expect them to cut corners during production (archive) (MozArchive) - resulting in less, more, or none of the advertised ingredient(s), or even the addition of other ones that they think will give you the effects you're looking for:

Twenty-three of 57 products (40%) did not contain a detectable amount of the labeled ingredient. Of the products that contained detectable amounts of the listed ingredient, the actual quantity ranged from 0.02% to 334% of the labeled quantity
Seven of 57 products (12%) were found to contain at least 1 FDA-prohibited ingredient (Table). Five different FDA-prohibited compounds were found, including 4 synthetic stimulants, 1,4-dimethylamylamine, deterenol, octodrine, oxilofrine, and omberacetam.

But that was about sports supplements - what about Vitamin D? Well, do you think that the health industry is somehow immune to this issue? Then you're in for a surprise (archive) (local):

One of the supplements (vitamin and mineral Formula F; Table 1) was purported to contain 1600 IU (40 mcg) vitamin D, 99% as D3. Analysis by UV spectrophotometry and HPLC revealed that each capsule contained a significantly higher amount of 186,400 IU (4,660 mcg) vitamin D3
In addition to this manufacturing error, there was an error in labeling recommending 10 capsules instead of one capsule per day. Thus, the patient consumed 1,864,000 IU (46,600 mcg) of vitamin D3 daily for 2 months, more than 1,000 times what the manufacturer had led the patient to believe he was ingesting.
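As a sanity check on those numbers (all figures are taken from the case report quoted above):

```python
actual_per_capsule_iu = 186_400  # found by analysis; the label claimed 1600 IU
capsules_per_day = 10            # the mislabeled dosing recommendation

daily_iu = actual_per_capsule_iu * capsules_per_day
print(daily_iu)          # -> 1864000 IU ingested daily
print(daily_iu / 1600)   # -> 1165.0, i.e. "more than 1,000 times" the labeled amount
```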

Heh. By the way, he didn't die. No kidney failure or any of those other monsters the mainstream keeps scaring us with, either. But that isn't the point. The point is, you can easily fall into a trap in this unregulated market. So, just be careful is all I'm saying. Try buying from a reputable company, if that even exists. This site seems good for recommendations. But remember that nothing is totally without risk. And the risk of being hurt by Vitamin D starvation is surely incomparably higher than what happened to this guy (still - no long-term damage, with these cosmic dosages!). By the way, this issue seems to be mostly USA-specific; a while back, an independent testing company checked many of the Vit D supplements available in Poland, and they were all correctly labeled, without random unlisted crap thrown in. But again, the only way to be really sure that you won't be harmed is to rely on the Sun.

If you decide to use a supplement, drops are more effective than tablets and capsules (archive) (MozArchive) in raising blood Vitamin D levels:

Changes in 25(OH)D per mcg D3 administered, based on the results of third-party analysis, were TAB = 0.068 ± 0.016 ng/mL 25(OH)D/mcg D3; DROP = 0.125 ± 0.015 ng/mL 25(OH)D/mcg D3; and CAP = 0.106 ± 0.017 ng/mL 25(OH)D/mcg D3
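
To make the difference concrete, here's a small sketch using the per-mcg response figures from that study (the 4000 IU example dose is my own illustration, not from the paper):

```python
# 25(OH)D rise per mcg of D3, by supplement form (from the study quoted above).
response_ng_ml_per_mcg = {"tablet": 0.068, "drop": 0.125, "capsule": 0.106}

def expected_rise_ng_ml(dose_iu, form):
    """Expected 25(OH)D increase for a daily dose given in IU (40 IU = 1 mcg)."""
    return dose_iu / 40 * response_ng_ml_per_mcg[form]

print(round(expected_rise_ng_ml(4000, "drop"), 1))    # 12.5 ng/mL
print(round(expected_rise_ng_ml(4000, "tablet"), 1))  # 6.8 ng/mL
```

In other words, the same dose taken as drops raises blood levels nearly twice as much as a tablet.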

Other issues

Vitamin C

Another nutrient in which it is very easy to become deficient. According to this UK nutrition database, all meats, eggs, milk products, nuts, seeds, grains (and their derivatives, like breads), herbs, spices, and oils contain a big fat zero of it. So you're left with fruit and vegetables. If you think you're covered because you eat your "five a day" (or whatever), I'll sadly have to douse that fire. I used to think that too, but when I finally decided to check it out, I was shocked to discover just how few items are significant sources of Vitamin C. Many common fruit and vegetables (such as carrots, celery, cucumbers, lettuce, onions, apricots, grapes, watermelon, pears, plums...) actually contain very little. Some types of beans do have relevant amounts, but I don't think many people base their diets on them (and still, many have zero). And remember that Vitamin C is extremely heat-sensitive (local) (The most dramatic AA losses occurred during the cooking stage (mean loss 58%, standard deviation (SD) 19.5%, range 33 - 81%), with broccoli showing the greatest mean loss of 81% (SD 2.9%)) and beans aren't eaten raw, which further ruins any plans of relying on them.

Time is another degrading factor; two weeks of keeping chili peppers at 5 degrees Celsius (about fridge temperature) destroys over half of their Vitamin C (archive) (MozArchive). Colder (freezer) temps do a little better, but still bring significant losses. So, it is best to eat all your produce fresh; but, in a regular store it's already not fresh (archive) (MozArchive) when you buy it. Losses due to not enough chewing / incomplete digestion also have to be considered (use a juicer / blender). And if you take all of that into account, it is easy to see how you might not even reach the government recommended 75 / 90 mg per day. There are also indications we need more (archive) (MozArchive) at least in certain disease states - The initial mean +/- SEM baseline plasma ascorbic acid concentration was depressed (0.11 +/- 0.03 mg/dl) and unresponsive following 2 days on 300 mg/day supplementation (0.14 +/- 0.03; P = 1.0) and only approached low normal plasma levels following 2 days on 1000 mg/day (0.32 +/- 0.08; P = 0.36) [...] We confirmed extremely low plasma levels of ascorbic acid following trauma and infection.

To hit the requirements, I drink a lot of pineapple juice, and assuming all of the Vitamin C ends up in the juice, I would need about 500ml - or about one medium pineapple - per day. By using the richer sources such as clementines, kiwis, oranges, strawberries, cabbage, or chili peppers, you could get by with about 200-250g of the relevant items (some databases (MozArchive) report many times higher Vit C values for pineapple, which would make it join this club). To cover the destruction of Vitamin C by long storage in stores and in your own fridge - as well as the possible variation between countries, etc - double the amounts. This is assuming you just want to hit the government's recommendations, instead of shooting higher for better disease resistance. Realizing all this, it should be obvious that your average person basing her diet on breads, meats, oils and an occasional vegetable doesn't have a prayer of reaching sufficiency. I do not like relying on supplements if it's not absolutely necessary; you have to trust a company to put in there what it has claimed to, and deal with the additives, etc. Many (MozArchive) (archive) people (MozArchive) (archive) report (MozArchive) bad reactions (MozArchive) to synthetic Vitamin C supplements. There is also evidence (archive) (MozArchive) that Vitamin C from natural sources is utilized better than synthetic - In contrast, the urinary excretion of ascorbic acid at 1, 2 and 5 h after ingestion of acerola juice were significantly less than that of ascorbic acid. And fortunately, there still exist rich natural sources of Vitamin C, so preventing deficiency is not at all insurmountable.
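
If you want to run the arithmetic yourself, here's a minimal sketch; the per-100 g values are rough illustrative figures of my own (actual numbers vary by database and freshness, as discussed above):

```python
# Grams of a food needed to hit a daily Vitamin C target, with an optional
# loss fraction to account for storage/cooking destruction.
# The per-100 g contents below are rough illustrative values, not from
# the UK database cited in the text.
vit_c_mg_per_100g = {"kiwi": 90, "orange": 53, "strawberry": 59, "cabbage": 37}

def grams_needed(food, target_mg=90, loss_fraction=0.5):
    """Grams required, assuming `loss_fraction` of the Vitamin C is already gone."""
    retained_per_100g = vit_c_mg_per_100g[food] * (1 - loss_fraction)
    return 100 * target_mg / retained_per_100g

print(round(grams_needed("kiwi", loss_fraction=0.0)))  # 100 g if perfectly fresh
print(round(grams_needed("kiwi")))                     # 200 g after 50% losses
```

The doubling rule of thumb from the paragraph above falls straight out of the 50% loss assumption.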

Selenium

We are all likely deficient in selenium. Anti-cancer effects from eating 200 mcg per day have been found (local):

Data from the Nutritional Prevention of Cancer randomized trial have shown a significant protective effect of supplementation with 200 μg Se/d, as high-Se yeast, on cancer incidence and mortality with the most notable effect being on prostate cancer, with lesser effects on colorectal and lung cancers.

Another study (archive) (MozArchive) showed beneficial effects for diabetes:

A 200 μg/day selenium supplementation among patients with T2DM and CHD resulted in a significant decrease in insulin, HOMA-IR, HOMA-B, serum hs-CRP, and a significant increase in QUICKI score and TAC concentrations.

And yet, the average is a lot lower (archive) (MozArchive):

The level of selenium intake in Poland ranges from 30 to 40 µg/day [70]. In Spain, the intake of selenium is 44–50 µg/day, in Austria it is 48 μg/day, while in Great Britain it is 34 µg/day [30,78]

The best food source is brazil nuts, with 50-300 mcg per nut (archive) (MozArchive) - meaning one per day should provide enough on average.
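
A quick check of that claim, taking the midpoint of the per-nut range plus a typical dietary baseline from the intake figures quoted earlier (both numbers are rough):

```python
# One brazil nut: 50-300 mcg selenium; take the midpoint as a rough average.
per_nut_mid = (50 + 300) / 2   # 175 mcg
dietary_baseline = 40          # mcg/day, roughly the European intakes quoted above
print(per_nut_mid)                     # 175.0
print(per_nut_mid + dietary_baseline)  # 215.0, around the 200 mcg/day from the trials
```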

Magnesium and calcium

Scaled to weight, a wild howler monkey ingests 57 times more calcium and 38 times more magnesium (archive) (local) per day than what is recommended for us:

Amounts of minerals that a wild howler monkey eats per day compared to human recommendations

In that case, even massively increased amounts compared to what we ingest today shouldn't be harmful (they aren't for those monkeys) - and might be beneficial. Even if we assumed that we metabolize magnesium 10 times better, that still leaves a roughly 4-fold increase that we probably should be getting. The commonly ingested averages (archive) (local) are very close to the recommended amounts, which seem inadequate:

Estimates of people's magnesium intakes in various countries

This could explain why heart disease is the most common ailment of today's people, since magnesium is involved in a lot of heart functions (archive) (MozArchive).
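
The scaling argument above, reduced to arithmetic (the 10x utilization advantage is pure assumption, granted in humans' favor):

```python
# Howler monkeys ingest ~38x our recommended magnesium per unit body weight.
monkey_multiple = 38
assumed_utilization_advantage = 10  # generous assumption, not a measured value
print(monkey_multiple / assumed_utilization_advantage)  # 3.8, i.e. roughly a 4x gap
```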

Manganese

Probably the most deficient nutrient in the usual diet, since the vast majority of common products (white bread, meats, oils, eggs etc) have very little of it. I'll let Jane Karlsson (archive) (MozArchive) (an expert on this topic) speak:

The western diet looks almost as if it was designed to be Mn deficient. Lots of animal foods which have hardly any and can be very high in iron; lots of saturated fat which inhibits Mn absorption and increases iron absorption https://ncbi.nlm.nih.gov/pubmed/11697763 and lots of refined carbs whose Mn has been removed and replaced with … iron.
The importance of Mn in cell biology can hardly be overstated. MnSOD actually prevents aging.
I find this astonishing. It means Mn deficiency is arguably the most important cause of all the age-related diseases we see today.

And from another thread (archive) (MozArchive):

The average manganese intake in the US is 2 mg/day. Surprisingly, the RDA is also 2 mg/day. According to the Linus Pauling Institute, it was decided on the basis of no evidence that the average intake was enough. It might be enough in a low-iron diet, but the western diet is very high in iron.
Manganese deficiency is implicated in diabetes, and the Ma Pi diet which apparently cures diabetes has 16 mg/day of manganese. This is 8 times more than the RDA. https://www.ncbi.nlm.nih.gov/pubmed/22247543

In some old studies, manganese-containing items have been shown to manage diabetes better than insulin:

He was treated with soluble insulin and long-acting insulin intramuscularly in large doses (100-200 units daily), but he responded poorly.
With an interest born of despair, we allowed him to prepare in the ward an extract by boiling the green leaves of alfalfa in water, and to drink the infusion. By this time his blood-sugar level had reached 648 mg. per 100 ml. Two hours later, he had clinical signs of hypoglycaemia and his blood-sugar was 68 mg. per 100 ml. The test was repeated thereafter on twelve occasions, the alfalfa extract being administered when his blood-sugar varied between 190 mg. and 580 mg. per 100 ml. The infusion was also given at different intervals after food and at varying times of the day. On each occasion there was the same predictable hypoglycaemic response
With 10 mg. of manganese chloride orally three times a day, fairly satisfactory diabetic control was obtained and it was possible to discharge the patient on this regimen.

It wasn't a cure (but neither is insulin). Either way, it's a promising area of research that still hasn't been taken up, I think. Lots of unanswered questions in that paper, and overall. The authors thought that the patient wasn't deficient in manganese:

It followed, therefore, that the patient was absolutely or relatively deficient in manganese. As there was no dietary deficiency, and as the daily requirement of manganese is very small, an absolute deficiency seemed unlikely, although this possibility could not be entirely excluded, because of the observation of an increased loss of manganese in the patient’s urine.

But again, this is assuming that the official requirements have been correctly set (they haven't).

The single best source of manganese is pineapple juice, and if you drink that, there is no need to worry about being deficient. Otherwise, whole grains.

Iodine

A similar trick was pulled here as with Vitamin D. It was assumed (MozArchive) that iodine has only one job in the body, and if that job was fulfilled, the entire requirement for the day would be met:

Thyroidal radioiodine accumulation is used to estimate the average requirement. Turnover studies have been conducted in euthyroid adults (Fisher and Oddie, 1969a, 1969b). In one of these studies, the average accumulation of radioiodine by the thyroid gland for 18 men and women aged 21 to 48 years was 96.5 μg/day (Fisher and Oddie, 1969a). The second study involved 274 euthyroid subjects from Arkansas. The calculated uptake and turnover was 91.2 μg/day (Fisher and Oddie, 1969b).

And those two studies from 1969 (!) have decided the human iodine requirements apparently for all time. And yet the same document says:

Observations in several areas have suggested possible additional roles for iodine. Iodine may have beneficial roles in mammary dysplasia and fibrocystic breast disease (Eskin, 1977; Ghent et al., 1993). In vitro studies show that iodine can work with myeloperoxidase from white cells to inactivate bacteria (Klebanoff, 1967). Other brief reports have suggested that inadequate iodine nutrition impairs immune response and may be associated with an increased incidence of gastric cancer (Venturi et al., 1993).

So - if iodine also has immune-boosting and anti-cancer effects, then why are only its thyroid-related jobs considered for determining adequate intakes? Again, Thyroidal radioiodine accumulation is used to estimate the average requirement. But hey, it's now 2024, and we have more direct evidence that iodine is involved in beating infectious disease (archive) (local), for example:

In the lungs of KI[potassium iodide]-treated lambs, we found reduced hRSV A2 RNA levels and a reduction in gross and histological lesions compared with untreated lambs infected with hRSV A2. After 6 days of I2 supplementation, treated lambs had significantly increased [I2] in tracheal ASL compared with nonsupplemented control lambs. These findings suggest that I2 administration is associated with reduced RSV disease severity.

This also puts the whole Covid situation into a different light. The seafood-eating Japanese had an extremely low death rate (archive) (MozArchive) of 0.22%. Anyway, how much iodine should we be getting? Annoyingly, the issue hasn't been researched as much as I'd like, or very much at all. Iodine deficiency was found to be common in:

It is also important to realize that using the mean to judge iodine deficiency in a population is flawed. Some people will score (way) above it, and some below; the few who have ingested a lot of iodine then mask the many who had too little. This is exactly how they hid the extreme rate of iodine deficiency in that Germany study. If you read the entirety of it, you find this quote: Moreover, 283 (36%) probands had mild iodine deficiency with urinary iodine excretions between 50–99 µg/g. Yet the mean was borderline adequate...while 36% of people were deficient...hahaha. And again, all of this is according to the surely understated official standards. I'd assume that the real prevalence of deficiency - taking into account all of iodine's jobs - would be more than 80%. And it's hard to see it otherwise, as the common diet contains basically no iodine. Fruit, vegetables, nuts, seeds, grains, and spices all have none or very little. Land meat has some but not enough (you'd need to eat about 2-3kg of it per day to fulfill the official iodine requirements with it alone). Some people look at the nutrition tables and see that milk has a lot, so they dismiss the issue. But it actually might not be that good of a source (archive) (MozArchive) because:

Milk iodine concentrations in industrialized countries range from 33 to 534 μg/L and are influenced by the iodine intake of dairy cows, goitrogen intake, milk yield, season, teat dipping with iodine-containing disinfectants, type of farming and processing.

Teat dipping with iodine-containing disinfectants...seriously? Anyway - unless you're sure that the cows in your area are having that teat dipping done - you shouldn't rely on milk. Eggs suffer from the same issue, but even more so, since there the teat dipping doesn't come into play. But it gets worse; much of the iodine is - as expected from what happens to other nutrients - destroyed by cooking (archive) (MozArchive):

It was found that the mean losses of iodine during different procedures used was 1) pressure cooking 22%, 2) boiling 37%, 3) shallow frying 27%, 4) deep frying 20%, 5) roasting 6%, 6) steaming 20%.
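
Put differently, here's what those losses mean for the iodine actually left on your plate (a sketch using the mean loss figures from the quote; the 100 mcg raw content is just an example):

```python
# Mean iodine losses by cooking method, from the study quoted above.
loss = {"pressure cooking": 0.22, "boiling": 0.37, "shallow frying": 0.27,
        "deep frying": 0.20, "roasting": 0.06, "steaming": 0.20}

def retained_mcg(raw_mcg, method):
    """Iodine left after cooking, given the raw content in mcg."""
    return raw_mcg * (1 - loss[method])

print(round(retained_mcg(100, "boiling"), 1))   # 63.0 mcg left from 100 mcg raw
print(round(retained_mcg(100, "roasting"), 1))  # 94.0 mcg - roasting is gentlest
```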

Knowing all this, it's obvious that the average person cannot even sniff sufficiency through the common food items. The only significant sources of iodine are seaweeds and fish. It's how the Japanese (archive) (MozArchive) manage to reach intakes of 1-3mg per day (By combining information from dietary records, food surveys, urine iodine analysis (both spot and 24-hour samples) and seaweed iodine content, we estimate that the Japanese iodine intake--largely from seaweeds--averages 1,000-3,000 μg/day). Even the compromised Institute of Medicine admits 1mg per day is non-toxic, so aiming to replicate the Japanese should be safe. We really don't know how much we need, as the research to discover true iodine requirements in humans has not been done. People care so little about nutrition that most food items haven't even had their iodine contents tested at all. In a sane world, it should be the most developed science out there, since it's the basis of pretty much all disease. But people would rather do wars and space missions and "marketing science" and other useless stuff. Anyway, we're all getting too little iodine unless our diet already includes ample seafood. We just don't know how much we're lacking, and what the exact effects at different intakes are.
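
As a toy illustration of the earlier point about means hiding deficiency (the numbers below are made up, but mirror the Germany situation):

```python
# Hypothetical urinary iodine values (mcg/g) for 100 people: a minority of
# heavy seafood eaters drags the mean up while most fall below the cutoff.
urinary_iodine = [40] * 36 + [80] * 34 + [200] * 30
mean = sum(urinary_iodine) / len(urinary_iodine)
share_deficient = sum(v < 100 for v in urinary_iodine) / len(urinary_iodine)
print(round(mean))                   # 102 - the mean looks "adequate"
print(round(share_deficient * 100))  # 70 - yet 70% are below the 100 mcg/g cutoff
```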

This nutrient isn't like Vitamin D, however, and you can definitely get too much and hurt yourself. Though some people have benefitted from iodine megadoses, others have been wrecked by them. Even the megadose-using doctors have found iodism (iodine poisoning) in a significant percentage of their patients - a positive correlation was found between dose and iodism rate: zero percent iodism at a daily amount of 1.4-2 mg; 0.1% iodism with 3-6 mg daily; 0.5% with 9 mg and 3% with 31-62 mg (this is from the book Iodine for Greatest Mental and Physical Health). So, I don't recommend the megadoses (but, if you look carefully, <=2mg was completely safe). If iodine works like every other nutrient, it's going to be used up more in disease states, which is why I suspect some can endure the megadoses and some can't. Lacking companion nutrients is another possibility; selenium (archive) is a candidate. Worrying about iodine poisoning is totally pointless for the common person, though, as it's pretty much impossible to reach those dangerous amounts through food; the real issue is avoiding severe deficiency while eating the "regular diet". As usual I recommend food sources, which appear to be utilized better than supplements:

Iodine excretion of seaweed vs supplements

If you're buying seaweed products, the label should list the iodine content.

Coconut oil

This is definitely the best oil to use, and the others cannot even lick its boots. If I had to rate the oils in terms of healthfulness, the list would look like this:

Coconut oil has proven to have benefits against pretty much everything you can think of (archive) (MozArchive) (and even what you cannot think of). This probably means it is affecting some fundamental processes inside the body - Ray Peat has some ideas about that (archive) (MozArchive). On the other hand, you can barely find even one benefit for those other fats aside from providing a lot of calories. Olive oil does have some antioxidants but no proven beneficial effects in scientific studies as far as I can see (yes, I actually looked a while back), and the significant amount of PUFA might still be dangerous when fried. Butter is probably harmless but doesn't do much other than loading you up with calories and an irrelevant amount of a few vitamins. The reason other animal fats are worse is because they have more PUFA and it is actually the fat in an animal that is the storage organ for toxins. The only reason coconut oil isn't recognized as the elixir of Gods is because of the cholesterol scare (even though CO won't raise cholesterol by a relevant amount, or at all) that somehow can't die yet.

Fermented cod liver oil

Quite a ruckus happened in natural health circles back in 2015 when Kaayla Daniel did independent testing and came up with a report exposing FCLO as a total scam (I don't necessarily agree with all the claims in there, though). Why am I bringing this up now, though? Well, this crap is still being marketed and sold as a health food. And the Weston A. Price Foundation still shills it (archive) (MozArchive) on its website, with fake lab tests to support it (this - in my opinion - makes everything written on that site suspect). This fraud is kind of unique in that literal rotting is twisted into some kind of a health promoting process. At least other frauds, like the overpriced Vitamin C tablets, are usually harmless. On the other hand - if you read Kaayla's report - it proves that many toxic substances created by fish corpse decomposition exist in the FCLO. When testimonies started piling up (archive) (MozArchive) that this stuff was harmful to human health, the usual suspects doubled down instead of dropping the product.

Health matters not in a world based on the profit motive, except as an empty promise from the product manufacturer. With FCLO it's even worse than usual, because the promise was not just unfulfilled but completely reversed - people's health was sacrificed on the altar of the almighty dollar. You pay almost 40 US dollars (archive) (MozArchive) for a bottle of something that poisons you and eventually kills you (archive) (MozArchive). Imagine how many people have spent fortunes on this rotten poison pretending to be a health food, hoping to fix their health problems as they were promised - only to get cancer or heart failure and die. All while the sellers and the shills raked in the cash. What a cruel joke.

When are people finally going to recognize the profit motive as the main contributor to such situations? I mean, Green Pasture and the WAPF shills have discovered a gold mine with this product and are not going to abandon it (though, some WAPF members (archive) (MozArchive) did leave the organization once they realized what's up, showing that ethics prevail at least sometimes), knowing how much power money gives you in this world. It's not that most or even a significant amount of people are inherently evil; it's that the world rewards evil behaviors like fraud by giving the perpetrators a lot of money to allow them to cease working and not worry about survival, enjoy stability, high social status, trips, premium items, etc. I bet that a lure this attractive pulls in a lot of fish.

I don't think that all the nuances of this product have been figured out. We don't really know how much of the relevant vitamins are actually in it. Not all of the samples were rotten; we don't know why that is so, either. Maybe they were adulterating them with other oils or safer production processes were figured out later. But there's too much secretiveness on the part of the manufacturers, which is expected when profit motive is involved. And, people have already died from this, so it's simply practical to avoid it, either way. Even if there weren't any production issues, etc. the fish liver (from which the FCLO is made) still contains an unphysiological amount of PUFA for humans. Find your vitamins elsewhere, like the Sun and the juice.

Obesity

It doesn't happen because of overeating. This should be obvious - after all, every one of us knows someone who eats however much they want and doesn't gain weight, as well as the opposite: the person who eats little but stays fat. Matt Stone's The Calorie Myth series should put that idea to rest. Here is a more direct disproof:

Going back to this natural diet has changed gorilla behavior. Before, gorillas only ate during a quarter of their day because the food was so packed with nutrients. Now at Cleveland, they spend 50-60 percent of their day eating which is the same amount as in the wild. With all this extra eating, the gorillas have doubled their caloric intake, yet at the same time have dropped 65 pounds each. This brings their weight more in line with their wild relatives.

The currently accepted model of obesity (archive) (MozArchive) is based on the idea that the more food you put inside your mouth (and the less that is "burned" through exercise), the more will end up inside your fat cells and increase your fat mass (we could call it the "gas tank" model of obesity). If you read that link carefully, you will see that even when they admit some factor other than overeating is responsible for obesity (e.g. medications), they still try to spin it as "making you eat more". So, the current theories cannot look beyond gluttony and laziness as explanations. They, of course, need to assume that the only function of bodyfat is energy storage - which is totally wrong (for example, toxins are stored in the fat tissue (archive) (MozArchive)). If we realize this, then isn't it rational to assume that the body increases the fat size because it wants to have a place to stick the toxins, instead of the person getting fat because the "energy balance" of their fat tissue became positive?

The mainstream theories also assume that energy storage (or "burning") is the only route food can go through. Bowel movements alone completely destroy that idea, when you realize that fats can come out in the toilet (I mean, do you really think that if you drank a tub of olive oil, you'd absorb everything from it?). I cannot find science to directly test that, but I have found (archive) (MozArchive) a study that shows a herbal formulation increasing the amount of fats that go down the toilet. Another example is how a high fiber diet more than doubles (archive) (MozArchive) the amount of fats and carbohydrates dumped through the bowel route. Therefore, it is obvious that the amount of ingested food doesn't directly determine the amount of fat gained (as in the previously mentioned "gas tank"). Now, this doesn't mean that those ways will necessarily make you lean - but they do give the body a way to dump energy through a route other than "burning", changing the focus from obesity being caused by an influx of food to the conditions of the body.

It is universally accepted that every other tissue (muscle, bone, brain, tumor, etc) has many mechanisms in the body regulating its size. Why - then - think that the body somehow loses its regulation capabilities for only the fat tissue, and the problem is suddenly about human behavior? Saying that someone became fat because food went there is like saying that someone grew hair or a tumor because food went there, instead of being the result of complex biological mechanisms. Yet that is what everyone (including scientists, fitness trainers, nutritionists, or other experts in the field who should really know better) is doing today.

It annoys me greatly how the caloric theory has consumed obesity thinking for the past century when there hasn't been any real evidence to show that it's valid. Even though every other disease is rightly recognized as a result of a multitude of different factors, obesity has been reduced to gluttony and laziness. I wonder if this isn't just a hidden desire to play the blame game against fat people, finally having a chance to surface. Either way, as long as that attitude persists, obesity research won't progress and the people suffering from it won't get the help they need and deserve. By the way, journalist Gary Taubes (archive) (MozArchive) has made many of the same points as me here - but fumbled right at the finish line by blaming obesity on the increased carbohydrate intake. However, his overall influence is very positive - unlike the "energy balance" dimwits who have imprisoned obesity research for the past century. After all, how can you figure out the factors making people fat when you have already decided it all boils down to their mental failings that make them eat too much and sit on their asses?

What to do if you're fat, then? Well, ensure you are eating a nutritious diet, for one. Vitamins, minerals, etc. are all involved in countless biological functions in your body, including body fat regulation. All other diseases are affected by low nutrition levels, so why expect obesity to be different? Again, don't try to starve obesity; it makes as much sense as trying to remove your chest hair by not eating - while ignoring the hundreds of hormonal, etc factors behind chest hair. Actually, Matt Stone has had success (archive) (MozArchive) with the overeating approach to weight loss! Yes, you read that right. When you accept that the body is able to regulate its own weight when you give it what it needs, this is expected. I mean, how can anyone think that long term starvation (the Minnesota starvation experiment (archive) (MozArchive) tested this, by the way - with the predictably abysmal results) is a good idea? Yet that is what all the world's authorities recommend to lose weight.

Obviously avoid ingesting too much of anything toxic like seed oils (archive) (MozArchive) or SSRIs (MozArchive) which increase the amount of fat tissue in your body (which in those cases is seemingly trying to protect you by stashing away the toxins). Add coconut oil which seems to send the body into overdrive in terms of using instead of storing energy. But I think it's because of the overall good effects it has on hormones etc, and thinking that it's simply "speeding up metabolism" is just falling yet again into the caloric trap we've tried hard to avoid (because, if we stopped using coconut oil, then surely "energy expenditure" would decrease, and we'd get fat again? But no, that's not necessarily so if you take bodily regulation into account). Of course, I don't claim to have all the answers for obesity, but it's a start. There are surely more factors other than eating, like good sleep, lessening needless stress, etc. Coming up with anything more substantial will first require removing the grip that the caloric theory has on all things obesity.

Ketogenic diets

They are unsustainable, pointless, and dangerous. You are required to remove from your diet food groups that are not only harmless but very beneficial. Whole grains, fruit, beans, potatoes and honey - for example. Certainly anything close to a "normal" life will not be possible anymore. You are pretty much left with just meat, eggs, butter, cheese, nuts and some vegetables. But even the nuts and vegetables have to be curtailed (archive) (MozArchive):

Be more careful with slightly higher-carb vegetables like bell peppers (especially red and yellow ones), brussels sprouts and green beans to stay under 20 grams of carbs a day. The carbs can add up. For instance, a medium-size pepper has 4-7 grams of carbs.

You really need to autistically count the carbohydrates (and protein) in everything that passes your lips, every second of every day. Even that, seemingly, isn't enough (archive) (MozArchive) to bring about ketosis. Wow, that link is really gold. Someone was eating all the right keto foods, yet still wasn't in ketosis when he measured his ketones. So he had to drop the keto angel, broccoli. Hahahaha...oh wait, it's not funny. You're denying yourself everything and yet still can't reach your prized ketotic state. And of course, if you dare to put a few slices of bread in your mouth, it's back to square one...for a week (archive) (MozArchive). Do you know why it works like this? Because ketosis is fucking unnatural (to clarify, the only reason it even exists as a metabolic state is so that our energy-intensive brain can keep itself running during long-term starvation)! If you look at hunter-gatherer groups (current or extinct), not a single one of them is in ketosis - not even the Inuit (local) - Yet seminal studies carried out on Inuit subjects in the early twentieth century yielded surprising results from a metabolic perspective. Low ketone bodies in the breath and urine were observed in the fed state [...] These results suggest that the traditional Inuit diet may not actually be ketogenic, as commonly assumed, despite being very low in carbohydrate. Yet the keto cultists will try everything to reach that state for no reason, including guzzling butter (archive) (MozArchive) or eating these keto delicacies:

Photos of dumb keto recipes

The reason you crave those abominations is because your body wants the real thing! It is begging to have a truck of carbs delivered down its throat. UPDATE April 2024: real life proof this actually happens:

Anyone else's dreams involve eating a bunch of carbs, or is it just me? (archive) (MozArchive)
But... I dream of carbohydrates. Literally. I go to sleep and, most of the time, a key feature of the dream is cheating on the keto diet and shoveling down a bunch of carbs. At first it was cake and ice cream. More recently it's less desert and more stuff like noodles.

All the commenters had similar stories. Don't you see how keto is just an eating disorder? Drop it and eat all the carbs you want, they won't kill you. In fact they are an essential nutrient (archive) (MozArchive); here we have a keto talking point refuted over a decade ago that the cultists still haven't picked up on. And of course, if carbohydrates were an essential nutrient, you'd expect there to be many side effects from not ingesting them. And that is exactly what we see; hey, the cult leaders have even graciously collected them for us:

List of keto side effects according to the keto supporters

Fucking induction flu? Have you ever heard of a "high carb induction flu" or a "vegan induction flu" or maybe a "grain eater induction flu"? Why are we eating a diet that gives us a flu? Or for that matter any of those other listed side effects? Because I haven't heard of constipation or performance decreases for any other diet (aside from long term malnutrition). Anyway, let's check out what the physical performance (archive) (MozArchive) page from the keto cult leaders says:

The second cause of reduced early performance isn’t as quickly fixed. It simply takes time for your body – including your muscles – to shift from burning sugar to burning primarily fat for energy. It can take weeks or even months. After the adaptation period, some may see significant benefits (see below).

So they know very well that keto destroys physical performance, but they pretend that magical metabolic changes will happen months down the road to reverse all the bad effects (though even they admit that only some will see significant benefits). Of course, those metabolic changes don't exist. Intense exercise relies on glycogen, which a ketogenic diet doesn't provide in sufficient quantity - because biochemically, glycogen is created from carbohydrates:

Comparison of glycogen stores on diets of various carbohydrate amounts

How do you expect to do intense exercise with almost 3 times less stored glycogen than usual?

Questions remain regarding if the adaptation period was long enough, but suffice it to say that this is still an area of debate without a clear answer.

We do have a clear answer - athletes need lots of carbohydrate because nothing else supports the glycogen-depleting activities of sports:

Graph showing high fat diets lower athletic performance even after a 7 week long adaptation period

Figure comes from here (local). As you can see, after 7 weeks of "fat adaptation", the performance of the "fat adapted" subjects was still a lot worse than that of the carbohydrate-ingesting ones. Many similar studies exist, too. You could say you need even more months of adaptation, but there is no evidence (or even a biological mechanism) showing that such a prolongation would have a point. And many athletes have already tried (archive) those long-term ketogenic diets with no magical metabolic improvements. There is a reason why world-class sports players are not using ketogenic diets. Edit: I actually tried to find contrary examples just now, to be sure there really weren't any. This site (archive) (MozArchive) claims that Kobe Bryant and LeBron James have done or are doing ketogenic diets. That's exactly the support the ketogenic movement needs to defuse the worries of would-be athletes joining it, right? Unfortunately, those two do not provide it. Though Kobe Bryant did seemingly switch to a low(er)-carbohydrate diet (archive) (MozArchive), the article doesn't mention the exact amount - but it certainly isn't ketogenic:

Sugars, specifically anything with corn syrup, should be avoided, and the intake of carbohydrates has been scaled down, consumed in moderation.

If it's consumed in moderation, then for an athlete that's probably like 150-200 grams per day - way beyond the point at which you're kicked out of ketosis. What about LeBron? Well (archive) (MozArchive):

"I had no sugars, no dairy, I had no carbs. All I ate was meat, fish, veggies and fruit. That's it. For 67 straight days," he shared.

If he ate fruit, then he wasn't in ketosis - probably a similar carbohydrate ratio to Kobe's. Besides, this was the off-season, during which the extreme performance requirements of top-level play do not apply. I think it's time to admit that playing professional sports is simply impossible while on keto. I mean, you can be sure that the top athletes - whose legacies and standards of living depend on what they show on the field - have already explored that path, but realized that it just doesn't bring the necessary results. It might work for weight loss in the off-season; but so would raw food veganism or any other restrictive diet. Low carb - by the way - is entirely different from keto, at least as far as sports performance is concerned, because you still have quite a bit of glycogen storage. Anyway, let's now check out what the keto cultists have to say about constipation (archive) (MozArchive):

The prevalence of constipation on a low-carb or keto diet can be as high as 50% according to some studies. Clinicians familiar with low-carb diets, however, feel it is closer to 25%.

Yeah, that 25% figure surely saves keto when you realize that the regular prevalence of constipation (archive) (MozArchive) is 12%. And keto is supposed to be a healthy diet, so it should score better than the average!

Anyway, one of the ways the cult leaders recommend curing constipation is to eat more fiber - but hey, wasn't eating broccoli kicking people out of ketosis? There are some additional ways, but why deal with all of that when you don't have to? You can just eat fruit, vegetables, grains...and all the stuff will get pushed out. There's no way constipation is going to persist on a diet full of plants (unless you're lacking nutrition). By the way, herbalists recommend having a bowel movement at least twice a day; by that standard, the real prevalence of constipation in keto dieters (or anyone) is a lot higher than it seems.

Moving on, here (archive) (MozArchive), in 5. Can you get nutrient deficiencies on low carb? - the cultists assume that the keto diet is nutritionally complete. Guess what? The nutrient missing from the keto diet is the fucking carbs! And as the multitude of keto side effects has already shown, they are very important. Another one is manganese, which is almost completely missing from a diet heavy in meat, eggs, fats and some vegetables - but plentiful in grains and some fruit. And fiber (the amount in your usual meat+fat ketogenic diet is ZERO; the puny amount of vegetables adds only a little - remember that more than 500 g or so of plant matter will already kick you out of ketosis), and vitamin C (same as fiber), and potassium (on keto you need supplements, while plant eaters will reach sufficiency easily from potatoes, bananas, juice, or beans). Hell, did you know that the keto diet is literally the only one that's deficient in sodium, of all things? The cult leaders even admit (archive) (MozArchive) that In addition, on a low-carb or keto diet, your sodium needs may actually increase, due to increased losses via the kidneys. Why eat a diet that pisses away all your precious minerals? By the way, this fits right in with fructose increasing mineral retention (archive) (MozArchive) - Fructose affects the body's ability to retain other nutrients, including magnesium, copper, calcium, and other minerals. Comparing diets with 20% of the calories from fructose or from cornstarch, Holbrook, et al. (1989) concluded "The results indicate that dietary fructose enhances mineral balance.". However, the rats on the sucrose diet, also vitamin D deficient, had normal levels of calcium in their blood. The sucrose, unlike the starch, maintained calcium homeostasis. And a ketogenic diet will have zero (or effectively zero) fructose.

For the cherry on top, let's look at how the cultists justify (archive) (MozArchive) the destruction of the thyroid done by the ketogenic diet:

6. Can low carb damage your thyroid? Not likely. If you eat a well-formulated low-carb diet, it’s very unlikely it will affect your thyroid negatively.

Although some studies of low-carb, high-fat diets have shown a decrease in the active thyroid hormone T3, it seems unlikely that this represents a clinical problem.
For instance, some hypothesize that our bodies become more sensitive to thyroid hormone and therefore have a different “normal” range. Others suggest fat is more metabolically efficient, and therefore less thyroid hormone is required to metabolize it.

Is there any evidence - or even a plausible biological mechanism - for this fantasy? If not, then why is it even being brought up? This logic would never be used if something like hemoglobin were out of range; a "danger!" signal would sound in your brain, if you were sane. Especially if a wave of negative effects was staring you in the face (see later). But the keto cultist would instead come up with a story, such as that you needed less blood while on keto - and boom, problem solved (or more like swept under the carpet). Anyway, messing with the thyroid hormone doesn't seem smart knowing how important it is (archive) (MozArchive):

Broda Barnes, more than 60 years ago, summed up the major effects of hypothyroidism on health very neatly when he pointed out that if hypothyroid people don't die young from infectious diseases, such as tuberculosis, they die a little later from cancer or heart disease.
Arthritis, irregularities of growth, wasting, obesity, a variety of abnormalities of the hair and skin, carotenemia, amenorrhea, tendency to miscarry, infertility in males and females, insomnia or somnolence, emphysema, various heart diseases, psychosis, dementia, poor memory, anxiety, cold extremities, anemia, and many other problems were known reasons to suspect hypothyroidism.

To finish this off, let's enter the lion's den and check out what the actual keto dieters are reporting:

These are all signs of low metabolism AKA low thyroid function. You can enter the keto subreddit and find a complaint basically every second thread (it is an interesting game to play for a while, but it gets boring quickly). And they could all be avoided by...not going keto. See, the fundamental problem with the keto diet is that it is completely pointless to do. The only "rationale" for it is that "carbs are bad", which doesn't have a shred of evidence behind it. Carbohydrates have been exonerated from everything they have been accused of. They do not cause obesity, diabetes, heart issues or anything else (except proper functioning of the human body :D). Many populations eat moderate or high carb diets and enjoy very good health (such as the historical Hunza). If so, why burden yourself with all the restrictions that the keto diet requires, when it doesn't even bring any positive effects? Not even diabetes (local) - which is an often advertised reason to do keto - shows an advantage over plant-based (and non-carb-limiting) diets:

Ma-Pi 2 diet derived 72% of energy from carbohydrate, 18% from fat, and 10% energy from protein, fiber equal to 30 g/1000 kcal
All patients in the Ma-Pi 2 diet group had their glucose levels reduced to the point of being comparable to subjects without type 2 diabetes (target values), following 21-day intervention in a supervised environment.

Beans

These posts (archive) have engraved themselves in my mind since I read them:

Recently Per Wikholm’s old friend and co-author came out with a book about beans and resistant starch in Sweden. He wrote it with a guy who suffers from T1D. The guy simply started eating 1/2 cup of beans with every meal, and this caused ridiculously stable blood sugars, which he didn’t have even on low carb.
I’m reading the book now. The T1D guy was on 100 units of insulin a day doing low carb. After adding the beans to each meal he’s down to 20 units a day (the authors comments that this is counterintuitive, adding carbs gives less insulin). Some days he can go completely without “food insulin”, only taking the baseline insulin. He also lost lots of weight.

An insulin-dependent diabetic being able to drop 80% of his insulin dosage is just insane. This is in comparison to a low carb diet, remember - proving that the latter is not the best approach for diabetes. Modern medicine cannot touch those results, either.

Whole vs refined grains

Graph showing the loss of nutrients from refining grains

Speaks for itself. Please remember that all junk food, white bread and white rice are refined.

Cholesterol

From an interview with Ray Peat (archive):

The high cholesterol that develops in most people as they age is another thing that, in the '30s and '40s, many researchers recognized that high cholesterol was nothing but an indicator of low thyroid, the same way low blood sugar was mostly an indicator of low thyroid. There were published studies in the middle 1930’s which showed that when you took out someone’s thyroid gland, immediately the cholesterol went up, and when you gave them a thyroid supplement, immediately the cholesterol goes down.

So, high cholesterol really exists because your body isn't converting it into the steroid hormones, etc. - which are all biochemically made from cholesterol. And guess what makes someone low thyroid? A keto diet (among other things). In contrast, a high carb diet with lots of nutrition would likely be curative. Statins are very harmful (archive), and should probably never be taken.

The Paleo movement

The idea behind it is that we're adapted to a completely different environment than the one we're currently living in, and this disconnection is making us sick. Sounds great, and following this basic rule would mean that you:

But the execution of this movement could barely have been worse. Look, they fell right at the first hurdle. Here is a recommendation from Loren Cordain's (the person who invented the Paleo diet) website:

Healthy oils (olive, walnut, flaxseed, macadamia, avocado, coconut)

Remind me: when were those oils invented? They can't exist without heavy technological processing that is very recent (less than 200 years old) - exactly what the Paleo diet was supposed to protect us from.

I totally believe that - if people moved their diet and lifestyle in the direction of actual Paleolithic people - they'd be a lot healthier than average. But that isn't what happened in the Paleo movement, and it seems it was destined to fail. It is unfortunate that, early on, it got taken over by the low carb / keto cultists (such as Mark Sisson, who still had lots of good lifestyle advice but pointlessly restricted carbs). Do you really think your ancestors would have dropped that juicy pineapple to the ground just because they thought carbohydrates were the devil? Lol (archive):

There’s evidence that several of the fruits we enjoy eating today have been around for millennia in much the same form. For example, archaeologists have uncovered evidence of 780,000-year-old figs at a site in Northern Israel, as well as olives, plums, and pears from the paleolithic era.

Paleolithic people also ate grains and tubers (archive):

Survival may have hinged on oats some 33,000 years ago at the Italian cave called Grotta Paglicci. Inside the cave, archaeologists have uncovered paintings and what must have been a cherished tool: a sandstone pestle about 5 inches (11.8 cm) long. Analysis reveals the pestle was studded with starch granules from a cornucopia of plant materials, including grasses similar to millet and what might be acorns, the researchers report in this week’s Proceedings of the National Academy of Sciences. But the most common starch was from oats.
Pre-agricultural people also carbo-loaded on the tubers of the purple nut sedge, a noxious weed; underground stems of the cattail, which may have been ground into flour; and the seeds of wild wheat.

Which were one of the factors (archive) that enabled the human brain size to increase (and create civilization as we know it):

Dr Hardy explained that after cooking became widespread and the salivary amylase genes multiplied, this increased the availability of dietary glucose to the brain and foetus which, in turn, allowed the acceleration in brain size which occurred from around 800,000 years ago onwards.

And so, the real Paleolithic diet does not provide the rationale for rejecting fruit, potatoes, or grains. And yet the Paleo website (archive) even claims that keto or carnivore are diets similar to Paleo! Lolwut? Not a single traditional culture ever ate either keto or carnivore. Actually, the Paleolithic principles should prevent disordered eating like keto, since ancient man ate everything he could find. The Paleo movement should have distanced itself from the keto cultists immediately; this might have been the factor that kept it alive and relevant. But the supporters didn't manage to figure that out, which spelled their doom.

Even in an alternate reality where Paleo did not become symbiotic with low carb and accepted fruit, grains and tubers as healthy parts of the diet - a true Paleo diet is actually impossible to do, and probably suboptimal anyway. Look, most of the stuff available in grocery stores (which Paleo man did not have, of course) has been selectively bred over centuries - and I don't see many Paleo authors recommending to base your diet on wild foods (which do have more nutrition, according to some studies). In a wild situation, the big problem for people (or any creature) is finding enough energy to sustain themselves. If you listen to this interview, for example, you will see how a hunter-gatherer's major focus is on getting food and not becoming food himself; agriculture bypasses this problem. It is also important to realize we already broke nature hundreds of thousands of years ago (archive) when we started using fire, which has allowed us to extract more value out of things like nutsedges (archive) or eggs (archive). No other animal is able to do that; they have to rely on their in-built digestive capabilities to break down their foods, wasting resources that could have been used to develop a bigger brain (see for example the expensive tissue hypothesis (archive)). Juicers, by the way, could be considered fire++; literally take 15 carrots that you'd never be able to eat in one sitting, throw them in there and just gulp down the result. I wonder if juicing will create Ubermensch :D. Either way, the nature break allowed us to improve, and using cooked foods for your Paleo diet is an admission that nature does not have all the answers (some people did seemingly realize this and came up with the Raw Paleo Diet). This doesn't mean nature is now useless; we're still dependent on biological stuff as long as we're biological organisms - so Paleo does have a point here. Basing your diet on mostly industrial products, ignoring exercise, sleep cycles etc. will never be good for us. But the effectiveness of fire - at least - does mean nature supremacy is not absolute.

Again, I don't even want to be overly critical here. Paleo could really have been great (if its supporters stuck to what it actually means), so consider this section more like mourning a good friend that died instead of an attack. And this is all because of the LC / keto cultists as well as the businessmen that came in down the road to shill their worthless supplements (archive) (I'm sure archeologists are going to find the magical disappearing supplement factories from the Paleolithic era soon).

Plants DO contain Vitamin B12!

...if they have been grown in pooped-on soil (archive):

Addition of cow dung at the rate of 10 g kg-1 increased the B12 content in barley kernels by more than threefold (from 2.6 to 9.1 ng g-1 DW) and in spinach leaves by close to twofold (from 6.9 to 17.8 ng g-1 DW)

This means that vegans don't necessarily have to be deficient in B12. The problem is with our modern farming practices, and not something inherent. And so, if you ate wild plants (which grow on ground that is pooped on all the time by birds, etc) even in a city, there shouldn't be a problem with getting enough B12. Hooray for Paleo :D. You could also eat poop directly for even better results:

A relatively large portion of the B12 produced in the gastrointestinal tract is, however, excreted through feces or urine. In sheep, for example, 95% of the B 12 produced in the gastric tract is not absorbed and is excreted by the feces (Friedrich, 1975), which in most cases lands on soil.

This is why dogs, rabbits, etc. eat their own poops, I guess. And we could do the same for our own fully natural B12 supplement, if we could get past the disgust. Animal poops can contain up to 100 times more B12 than something like milk or meat, according to the table in this study. However, I don't recommend eating poop for real; I mean, there is a reason the body gets rid of it - all the stuff it doesn't want gets dumped there. It is also full of bacteria that normally live in your colon, but probably shouldn't be ending up in your mouth and stomach. Still, it is a possibility if you're desperate for a B12 source, especially as a vegan. Technically poop isn't vegan, but if you're eating your own, there is no ethical problem. Hey, this is the Dig Deeper Club, and here we grind issues down to dust sometimes purely for curiosity reasons...

Is animal protein superior to plant?

No. Arbitrary, worthless scoring systems like PDCAAS or DIAAS are used to shill meat protein as some kind of peak to reach, that plants supposedly can't. They are both based on the absolutely retarded assumption that you will only eat one food per day, every day. I'm not making this up; look at how the Food and Agriculture Organization calculates the DIAAS (archive):

DIAAS % = 100 x lowest value [(mg of digestible dietary indispensable amino acid in 1 g of the dietary protein)/(mg of the same dietary indispensable amino acid in 1 g of the reference protein)]

Meaning, it doesn't matter how well a food scores on every other amino acid; if it's low in just one of them, that one becomes the food's grade. It's like being a student who gets A (UK/USA) or 6 (Euro countries) grades in all subjects except mathematics, where he receives the lowest grade (F / 1). Insanely, that student's "overall grade" would also be F or 1 according to the DIAAS system. Its flaws become apparent when you realize that you could just eat any other food for the missing amino acid(s) - which is what every normal person does: eat many foods per day. In that case, the defects of a single food don't matter anymore. What the body cares about is having all the needed amino acids available, which does not require receiving them from the same food or at the same time.
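To make the "lowest value wins" mechanics concrete, here is a minimal sketch of the DIAAS calculation from the formula above. The reference values and the food profile are made-up illustrative numbers (not real FAO data), and the digestibility factor is simplified to a single multiplier:

```python
# Hypothetical reference protein profile (mg of amino acid per g of
# protein) -- illustrative numbers only, not real FAO reference values.
REFERENCE = {"lysine": 57, "methionine": 27, "tryptophan": 8.5}

def diaas(food_profile, digestibility=1.0):
    """DIAAS % = 100 x the LOWEST ratio of a food's digestible
    indispensable amino acid content to the reference amount.
    A single low amino acid caps the whole score, no matter how
    good the rest of the profile is."""
    ratios = {
        aa: (food_profile[aa] * digestibility) / REFERENCE[aa]
        for aa in REFERENCE
    }
    limiting = min(ratios, key=ratios.get)
    return 100 * ratios[limiting], limiting

# A food matching the reference exactly in two amino acids but at
# half the reference level for tryptophan is graded entirely by
# that one amino acid:
score, limiting_aa = diaas({"lysine": 57, "methionine": 27, "tryptophan": 4.25})
print(round(score), limiting_aa)  # 50 tryptophan
```

Note that eating a second food rich in the "missing" amino acid during the same day fills the gap entirely - which is exactly the everyday scenario the per-food score ignores.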

But it gets worse. These scoring systems completely miss the possibility that excesses of certain amino acids might be harmful. And that just so happens to be the case. Ray Peat has nicely explained how limiting tryptophan and methionine has longevity and other benefits:

When rats were fed a diet completely lacking tryptophan for a short period, or a diet containing only one fourth of the “normal” amount for a more prolonged period, the results were surprising: They kept the ability to reproduce up to the age of 36 months (versus 17 months for the rats on the usual diet), and both their average longevity and their maximum longevity increased significantly. They looked and acted like younger rats. (A methionine-poor diet also has dramatic longevity-increasing effects.)

This totally turns the DIAAS on its head! The exact amino acids that add to a food's DIAAS score are the ones that should be limited. Hey, maybe we should use the anti-DIAAS as the rating system :D. It is the plant-based proteins that are low in the "bad" amino acids, by the way, with the animal proteins being universally high:

Amino acid breakdown of foods, part 1 Amino acid breakdown of foods, part 2 Amino acid breakdown of foods, part 3

Credit to Travis from RPF for the images. It is kind of funny that the meat lover considers methionine the "limiting" amino acid preventing plant protein from reaching high quality...when it's exactly the opposite. By the way, the FAO pretty much admits they have no idea about human amino acid requirements:

Determine amino acid requirements in different conditions and circumstances, such as in children, pregnancy, aging and exercise, as well as gender effects.
Investigate the role of specific amino acids as regulators of metabolism and other functions in various physiological and clinical states
Explore the implications of dietary protein quality on lifetime health and longevity.
Because the actual metabolic demand and requirement for amino acids is complex and not fully understood, any approach to predicting protein quality will likely be imperfect to a greater or lesser extent.
In fact no evidence of relationships between protein or amino acid intakes and health and/or disease was found which was sufficient to identify intakes associated with either optimal health or to reduce the risk of developing chronic disease.

Quotes #3 and #5 show that health has not been taken into account during the design of the DIAAS. And yet, they feel confident in setting it as some master measure of protein quality. Even to the point of requiring nutritional labels to be designed according to the DIAAS:

It is recommended that no nutrition claim should be allowed to be made for source/high protein for proteins with DIAAS less than a certain cut-off (e.g. 75).

Oh, and if you're worrying about plant protein being less digestible, don't be. The true ileal digestibility - which measures how much of the ingested protein has been digested and absorbed by the end of the small intestine - is comparable to that of animal proteins. Here's egg protein:

The true ileal digestibility of cooked and raw egg protein amounted to 90.9 +/- 0.8 and 51.3 +/- 9.8%, respectively.

~91% for cooked and ~51% for raw. Just take note of the cooked, since few people eat their eggs raw. Ok, now let's check the score for the milk protein:

but the ileal digestibility did not differ among groups (94.5–94.8%)

~95%. Now let's compare to the plant proteins. Starting with pea protein:

The true gastroileal absorption of pea protein was 89.4 ± 1.1%.

~89%. Let's continue with soy:

Soy and milk amino acid digestibility side by side

Worst-digesting amino acid still at ~89%, so no relevant difference compared to the animal proteins. Some other plants might do worse; data is lacking. But it shouldn't be assumed that all animal proteins digest well, either (as the laughably bad score of raw egg protein already proves).
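Putting the figures quoted above side by side makes the point at a glance (the soy value is approximate, read off the cited comparison figure):

```python
# True ileal digestibility figures quoted above, as percent of
# ingested protein digested and absorbed by the end of the ileum.
digestibility = {
    "egg (cooked)": 90.9,
    "egg (raw)": 51.3,
    "milk": 94.5,                    # low end of the 94.5-94.8% range
    "pea": 89.4,
    "soy (worst amino acid)": 89.0,  # approximate, from the figure
}

animal_cooked = [digestibility["egg (cooked)"], digestibility["milk"]]
plant = [digestibility["pea"], digestibility["soy (worst amino acid)"]]

# Gap between the worst cooked animal protein and the best plant protein:
gap = min(animal_cooked) - max(plant)
print(round(gap, 1))  # 1.5 (percentage points)

# ...versus the cooked-vs-raw gap within egg alone:
egg_gap = digestibility["egg (cooked)"] - digestibility["egg (raw)"]
print(round(egg_gap, 1))  # 39.6
```

In other words, the animal-vs-plant gap is about 1.5 percentage points, while preparation method alone (cooked vs raw egg) swings digestibility by almost 40 points.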

Non-restrictive eating

Almost every diet out there (whether that's low carb, paleo, ketogenic, carnivore, fruitarian, starchivore, raw, gluten-free, OMAD / warrior diet, GAPS, Peatarian, food combining) focuses on what food groups to restrict, or when. And yet we're sick at seemingly unprecedented rates, and our restrictive diets don't seem to be helping. Yes, I understand that the common diet is not necessarily a healthy one, but it still restricts many things. After all, do people on the civilized diet eat enough fruit, beans, or whole grains? And well, the proposed solutions just move the restrictions elsewhere. Maybe it's time to rethink the entire approach?

The human body has requirements for nutrients, and those are often seriously understated by the mainstream charts. By loading up on everything - whole grains, fruit, honey, leaves, nuts, underground storage organs, beans, eggs, cheese and coconut oil - you ensure all those are satisfied (mild quantities of meat are fine in terms of health, though they come with the obvious ethical problems). All the listed restrictive diets will be lacking in one nutrient or another, leading to one deficiency disease or another (and they won't always have a diagnosis / medical name). Please remember that our physiological knowledge is not complete and might never be. Many things that are often considered "nonessential" by certain groups have proven benefits (carbohydrates, increased amounts of manganese / selenium / magnesium / calcium, vitamin C, short chain fats, fiber, spice ingredients...). There are also nutrients still being discovered, thousands of phytochemicals in plants, etc. With this approach, you also prevent possible toxicities, since you are eating a little of everything instead of a lot of one or two things. And a lot of antioxidant nutrients might buffer the possible harms of anything bad you might be ingesting in excess.

People often start a restrictive diet, seemingly get results (because they removed an offending item or just added nutrients to their previously deficient diet), but eventually encounter problems. By eating a variety of food, you ensure that you ingest everything that exists, instead of relying on a diet theory to tell you exactly what you need or don't need and in what quantities (does anyone really think they are gods when they make such pronouncements?). This is pretty uncharted territory, since most people seem to like their puritanism of avoiding animal products, grains, etc. But Matt Stone has had success (archive) with this approach in his High Everything Diet / Rehabilitative Rest and Aggressive Re-Feeding / Diet Recovery. He seemed to like significant quantities of processed food, though, which might still be fine in an otherwise adequate diet. Optimally, I might move the approach towards more natural products without becoming obsessive about it. This strategy might not be viable if you already have some disease that would necessitate restricting things; but at that point, you probably already know what you have to limit. But also don't assume that something necessarily has to be restricted, like sugar in diabetes (see the beans results). Find a way to get in as much nutrition as possible considering the ailments you have.

Back to the front page