The Cognitive Effects of Micronutrient Deficiency: Evidence from Salt Iodization in the United States

James Feyrer*, Dimitra Politi, David N. Weil
*Department of Economics, Dartmouth College, and NBER.
School of Economics, The University of Edinburgh.
Department of Economics, Brown University, and NBER.

Email: james.feyrer@dartmouth.edu

Issue date 2017 Apr.

PMCID: PMC6919660; NIHMSID: NIHMS1013198; PMID: 31853231
The publisher's version of this article is available at J Eur Econ Assoc.

Abstract

Iodine deficiency is the leading cause of preventable mental retardation in the world today. The condition, which was common in the developed world until the introduction of iodized salt in the 1920s, is connected to low iodine levels in the soil and water. We examine the impact of salt iodization on cognitive outcomes in the US by taking advantage of this natural geographic variation. Salt was iodized over a short period of time beginning in 1924. We use military data collected during WWI and WWII to compare outcomes of cohorts born before and after iodization in localities that were naturally poor and rich in iodine. We find that for the one quarter of the population most deficient in iodine this intervention raised IQ by approximately one standard deviation. Our results can explain roughly one decade’s worth of the upward trend in IQ in the US (the Flynn Effect). We also document a large increase in thyroid-related deaths following the countrywide adoption of iodized salt, which affected mostly older individuals in localities with high prevalence of iodine deficiency.

Keywords: Cognitive ability, Flynn Effect, human capital, productivity

1. Introduction

Deficiencies in the consumption of micronutrients, the vitamins and minerals required in trace amounts for proper metabolic functioning, are a major contributor to poor health in developing countries.1 Elimination of deficiencies in micronutrients such as vitamin A, zinc, iron, and iodine has been a major focus of development efforts around the world. Nutritional supplementation programs are widely seen as cost-effective interventions with potentially large effects on human capital, though their impact can be hard to evaluate, since their benefits may not be observable for several decades.2 Until the second half of the 20th century micronutrient deficiencies were a problem in currently developed countries as well. This paper focuses on the historical experience of the United States and the impact of its successful eradication of iodine deficiency.

The World Health Organization estimates that nearly 50 million people suffer some degree of mental impairment due to iodine deficiency, and that this is the leading cause of preventable mental retardation in the world. Eliminating iodine deficiency around the globe is an ongoing project, largely through the introduction of iodized salt.3 In the US iodine deficiency was a significant problem in the early part of the 20th century with rates of deficiency similar to those faced by the worst afflicted developing countries today. The United States virtually eliminated iodine deficiency with the introduction of iodized salt in 1924.4

The most important detrimental effects of iodine deficiency are on the cognitive development of fetuses in utero, and these effects are not reversible later on in life. Eradication of iodine deficiency is thus a candidate explanation for part of the Flynn Effect, the gradual rise in measured IQ that has been observed in developed countries over the course of the twentieth century. As we show below, the benefits of iodization were large, but there were also very significant health costs (several thousand premature deaths) associated with it.

We exploit the interaction of pre-existing geographic variation in iodine deficiency with a national program of salt iodization to identify the program impact. Iodine deficiency is linked directly to geography through the food and water supply. In adults, the most noticeable symptom of iodine deficiency is goiter, the enlargement of the thyroid gland. Prior to salt iodization, endemic goiter was present in regions of the US where the iodine content of the soil and water was low. In 1924 iodized salt was introduced in the United States explicitly to reduce the goiter rate. Since there are large in utero effects of iodine deficiency, we expect to see a significant difference in cognitive ability between those born before and after the introduction of iodized salt in locations with low levels of environmental iodine. Those living in high-iodine regions provide a control group.

We use two unique data sources to look at the effects of iodine deficiency eradication on cognitive ability. From World War I draft physicals we know the incidence of goiter in recruits for 151 geographic regions before the introduction of iodized salt. This provides us with a measure of spatial variation in iodine deficiency prior to treatment.5 Our outcome measure is provided by an extensive data set of men who enlisted in the Army during World War II. Upon enlistment, each recruit took the Army General Classification Test (AGCT), a forerunner to the Armed Forces Qualification Test (AFQT). The Air Force was assigned draftees with significantly higher average test scores than the Ground Forces. The probability of assignment to the Air Force rises significantly for those born in high-goiter (i.e. low-iodine) locations in the years after the introduction of iodized salt. Using information about average scores of Air and Ground Force recruits we infer a one standard deviation increase in average test scores in these regions. Our results explain roughly one decade’s worth of the Flynn Effect in the US.

In addition to the positive cognitive effects of iodizing salt we document a significant negative side effect. Chronic iodine deficiency followed by a sudden large intake of iodine can cause iodine-induced hyperthyroidism, which can be fatal. We find a large rise in thyroid-related deaths in iodine deficient regions following the introduction of iodized salt. We estimate that at least 10,000 deaths over the period 1925–1942 were the result of iodization.

The paper proceeds as follows: Section 2 provides some background on iodine deficiency disorders. Section 3 outlines the history of salt iodization in the US. Section 4 describes our data and provides some background on their collection. In Section 5 we present previously undocumented evidence on the spike of thyroid-related mortality following iodization. Section 6 explains our identification strategy and Section 7 presents our results. In Section 8 we present two robustness checks which corroborate our main findings. Section 9 puts our results in the context of related research on the effects of iodization on cognitive ability. Section 10 concludes.

2. Iodine Deficiency Disorders

Recent work has shown that the quality of maternal health and nutrition during pregnancy has persistent effects on adult health outcomes.6 Iodine is one of the “big three” micronutrients, deficiencies in which are a major source of ill health in developing countries (the other two are vitamin A and iron). Iodine deficiency is the leading cause of preventable mental retardation in the world. Nearly one third of the world’s population are at risk, in the sense that their iodine intake is considered insufficient. 241 million school-age children receive inadequate amounts of iodine in their diet (Andersson et al. (2012)).

Iodine Deficiency Disorders is a term used to describe a set of conditions that result from inadequate intake of iodine. The thyroid gland contains 70–80% of the total iodine content in the human body (Fleischer et al. (1974)). The thyroid gland uses iodine to produce thyroxin, a hormone that regulates metabolism. When there is too little iodine in the diet, the thyroid compensates for the shortage by enlarging in order to produce more thyroxin per unit of iodine. When dietary iodine is only slightly inadequate, the enlarged thyroid will be able to produce sufficient thyroxin for normal body functioning. This enlargement is known as euthyroid goiter. At lower levels of dietary iodine, the enlarged thyroid will produce inadequate thyroxin, a condition known as hypothyroid goiter, characterized by slow metabolism, lethargy, and weight gain.

Goiters due to iodine deficiency are often reversible after an increase in dietary iodine. However, in some people with this condition, increased iodine consumption results in the thyroid gland producing excessive quantities of thyroxin, resulting in hyperthyroidism: an overly accelerated metabolism, with symptoms including rapid heartbeat, weight loss, temperature elevation, nervousness, and irritability. This “iodine-induced thyrotoxicosis” is most likely to occur in individuals who have experienced long periods of iodine deficiency.7

Iodine deficiency early in pregnancy causes irreversible brain damage to the foetus. Severe iodine deficiency can result in cretinism, which is characterized by “profound mental deficiency, dwarfism, spastic dysplasia and limited hearing” (Scrimshaw (1998, p.364)). In endemic areas cretinism can affect up to 15% of the population (de Benoist et al., eds (2004)). However, as Scrimshaw points out, “even in areas where cases of cretinism due to iodine deficiency in the mother are few, the linear growth of the infant, its intellectual capacity, and certain other of its neurological functions are permanently compromised to varying degrees.”8 In other words, even if iodine deficiency does not result in cretinism, an iodine-deficient population will experience a continuum of cognitive impairment. Deficient populations typically see a leftward shift in the entire IQ distribution.9 In a meta-analysis of 18 studies Bleichrodt and Born (1994) estimate that the average IQ of iodine-deficient groups is 13.5 points lower than non-deficient groups. While widely cited, this result is based primarily on cross-sectional comparisons of observational data, which makes the identification of a causal effect questionable.

Historically, the geographical characteristics of an area determined the endemicity of iodine deficiency. Ocean water is rich in iodine, which is why endemic goiter is not observed in coastal areas. Iodine is transferred from the ocean to the soil by rain. Heavy rainfall can cause soil erosion, in which case the iodine-rich upper layers of soil are washed away. The last glacial period had the same effect: iodine-rich soil was substituted by iodine-poor soil from crystalline rocks (Koutras et al. (1980)). This explains the prevalence of endemic goiter in regions that were marked by intense glaciation, such as Switzerland and the Great Lakes region of the USA. Iodine, when present in the soil, is absorbed by plants and can reach humans through the food chain. Iodine is also present in subsurface water in some locations. Finally, deposits of mineral salt (the remains of evaporated seawater) contain iodine, but this is lost when the salt is refined.

Iodine was first isolated in 1811 by Courtois. Over a century of clinical research proved iodine’s essential role in prophylaxis against iodine deficiency disorders. Doctors and public health officials have used different methods of iodine supplementation. Salt iodization has emerged as the cheapest and widest-reaching way to protect a population from iodine deficiency.10

3. Salt Iodization in the United States

Salt iodization was the first step in the systematic fortification of food for public health reasons. It was made possible by the nearly simultaneous discovery of a widespread health problem and of its underlying cause. In the First World War draft a little more than 2.5 million draftees were examined for various physical and mental shortcomings. Statistics from these physicals showed the geographic distribution of many diseases and defects, among them goiter, across the United States (Love and Davenport (1920)).

According to the draft examinations almost 12,000 men had goiter and a third of these were judged unfit for service because the size of their neck was too big for the military tunic to be buttoned (Kelly and Snedden (1960, p.34)). Most of them came from states in the Northwest (Washington, Oregon, Idaho, Montana) and the area around the Great Lakes. In Northern Michigan, for instance, more draftees were judged unfit for service “for large and toxic goiters than for any other medical disorder” (Markel (1987, p.221)). On the other hand, goiter was rare in people coming from coastal areas.

This led to multiple surveys of goiter, which confirmed the geographical variation in the prevalence of the disease. By that time there was medical and veterinary evidence showing that goiters could be reduced by adding iodine to the diet.11 These observations prompted a public debate on whether and how to provide iodine prophylaxis to the American population. Although there was discussion of possible negative side effects in the form of hyperthyroidism, the medical consensus was that small amounts of iodine in the diet were beneficial for the vast majority of iodine-deprived populations.

Public health authorities in Michigan, one of the worst-afflicted states, held a symposium on thyroid disease in 1922. The idea of salt iodization (first implemented in Switzerland in 1922) was introduced by David Murray Cowie, M.D. as a cheap and effective means of providing iodine to all population groups, regardless of social status. The Iodized Salt Committee, chaired by Cowie, was set up to investigate the matter further. The Committee agreed upon the launch of a statewide educational campaign on goiter and its prevention through iodized salt, sponsored by the Michigan State Medical Society.

The Committee also contacted the state's salt manufacturers. The actions of the Michigan salt producers were important, because Michigan was the largest producer of salt for human consumption in the country.12 The salt producers, although convinced about the public-service character of the project, had initial qualms about its economic feasibility and profitability; it would be financially impossible to separate the salt intended for the Michigan market and then add iodine to it. Instead, the Salt Producers Association decided to launch iodized salt nationwide. Through aggressive advertising of the “new salt,” salt companies contributed to the educational campaign throughout the country (Markel (1987, p. 224)). Figure 1 shows two newspaper advertisements from this period.

Figure 1: Iodized salt advertisements

Michigan was the first state to introduce iodized salt in May 1924. Morton Salt Company, the largest producer in the country at the time, began selling iodized salt on a nationwide basis in the fall of 1924.13 At the same time public awareness of the problem, especially in those areas that were worst afflicted, was gaining momentum. Articles in newspapers and magazines around the country advocated the use of the new salt for all cooking and eating purposes, making references to successful goiter prophylaxis in Switzerland. In advertisements for iodized salt the new commodity bore the endorsement of state or national medical associations, and educational booklets were provided by the salt companies upon demand. Many newspaper sources show the generalized availability and use of iodized salt from 1924 onwards. In addition, as we show in Section 5, the rapid rise in thyroid-related mortality, not only in Michigan but throughout the country, further indicates that iodine supplementation diffused quite rapidly.

4. Data Sources

To implement our identification strategy we need data on the prevalence of iodine deficiency before 1924, as well as data on outcomes of individuals born in these areas both before and after 1924. Our primary data sources take advantage of two previously unused surveys of prime age American males in the early part of the twentieth century.

4.1. World War I: Defects Found in Drafted Men

For data on the prevalence of iodine deficiency before 1925 we use a volume entitled Defects Found in Drafted Men (henceforth referred to as Defects), published by the War Department in 1920 (Love and Davenport (1920)).14 Defects summarizes the results of all the physical exams performed on draftees during World War I for both accepted and rejected men. Data on prevalence rates per 1,000 are recorded for 271 different medical conditions. The data are regional, organized by units called sections. All but the lowest population states are broken down into multiple sections. Illinois and New York, for example, are each broken down into 8 sections. Each section is defined as a collection of counties.15 In total, Defects has data on 151 separate regions of the country.

The medical condition of interest for our study is simple goiter, which is a direct result of iodine deficiency. We use simple goiter as a proxy for underlying iodine deficiency. Simple goiter is relatively common in the data, with a population weighted average prevalence of about 5 cases per 1,000. Across states, rates range from a high of almost 27 per 1,000 in Idaho to a low of 0.25 per 1,000 in Florida. Figure 2 is a histogram of the distribution of goiter across the 151 sections in Defects. Though there are no sections with a zero rate of simple goiter, about one third of the sections have rates of less than 1 per 1,000. The fact that the data are at a finer level of aggregation than the state level is important because there is significant regional variation within the high-goiter states. For example, in the five sections in Michigan the rates reported in Defects range from over 25 in the Upper Peninsula to less than 10 in Detroit and the surrounding areas.

Figure 2: Distribution of goiter rates across World War I sections. Source: Love and Davenport (1920)

The conclusion that spatial variation in goiter rates, as documented in Defects, was caused by variation in the underlying geological characteristics across locations is supported by the fact that goiter rates correlate well with measures of the iodine content of water across localities. In a paper published in the Journal of the American Medical Association in 1924, J.F. McClendon and Joseph C. Hathaway provided measures of the iodine content of drinking water from 67 localities (some of them in the same county) across the US. These measures came from lakes, springs, rivers and wells. Their paper includes US maps with the low-iodine areas being shaded, and other maps where the high-goiter areas are shaded (their data on goiter come from the Defects book). The two shaded areas in the two maps largely overlap (McClendon (1924)). Figure 3 is a scatterplot of the log of iodine in a certain locality against the log of simple goiter in the corresponding section from Defects.

Figure 3: Goiter and iodine content of water. Source: McClendon (1924) and Love and Davenport (1920)

Figure 3 shows that goiter prevalence is a matter of geography; goiter is higher where the iodine content of water is lower. The variability in goiter across low-iodine areas arises, to some extent, from measurement issues. The goiter data come from comprehensive physical examinations of World War I recruits, whereas the data on the iodine content of water come from selected lakes, springs, rivers and wells. It is possible that, while the water source in one area was iodine-poor, neighboring areas (belonging to the same section) had access to an iodine-rich supply of water, driving the section goiter prevalence down. Another source of variability is distance from the ocean. Even if a town had a low content of iodine in its water supply, the population in the surrounding areas would have been spared if iodine were plentiful in the air. In our estimation we do not use the iodine content of water as our main measure of iodine deficiency because it is only available for a small number of counties. We do, however, use it as an alternative measure of iodine deficiency in a robustness check, and show that it supports our main results.

4.2. World War II: Army Enlistment Data

The World War II enlistment data are from the National Archives and Records Administration (NARA). They originated in punch cards produced during the enlistment process for the United States Army, and they include both volunteers and draftees. After the war these punch cards were converted to microfilm. In 1994 NARA hired the census department to scan the microfilm into a collection of over 9 million enlistment records from 1938–1946. Though not complete, these records represent the majority of enlistments into the Army during this period. Of the 1,586 rolls of microfilm, 1,374 (87%) were successfully scanned, leaving approximately 1.5 million punch cards unrecorded. The missing rolls are not sequential, and there are no indications that the records are missing in any systematic fashion. In addition to the missing rolls, several hundred thousand individual records were unreadable. As far as we know, the only other study to date using these data is Ferrie et al. (2012), which studies the effect of lead exposure on the outcomes of World War II enlistees.

Though the format of the punch cards changed somewhat over the course of the war, the coding for basic demographic information was consistent. The demographic fields are name, serial number, state and county of residence, place and date of enlistment, place and year of birth, race, education, and marital status. In addition, the particular branch of the Army that the enlistee entered is coded. One can also infer through the serial number whether the person was drafted. There are no records for individuals who entered the Navy or Marines.

The sample we have available to us is very large, and the timing of the draft is nearly perfect for our purposes. Limiting our sample to white men, we have data on over 300,000 men from each birth year between 1921 and 1927, giving us extremely good coverage on both sides of the 1924 salt iodization date. Unfortunately, the data do not include the county of birth, only the state. We therefore limit our sample to individuals whose birth state is identical to their state of residence, and we assume that the county of residence upon enlistment is the county of birth.16 This eliminates 24.1% of the sample. Further limiting our attention to white males born between 1920 and 1928 and enlisting between 1940 and 1946 leaves us with 2.27 million records.
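For concreteness, the sample restrictions just described can be expressed as a short filtering step. The sketch below is purely illustrative and assumes hypothetical column names (race, sex, birth_year, enlist_year, birth_state, residence_state) rather than the actual NARA punch-card field codes.

```python
import pandas as pd

def build_sample(records: pd.DataFrame) -> pd.DataFrame:
    """Illustrative sketch of the sample restrictions described in the text.
    Column names are hypothetical stand-ins for the NARA enlistment fields."""
    df = records.copy()
    # White men born 1920-1928 who enlisted in the Army between 1940 and 1946.
    df = df[(df["race"] == "white") & (df["sex"] == "male")]
    df = df[df["birth_year"].between(1920, 1928)]
    df = df[df["enlist_year"].between(1940, 1946)]
    # Keep men whose state of birth equals their state of residence, so the
    # county of residence at enlistment can proxy for the county of birth.
    df = df[df["birth_state"] == df["residence_state"]]
    return df
```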

4.3. Test Scores and Assignment to the Air Forces

All enlistees were given the Army General Classification Test (AGCT).17 Unfortunately, the score only appears in our data for a brief time period too early in the war to be useful for our study.18 We can, however, make some crude inferences about the test scores by examining which army branch the enlistees were assigned to.

Each test taker was assigned a grade of I, II, III, IV, or V, with class I being the highest. Skilled positions, like mechanics, tended to get class I or II enlistees, while lower-skill jobs, like cooks, tended to get class IV or V enlistees. We do not know the particular job assignment of each recruit. However, we can identify enlistees who were assigned to the Army Air Force (AAF) versus those who were assigned to the Army Ground Forces (AGF).19 Roughly 14% of all enlistees were assigned to the Air Force over the course of the war, though this proportion varied from year to year.

Table 1 summarizes our enlistment data. It gives the total sample size and the percentage of recruits going to the Air Force for each cohort and each enlistment semester. The number of enlisted men peaks in the second half of 1942 and the first half of 1943. Men born after iodization enlisted in large numbers starting the first semester of 1943. The proportion of recruits going to the Air Forces was particularly low in 1943, reflecting changes in war demands, as ground operations in Europe began in November 1942.

Table 1:

Sample size and probability of joining the AAF by birth year and enlistment semester

Semester of                                          Year of birth
enlistment                   1920     1921     1922     1923     1924     1925     1926     1927     1928     Total
Jan-June 1940  % AAF        19.05    12.90     6.67        –        –        –        –        –        –     13.43
               Sample size     42       62       30        –        –        –        –        –        –       134
July-Dec 1940  % AAF        14.64    12.93    13.64        –        –        –        –        –        –     13.70
               Sample size  24580    28167    20492        –        –        –        –        –        –     73239
Jan-June 1941  % AAF        19.13    15.78    20.58    35.94        –        –        –        –        –     19.39
               Sample size  19593    17651    15445     3052        –        –        –        –        –     55741
July-Dec 1941  % AAF        37.70    68.03    60.74    56.87        –        –        –        –        –     46.52
               Sample size  33757     6628     6171     6511        –        –        –        –        –     53067
Jan-June 1942  % AAF        16.45    33.07    45.40    39.76    30.34    21.69        –        –        –     25.35
               Sample size  73433    32665    11995    10164     2864       83        –        –        –    131204
July-Dec 1942  % AAF         7.17     6.61    16.32    59.17    59.91    24.44        –        –        –     17.22
               Sample size 105570   182286   148683    44544    29485       90        –        –        –    510658
Jan-June 1943  % AAF         5.60     8.65     4.26     1.67     1.38     1.60     8.51        –        –      2.49
               Sample size  19894    22683    59447   152767   141765    28716       94        –        –    425366
July-Dec 1943  % AAF         7.00     8.44     8.08     5.69     4.30     9.04    31.25        –        –      7.53
               Sample size  10404    10382    10175    14951    27783    70236       64        –        –    143995
Jan-June 1944  % AAF         3.74     5.66     6.40     5.23     4.39    18.11    26.33        –        –     14.32
               Sample size  14374    10830     9159     9563    10969    40914    29866        –        –    125675
July-Dec 1944  % AAF         1.06     1.55     1.57     0.79     0.31     1.45    13.02        –        –      7.63
               Sample size  11368     9021     6879     8652    10938    15779    75560        –        –    138197
Jan-June 1945  % AAF         0.91     1.15     0.87     0.26     0.13     0.21    19.70     7.24    10.00     11.26
               Sample size   4734     4255     4032     4967     6724     8227    56122    29750       50    118861
July-Dec 1945  % AAF        28.10    25.59    22.52    19.43    14.94    10.63    11.51    12.18    37.18     15.54
               Sample size   8694     9598     9346    10773    13091    15767    40479    81021     9628    198397
Jan-June 1946  % AAF        37.42    32.82    29.79    25.00    19.12    14.20    12.09    11.05    27.09     19.11
               Sample size   5320     6553     6669     7583     9229    11306    19160    78868    65417    210105
July-Dec 1946  % AAF        40.69    39.51    38.92    36.70    30.73    21.53    16.78    10.43    18.82     19.25
               Sample size   1826     1987     2297     2455     3114     3604     5000    20148    49459     89890
Total          % AAF        14.23    12.49    15.60    15.97    10.50     9.53    16.17    10.89    24.58     14.00
               Sample size 333589   342768   310820   275982   255962   194722   226345   209787   124554   2274529

4.4. The Battle for the Best Enlistees

The Army Air Force (AAF), which was still part of the Army during World War II, had a large proportion of jobs that required skilled recruits relative to the Army Ground Forces (AGF). Throughout the war, the AAF pushed to have a large proportion of the more highly skilled recruits assigned to them. In February 1942, the AAF successfully got the “75% rule” put into place. Under this rule, 75% of the men assigned to the AAF were to have scored above 100 (the median score) on the AGCT.20 From this, we can infer that individuals assigned to the AAF during this period have, on average, higher test scores than those assigned to the AGF.

Unsurprisingly, the AGF was not pleased with this arrangement, and this rule was not in place for the entire war. Though lower skilled recruits could easily be used in the infantry, the AGF was concerned about having a supply of recruits who could become high-quality combat leaders. The AGF successfully lobbied the War Department to change the rule on August 1, 1942 so that the proportion of above average men received by the AAF was reduced to 55%.21

The AAF fought back against this change by using a second test, the mechanical aptitude (MA) test, as a screen for AAF recruits. At first, they simply requested that a higher proportion of the men assigned to them score above average on the MA test. This was later formalized. From December 1942 until June 1943, the AAF was supposed to be assigned 55% of their new recruits from the group with scores greater than the mean on both the AGCT and the MA tests. Combining the two tests was obviously more restrictive than using just one test: in fact, only 37.5% of all recruits were above average on both tests.22 This rule was allowed to expire, but it is clear that the AAF continued to receive higher-quality recruits. For example, among those inducted in 1943, 41.3% of soldiers assigned to the AAF were class I or II, a higher percentage than for Ground Combat Arms (29.7%) or the Services (36.5%) (Palmer et al. (1948)).
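As an aside, the 37.5% figure has a simple interpretation: if AGCT and MA scores were jointly normal with a common median, the share of recruits above average on both tests pins down the correlation between the two scores. The sketch below is our own back-of-the-envelope illustration under that bivariate-normal assumption, not a calculation from the paper.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import multivariate_normal

TARGET = 0.375  # share of recruits above the mean on both the AGCT and MA tests

def share_above_both(rho: float) -> float:
    # P(X > 0, Y > 0) for a standardized bivariate normal with correlation rho;
    # by symmetry this equals the CDF evaluated at the origin.
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([0.0, 0.0])

implied_rho = brentq(lambda r: share_above_both(r) - TARGET, -0.99, 0.99)
print(f"implied AGCT-MA correlation: {implied_rho:.2f}")  # about 0.71
# Closed-form check: P(both above the mean) = 1/4 + arcsin(rho) / (2*pi)
print(0.25 + np.arcsin(implied_rho) / (2 * np.pi))        # about 0.375
```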

Figure 4 is a graph from the War Department showing the percentage of recruits assigned to the AAF with above average AGCT scores during the period of time that these rule changes were occurring. During the early part of the graph, the 75% rule was clearly in operation. At the end of July, the abolition of the 75% rule can be seen, with the AAF only getting 55% of recruits from the above average group. By September, the AAF had almost managed to return to the old proportions via the 55% mandate on both the AGCT and MA tests.

Figure 4: Percentage of Air Forces recruits with above average AGCT scores. Source: US Air Force Historical Study #76: Classification and Assignment of Enlisted Men in the Air Arm, 1917–1945.

Additional evidence on the positive selection of men with high cognitive function into the AAF comes from an anomaly in the WWII enlistment data. As mentioned above, all new enlistees in the Army took the AGCT, but the test result was not generally recorded on the punch cards that are the primary source for the enlistment data. However, from March to May 1943, AGCT scores were recorded in the field marked “weight” for almost all recruits.23 The documentation suggests that the AGCT score was coded in this field for some part of the war, and the actual distribution of values in the “weight” field confirms that this is true for a subset of observations. For enlistments through 1942, the weight field has a mean of 147 and a standard deviation of 28. For the period of March through May of 1943, the mean is roughly 106 and the standard deviation is about 80. Once we eliminate values below 20 or over 180, which clearly do not correspond to test scores (3.7% of observations), the mean becomes 98 and the standard deviation 21. These numbers are very close to the intended mean and standard deviation of the AGCT.24
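The screening of the “weight” field can be reproduced in a few lines. This is a sketch under assumed column names (enlist_year, enlist_month, weight), not the exact processing used for the paper.

```python
import pandas as pd

def recover_agct_scores(records: pd.DataFrame) -> pd.Series:
    """Treat the 'weight' field as an AGCT score for March-May 1943 enlistees,
    dropping values below 20 or above 180, which cannot be test scores."""
    mask = (records["enlist_year"] == 1943) & records["enlist_month"].between(3, 5)
    scores = records.loc[mask, "weight"]
    return scores[(scores >= 20) & (scores <= 180)]

# Usage sketch: the cleaned scores should have a mean near 98 and a standard
# deviation near 21, close to the intended AGCT distribution.
# scores = recover_agct_scores(enlistment_df)
# print(scores.mean(), scores.std())
```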

Figure 5 shows a histogram of AGCT scores for recruits entering the Air Forces alongside a histogram for all other recruits.25 For the non-Air Forces group there are 175,731 reported scores with a mean of 97.7 and a standard deviation of 21.1. This distribution closely matches the intended normal distribution of the AGCT (mean of 100 and standard deviation of 20). The second panel shows the histogram of recruits entering the AAF in this period. The mean of 124.1 suggests that the AAF was, at this point, receiving substantially better recruits than the ground forces. However, as seen in Table 1, this was also a period in which very few recruits entered the Air Forces. This histogram is based on only 541 AAF recruits.26

Figure 5: AGCT scores for AAF versus all other enlistees. Source: National Archives and Records Administration (2002)

The preferential treatment of the AAF lasted until the end of 1943, when the “infantry crisis” broke out. The need for high-quality ground forces grew more acute in 1944 and lasted until the end of the war, while, at the same time, air operations were not as important as in previous years. This meant that priorities between the AAF and the AGF shifted in favor of the latter, and the Army classification system was revised to allow better-quality soldiers to join the Ground Forces.27

Figure 6 shows selectivity and demand for Air Force recruits monthly over the course of the war. The upper plot in Figure 6 shows the percentage of recruits with a high-school degree in the Air Force and in the Ground Forces. The lower plot shows the percentage of all recruits being assigned to the Air Force. The Air Force received a significantly higher proportion of high-school graduates until November 1945, though the degree of selectivity varies. There is a spike in the proportion of high-school graduates in the Air Forces around February 1942, when the 75% rule was put into effect. The temporary decrease in the second half of 1942 corresponds to the withdrawal of the 75% rule. In late 1942 and early 1943, when the Mechanical Aptitude Test was put into use, the Air Force returned to the preferential-treatment status it enjoyed before.

Figure 6: Air Force demand and selectivity in WWII. Source: National Archives and Records Administration (2002)

The lower plot shows demand for Air Force recruits separately for high- and low-goiter sections. Fluctuations in these curves are driven by war events that dictated specific Army needs. For example, at the beginning of the war the US was engaged primarily in air combat, which drove up the demand for Air Force recruits. Once ground operations got underway at the end of 1942, demand shifted toward recruits for the Ground Forces, with a corresponding decrease in the numbers assigned to the Air Force in 1943.

Table 2 gives the proportion of high-school graduates in each branch by birth year. Later cohorts were less likely to have a high-school degree than the earlier cohorts because as the war progressed the demand for manpower led to the drafting of ever younger men. There is clear positive selection into the Air Force for cohorts born before 1927. About half of the 1927 and most of the 1928 birth cohorts enlisted after positive selection for the Air Force had ended. As we will show, the fact that there was no positive selection into the Air Force for the 1928 birth cohort is consistent with our empirical results. For our purposes, positive selection into the Air Force for birth cohorts from 1920 until 1926 gives us good coverage of pre- and post-treatment enlistees.

Table 2:

Proportion of high-school graduates by birth year and branch.

High-school graduation rate (%)
Birth year    Air Force    Ground Forces    Total
1920              76.1            42.9       47.6
1921              76.1            41.4       45.7
1922              70.3            39.8       44.6
1923              60.6            43.7       46.4
1924              52.2            39.5       40.8
1925              56.9            35.9       37.9
1926              51.0            36.8       39.1
1927              33.3            38.5       37.9
1928              26.4            40.7       37.2
Total             59.2            40.2       42.9

We take advantage of selection into the Air Forces in two ways. First, if iodine deficiency affects cognitive ability we should expect a jump in the relative rate of assignment to the AAF for recruits born after 1924 in those counties where goiter rates were high in the Defects data. Second, by exploiting the normal distribution of the AGCT test, we can interpret our estimation results in terms of IQ gain for those regions where iodine deficiency was eradicated.

5. The Rise in Thyroid-Related Deaths

As discussed earlier, the treatment designed to help goiter sufferers sometimes ended up killing them. In Europe, the potential negative side-effects of iodine treatment had been discussed as early as the nineteenth century (McClure (1934) and Carpenter (2005)) and made universal iodine prophylaxis a controversial public health measure. In this Section, we document a large increase in deaths linked to the adoption of iodized salt, which we have not seen previously discussed in the literature.

Figure 7 shows the annual rate of deaths in the US over the period 1910–1960 due to exophthalmic goiter, which accounted for the overwhelming majority of deaths due to thyroid disease over this period.28 There is an extremely large rise in the death rate at the time of iodization. Between 1921 and 1926 the death rate nearly doubled from 2.1 to 4 per 100,000. Deaths due to goiter remained elevated for at least a decade. There was also a large gender disparity in death rates. In 1926 death rates were over 6 times as high for women as for men.29 The population of the United States in 1926 was 117 million, and so the rise of approximately two deaths per 100,000 people represented an extra 2,340 deaths in that year. Over the period 1925–1942 there appear to be at least 10,000 excess deaths that resulted from the introduction of iodized salt. We have found little discussion in the literature of what appears to be a high short-term price the country paid for long-run benefits resulting from this public health intervention.30
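The arithmetic behind the figure of 2,340 extra deaths is simply the excess death rate applied to the 1926 population; a minimal sketch:

```python
# Roughly two extra exophthalmic-goiter deaths per 100,000 in 1926,
# applied to a US population of about 117 million.
excess_rate_per_100k = 2.0
population_1926 = 117_000_000

excess_deaths_1926 = excess_rate_per_100k / 100_000 * population_1926
print(excess_deaths_1926)  # 2340.0
# Cumulating the annual excess over 1925-1942 yields the paper's estimate of
# at least 10,000 deaths attributable to iodization.
```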

Figure 7: US deaths from exophthalmic goiter, 1910–1960. Source: Grove and Hetzel (1968)

The medical literature suggests that iodine-induced hyperthyroidism is most common among those with long-standing iodine deficiency. Consistent with this, the rise in the death rate was highest, and persisted the longest, among older age groups. Deaths in the 25–34 age category less than doubled from 1921 to 1926, and had fallen below their 1921 level by 1935. In the 65–74 age category, deaths more than tripled between 1921 and 1926, and were still three times their 1921 level in 1935. The link between iodine deficiency and the rise in deaths at the time of iodization is also apparent looking across states. Figure 8 plots goiter deaths over the period of salt iodization separately for high- and low-goiter states. We label a state as high-goiter if its goiter prevalence in Defects is higher than 5 cases per 1,000.31 The trends between the two state groups are similar except for the period from 1925 to 1932. There is a sharp rise in deaths in high-goiter states in 1925 and 1926, which we do not observe for low-goiter states. As expected, large increases in mortality all took place in states that had high levels of goiter due to iodine deficiency. In addition, the rise in goiter deaths occurs sharply after 1924, which is when salt iodization began in Michigan. The evidence from mortality rates suggests, therefore, that iodization spread quickly around the country after its introduction in Michigan in 1924.

Figure 8: Deaths from exophthalmic goiter, by state group, 1920–1940. Source: U.S. Department of Commerce (various years)

6. Identification

Our identification strategy relies on spatial variation in the extent of pre-existing iodine deficiency along with temporal variation arising from the introduction and rapid spread of iodized salt. High-goiter areas provide the treatment group and low-goiter areas the control group. Years before 1924 are pre-treatment and years after 1924 are post-treatment.

Our baseline regression specification is the following Linear Probability Model:

$$y_{iste} = \alpha + \sum_{t \neq 1924} \beta_t \left[ \mathrm{goiter}_s \times I(t = \mathrm{birth\ year})_t \right] + \mathrm{controls}_{ste} + \epsilon_{iste}$$

where $y_{iste}$ is an indicator variable equal to 1 if an individual $i$ born in year $t$ in locality $s$ and joining the Army at time $e$ was assigned to the Air Force, and equal to 0 otherwise.32

The goiter rate interacted with a set of birth year dummies provides the main coefficients of interest. We choose 1924 as the excluded category, so that the coefficients of interest correspond to differences relative to the 1924 cohort. Iodized salt first became available in May 1924, and take-up, while rapid, was not instantaneous. Field et al. (2009) provide evidence that the first trimester of gestation is the most relevant period for the cognitive impact of iodine deficiency. Thus the majority of those born in 1924 would not have been treated at all. The pattern of coefficients on the interaction of goiter and birth year dummies will show how the relationship between the geologically determined level of iodine deficiency and cognitive ability changes over time. The iodization of salt in 1924 should make these coefficients significantly larger for the later cohorts. Note, however, that we do not impose any particular pattern in the coefficients of interest; we let the data tell the story of whether and how the probability of enlisting in the Air Force changed for cohorts born in affected areas, and we check whether that matches the introduction of iodized salt.

In all the regressions we include a full set of Defects section fixed effects, as well as a set of state-specific trends. The latter control for gradual changes at the state level, which might affect each cohort's probability of enlisting in the Air Force. In our first specification, we also include a full set of birth year fixed effects interacted with enlistment year indicator variables, to control for the changing demands of the war and the need for Air Force recruits. The interaction of birth year and enlistment year fixed effects also implicitly provides a control for enlistment age. In another specification, we include full sets of both birth year fixed effects and enlistment month fixed effects.

In our final sample we include all white males born between 1920 and 1928, who joined the Army between January 1940 and December 1946, who reside in their state of birth, and who were between 17 and 26 years old when they enlisted.33 Since our identification relies on variation within a given birth - enlistment year cohort, we only use birth year - enlistment year cells with more than 100 observations.34 Finally, because our theory relies on the Air Force getting preferential treatment, for most of the analysis we focus on the enlistment period up to November 1945, which is when positive selection for the Air Force ends.

We use two-way clustering for standard errors, at the Defects section level and the birth year level. To corroborate our findings, we conduct several robustness checks: first, we check that our estimates are not confounded by changes in demand for Air Force recruits. The enlistment year - birth year interactions should control for changing demand conditions, but we also show that our results hold for different parts of the distribution of demand. Second, we use an alternative measure of underlying iodine deficiency, the iodine content of drinking water, which is available for a few locations. The estimates from this check are consistent with the results using goiter as our measure of iodine deficiency. We also did a falsification exercise using all the other diseases that were measured at the section level in Defects, to show that our pattern of coefficients is not generally replicated with diseases that were not affected by iodization. Results from this falsification exercise can be found in an online Appendix.35
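As an illustration, the baseline specification could be estimated along the following lines. This is a sketch, not the authors' actual estimation code: the column names (aaf, goiter_rate, birth_year, enlist_year, section, state) are hypothetical, and the two-way clustering relies on statsmodels accepting a two-column groups array (the Cameron-Gelbach-Miller approach). On the full two-million-record sample, a routine that absorbs the fixed effects rather than building explicit dummies would be more practical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def baseline_lpm(df: pd.DataFrame):
    """Sketch of the baseline linear probability model: AAF assignment on
    goiter x birth-year interactions (1924 omitted), with section fixed
    effects, state-specific trends, and birth-year x enlistment-year fixed
    effects."""
    df = df.copy()
    interactions = []
    for t in sorted(df["birth_year"].unique()):
        if t == 1924:                      # 1924 is the excluded category
            continue
        name = f"goiter_x_{t}"
        df[name] = df["goiter_rate"] * (df["birth_year"] == t)
        interactions.append(name)
    df["trend"] = df["birth_year"] - 1920  # running variable for state trends

    formula = (
        "aaf ~ " + " + ".join(interactions)
        + " + C(section) + C(state):trend + C(birth_year):C(enlist_year)"
    )
    model = smf.ols(formula, data=df)

    # Two-way clustered standard errors at the section and birth-year levels.
    groups = np.column_stack([
        pd.factorize(df["section"])[0],
        pd.factorize(df["birth_year"])[0],
    ])
    return model.fit(cov_type="cluster", cov_kwds={"groups": groups})

# result = baseline_lpm(sample_df)   # sample_df: hypothetical analysis sample
# print(result.params.filter(like="goiter_x_"))
```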

7. Results

Figure 9 is a first graphical preview of our results. We plot the probability of joining the Air Force for each cohort of recruits, by high-goiter and low-goiter group, according to the goiter level in their section of birth. The 95% confidence band around the means is very small, because our sample is large. The high-goiter group contains Defects sections which are at the top 25% of the distribution, with a cutoff of 5.4 goiter cases per 1,000. Figure 9 shows a jump in the probability of joining the Air Forces for the 1925 cohort coming from a high-goiter area relative to the same cohort coming from a low-goiter area. The jump is even more pronounced for the 1926 cohort. Note that half of the 1927 cohort and all of the 1928 cohort only enlisted after the Air Force stopped receiving preferential treatment among Army commands.

Figure 9: Probability of joining the Air Forces. Source: National Archives and Records Administration (2002) and Love and Davenport (1920)

Figure 10 is a regression-adjusted graphical presentation of our results. Figure 10 plots the average residuals for each cohort-goiter group combination, after regressing an Air Force indicator variable on enlistment month fixed effects for each birth cohort separately. The 95% confidence band is, again, very narrow. Figure 10 accounts for changing demand conditions over the course of the war, and it shows that those cohorts born after iodization in a previously high-goiter area had a sudden increase in their probability of joining the Air Force (as long as they enlisted when there was positive selection into the Air Force), as opposed to those coming from low-goiter areas. Figure 10 also shows evidence of negative selection into the Air Force for the 1928 cohort, who enlisted during the final phase of the war.

Figure 10: Residual probability of joining the Air Forces. Source: National Archives and Records Administration (2002) and Love and Davenport (1920)

Table 3 displays estimates from a linear probability model of assignment to the Air Force. The dependent variable is an indicator switching on if the individual entered the Army Air Force (AAF). The regressors of interest are the interactions of birth year dummy variables with a continuous measure of goiter in the section where the recruit was born. The excluded year is 1924, which is when iodized salt was introduced. The coefficients correspond to changes in percentage points in the probability of joining the Air Force. Columns (1) and (2) of Table 3 use all enlistment months in the estimation. In columns (3) and (4) we focus on the positive selection period, which ends in November 1945. Our theory relies on the Air Force getting preferential treatment in the cognitive ability of its enlistees, so we focus on this enlistment period in all other specifications. Columns (1) and (3) include interactions of birth year and enlistment year fixed effects. Columns (2) and (4) include birth year fixed effects and enlistment month fixed effects as controls.

Table 3:

Baseline results: Probability of enlisting in the Air Force

                              (1)              (2)              (3)              (4)
                        All enlistment   All enlistment      Positive         Positive
                            months           months       selection months  selection months
Goiter X Born in 1920      −0.318           −0.312           −0.0229           0.00668
                           (0.225)          (0.233)          (0.184)          (0.173)
Goiter X Born in 1921      −0.156           −0.106            0.0491           0.113
                           (0.156)          (0.162)          (0.137)          (0.124)
Goiter X Born in 1922      −0.134           −0.122           −0.0352          −0.0187
                           (0.0972)         (0.128)          (0.0707)         (0.0641)
Goiter X Born in 1923      −0.167***        −0.197***        −0.135**         −0.169*
                           (0.0520)         (0.0685)         (0.0533)         (0.0934)
Goiter X Born in 1925       0.509***         0.468***         0.516***         0.494***
                           (0.168)          (0.127)          (0.158)          (0.124)
Goiter X Born in 1926       0.819**          0.771**          0.726***         0.693***
                           (0.323)          (0.324)          (0.259)          (0.243)
Goiter X Born in 1927       0.462**          0.394**          0.491**          0.441**
                           (0.233)          (0.198)          (0.209)          (0.181)
Goiter X Born in 1928      −0.0501          −0.152           −0.370           −0.518
                           (0.415)          (0.440)          (0.399)          (0.337)
Constant                  149.9***        −146.1             77.03***        107.2
                          (10.80)         (2,437)           (16.81)         (2,516)
Observations            2,274,529        2,274,529        1,935,444        1,935,444
R-squared                   0.131            0.134            0.153            0.167
Birth X Enlist. year FE     YES              NO               YES              NO
Section FE                  YES              YES              YES              YES
State trends                YES              YES              YES              YES
Birth year FE               NO               YES              NO               YES
Enlist. month FE            NO               YES              NO               YES

Notes: Robust standard errors in parentheses: *** p<0.01, ** p<0.05, * p<0.1. Coefficients correspond to changes in percentage points. Disturbances are clustered at the section and birth year levels. Columns (1) and (2) include enlistment months January 1940 to December 1946. Columns (3) and (4) include enlistment months January 1940 to November 1945.

The pattern of coefficients suggests a clear break before and after 1924. Cohorts born after 1924 in high-goiter sections are more likely to join the Air Force compared to other recruits. The 1923 cohort, which was the last cohort born before iodization in high-goiter areas, was less likely to be assigned to the Air Force compared to the cohorts that would have enlisted at the same time. The positive effect disappears for the 1928 cohort. This is consistent with the fact that the 1928 cohort enlisted when positive selection into the Air Force was largely over, as can be seen in Figure 6.

The effects that we estimate are large in high-goiter areas. The coefficients for 1925, 1926, and 1927 are between 0.4 and 0.8 percentage points. The most affected areas in our sample have goiter rates of roughly 30 cases per 1,000. Multiplying the coefficients by 30 indicates that men born in the most affected areas after 1924 saw a 12–24 percentage point increase in the likelihood of joining the AAF.

In Table 4 we run a specification similar to column (3) of Table 3, but instead of using a continuous measure of goiter, we separate the sample into high- and low-goiter sections, and include interactions of birth year dummy variables with a high-goiter indicator as our regressors of interest. We define high-goiter to correspond to the top 20% (column (1)), the top 25% (column (2)) and the top 30% (column (3)) of the population-weighted goiter distribution.36

Table 4:

Probability of enlisting in the Air Force for high-goiter sections

                                  (1)                   (2)                   (3)
                           top 20th percentile   top 25th percentile   top 30th percentile
High goiter X Born in 1920      1.602                −0.0967                0.905
                               (2.249)               (2.787)              (2.781)
High goiter X Born in 1921      2.249                 1.432                2.784
                               (1.680)               (2.066)              (2.076)
High goiter X Born in 1922      0.0927                0.185                0.467
                               (1.268)               (1.606)              (1.619)
High goiter X Born in 1923     −0.930*               −1.462**             −0.908
                               (0.496)               (0.680)              (0.661)
High goiter X Born in 1925      3.796***              5.580***             5.267***
                               (1.223)               (1.802)              (1.582)
High goiter X Born in 1926      6.886***             10.00***              8.581***
                               (1.528)               (2.421)              (2.093)
High goiter X Born in 1927      4.421***              6.698***             3.507*
                               (1.654)               (2.151)              (2.071)
High goiter X Born in 1928     −4.181                −7.360               −7.568
                               (2.904)               (5.431)              (5.276)
Constant                       59.24***              67.71***             54.88***
                              (12.35)               (14.97)              (15.95)
Observations                1,935,444             1,935,444            1,935,444
R-squared                       0.153                 0.154                0.154
Birth X Enlist. year FE         YES                   YES                  YES
Section FE                      YES                   YES                  YES
State trends                    YES                   YES                  YES

Notes: Robust standard errors in parentheses: *** p<0.01, ** p<0.05, * p<0.1. Coefficients correspond to changes in percentage points in the probability of enlisting in the Air Force for enlistment months January 1940 to November 1945. Disturbances are clustered at the section and birth year levels. Column (1) treats the top 20% of the population-weighted goiter distribution as high-goiter. Column (2) treats the top 25% of the population-weighted goiter distribution as high-goiter. Column (3) treats the top 30% of the population-weighted goiter distribution as high-goiter.

The results of Table 4 echo the earlier results, but they are easier to interpret. Depending on how we define “high-goiter” and which cohort we are looking at, individuals born post-iodization in high-goiter areas see a 3.8–10 percentage point increase in the probability that they enter the Air Forces compared to earlier cohorts. Given that the average rate of assignment to the Air Force is roughly 14% for the entire sample, this is a large effect.

In summary, the pattern of coefficients from the linear probability model suggests a big, sudden increase in the probability of assignment to the Air Force for later cohorts born in high-goiter areas. The timing of the change corresponds to the introduction of iodized salt, and is consistent with cognitive gains in utero for the treated populations.

8. Robustness checks

8.1. Changes in Demand for Air Force Recruits

In our main specification, the birth and enlistment cohort control variables account for changes in the overall demand for Air Force recruits over time, which, as shown in Figure 6, were dictated by war circumstances. In this Section, we check directly if our main results are affected by overall demand for manpower in the Air Force, by cutting the sample across different demand periods.

In Table 5 we rerun the linear probability model in column (3) of Table 3, limiting the sample by the proportion of recruits assigned to the Air Force in a particular month. Columns (1) and (2) look at enlistment months when demand was above and below the median.37 Columns (3) and (4) omit periods of very high and very low demand.38

Table 5:

Robustness check: Probability of enlisting in the Air Force in different demand periods

                              (1)               (2)               (3)                 (4)
                         Above median      Below median      10th–90th pctile    25th–75th pctile
                         demand for AAF    demand for AAF    demand for AAF      demand for AAF
Goiter X Born in 1920      −0.0290            0.284            −0.0140             −0.0999
                           (0.227)           (0.269)           (0.182)             (0.140)
Goiter X Born in 1921       0.170             0.237             0.0227             −0.102
                           (0.173)           (0.199)           (0.123)             (0.0768)
Goiter X Born in 1922       0.00252           0.161             1.33e–05           −0.186***
                           (0.108)           (0.107)           (0.0551)            (0.0653)
Goiter X Born in 1923      −0.0987           −0.0339           −0.0928*            −0.0903
                           (0.0831)          (0.0701)          (0.0527)            (0.0661)
Goiter X Born in 1925       0.841***          0.221***          0.466***            0.385***
                           (0.273)           (0.0731)          (0.161)             (0.130)
Goiter X Born in 1926       0.935***          0.472             0.628**             0.458*
                           (0.344)           (0.327)           (0.266)             (0.241)
Goiter X Born in 1927       0.736**           0.116             0.392**             0.141
                           (0.308)           (0.198)           (0.176)             (0.161)
Goiter X Born in 1928       0.0709           −0.566            −0.613              −0.855*
                           (0.431)           (0.423)           (0.426)             (0.504)
Constant                  176.1***           91.02             16.60***           −11.89
                          (25.65)          (21,243)            (5.164)             (9.083)
Observations              990,063           945,381          1,429,928            902,809
R-squared                   0.135             0.091             0.139               0.126
Birth X Enlist. year FE     YES               YES               YES                 YES
Section FE                  YES               YES               YES                 YES
State trends                YES               YES               YES                 YES

Notes: Robust standard errors in parentheses: *** p<0.01, ** p<0.05, * p<0.1. Coefficients correspond to percentage point changes in the probability of enlisting in the Air Force. Disturbances are clustered at the section and birth year levels. Demand for Air Force recruits is defined as the proportion of all recruits who enlist in the Air Force in a given enlistment month, for the enlistment period January 1940 to November 1945. Column (1) (column (2)) includes enlistment months above (below) the median of the population-weighted distribution of demand for Air Force recruits. Column (3) (column (4)) includes enlistment months within the 10th–90th (25th–75th) percentiles of the population-weighted distribution of demand for Air Force recruits.

Our main results are largely unaffected when we cut the sample across different parts of the distribution of demand for Air Force recruits. The estimates for periods of below-median demand are smaller than the estimates for periods of above-median demand but the pattern of coefficients is very similar to our baseline specifications. The estimates in columns (3) and (4) show that our results are not being driven by periods of exceptionally high or low demand for Air Force recruits. The fact that coefficients for the 1927 cohort are not always robust to changes in demand is not surprising, given that only half of the 1927 cohort enlisted when there was positive selection into the Air Force in the first place.

8.2. Iodine Content of Drinking Water

The statistics compiled during the WWI draft and summarized in Defects provide a comprehensive and detailed survey of goiter prevalence throughout the US. To our knowledge, there is no higher-quality dataset on iodine deficiency across the US prior to the introduction of iodized salt. That said, goiter is not the only measure of iodine deficiency. In this Section we use the iodine content of drinking water in a given location as another measure of underlying iodine deficiency. Our data come from McClendon (1924), and they were collected from lakes, springs, rivers and wells across 67 localities in the US (see discussion in Section 4.1). These data are not as comprehensive as the goiter data, and they suffer from measurement issues, since the iodine content of a single source of drinking water might not be representative of a whole county's access to iodine. They can serve, however, as a robustness check of our main results. Figure 3 shows that, as expected, the iodine content data correlate negatively with the goiter data from Defects.

We match the localities for which we have iodine content data to the county in which they belong. Some counties had more than one measure of iodine content; in those cases we calculate the average for the county. We end up with data on the iodine content of drinking water in 59 counties across 30 states. We then merge the iodine content data with our WWII enlistment data, and run a linear probability model similar to the one in column (3) of Table 3, using interactions of birth year dummy variables with the logarithm of iodine content in a given county. Our sample size is much smaller since we only have data from 59 counties; we lose almost 90% of observations. Instead of section fixed effects, we include county fixed effects. Similarly to column (3) of Table 3, we include birth year fixed effects interacted with enlistment year fixed effects, and we cluster standard errors at both the county and the birth year levels. We include state-specific trends, but they are only identified for the 14 states with multiple iodine content observations. In addition, only 7 of those states have more than two county observations. This is why we also estimate the model without state trends.
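The county-level measure can be built with a simple aggregation and merge; a minimal sketch, again with hypothetical column names (county_fips, iodine_content):

```python
import numpy as np
import pandas as pd

def county_log_iodine(water_samples: pd.DataFrame) -> pd.DataFrame:
    """Average multiple water-sample readings within a county and take logs.
    Column names are hypothetical stand-ins for the McClendon (1924) data."""
    county = (
        water_samples.groupby("county_fips", as_index=False)["iodine_content"].mean()
    )
    county["log_iodine"] = np.log(county["iodine_content"])
    return county

# merged = enlistment_df.merge(county_log_iodine(water_df), on="county_fips")
```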

Table 6 presents estimation results. Column (1) includes state-specific trends, whereas column (2) omits them. We generally do not have enough power to get statistically significant results (except for the 1925 cohort in column (2)), but the pattern of coefficients matches our earlier results. Cohorts born in high-iodine counties after the introduction of iodized salt have a lower probability of enlisting in the Air Force. The relationship breaks down for the 1928 cohort, similarly to our baseline results, reflecting the end of preferential treatment for the Air Force. The log of iodine ranges from 0 to 9.8 in our data, so going from the lowest to the highest iodine increases the probability of enlisting in the Air Force by 12.8 percentage points (using the point estimate for the 1925 cohort in column (2)). This estimate is very similar to the lower bound suggested by our baseline results.

Table 6:

Robustness check: Probability of enlisting in the Air Force using the log of iodine content of drinking water in 59 counties

                                      (1)           (2)
                                  Air Force     Air Force

log iodine × Born in 1920          −0.964         0.581
                                   (0.831)       (0.538)
log iodine × Born in 1921          −0.771         0.178
                                   (0.718)       (0.574)
log iodine × Born in 1922          −0.398        −0.0141
                                   (0.585)       (0.576)
log iodine × Born in 1923          −0.378        −0.315
                                   (0.234)       (0.251)
log iodine × Born in 1925          −1.454        −1.373*
                                   (0.978)       (0.725)
log iodine × Born in 1926          −1.830        −2.720
                                   (1.426)       (2.193)
log iodine × Born in 1927          −1.048*       −1.776
                                   (0.563)       (1.276)
log iodine × Born in 1928           3.342***      1.206
                                   (1.239)       (0.875)
Constant                          −95.50***      41.12***
                                  (24.50)        (5.993)

Observations                      224,022       224,022
R-squared                           0.175         0.158
Birth year × Enlistment year FE       YES           YES
County FE                             YES           YES
State trends                          YES            NO

Notes: Robust standard errors in parentheses: *** p<0.01, ** p<0.05, * p<0.1. Coefficients correspond to percentage point changes in the probability of enlisting in the Air Force during the enlistment period January 1940 to November 1945. Standard errors are clustered at the county and birth year levels.

9. Interpretation of Coefficients

In this Section we translate the results of our regressions in Section 7 into a measure of the implied increase in intelligence in the population following iodine supplementation. To do so, we construct a simple model of selection into the Army Air Forces. We write the model in terms of IQ scores, which are a more conventional measure than the AGCT scores that were actually used in the selection of recruits.

Consider a pool of recruits composed of two populations. A fraction (1 − ϕ) of recruits are from regions in which there is no iodine deficiency. Let their intelligence be represented by a random variable IQ that is distributed normally with a mean of 100 and a standard deviation of 15. A fraction ϕ of recruits are from iodine-deficient regions. Prior to treatment, their intelligence is given by the random variable $IQ_d$, distributed normally with a mean of X and a standard deviation of 15. The average level of IQ in the population is therefore $\overline{IQ} = \phi X + (1-\phi)\cdot 100$. After treatment, the two distributions are assumed to be identical.

As discussed above, a specific fraction of AAF recruits scored above the mean on the AGCT. Given this, we wish to examine how the probability of entering the AAF differs between the two populations before and after treatment. Let ψ be the fraction of recruits entering the AAF that had AGCT scores above the mean, and let θ be the fraction of the overall pool of recruits that entered the AAF.

The sources that we have available do not describe the exact process by which the target percentage of above-average recruits in the AAF was achieved. Here we consider a simple and robust algorithm that would have produced this result. Suppose that initially the AAF draws randomly from the pool of recruits until it has filled its quota for recruits below the mean. Since half the distribution is below the mean, this would fill a fraction 2(1 − ψ) of the allotted slots. To fill the remaining available slots, the AAF would draw randomly from the pool of recruits with AGCT scores above the mean.
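As an illustration of this two-stage draw, the following hypothetical Monte Carlo sketch (not part of the original analysis) simulates the quota-filling rule; the parameter values in the example are those used later in this Section, and the simulated difference in entry probabilities should come out close to the closed-form value of about 0.04 implied by the expressions derived next.

```python
import numpy as np

def simulate_aaf_selection(phi, X, psi, theta, n=400_000, seed=1):
    """Simulate the two-stage AAF assignment rule described in the text.

    phi:   fraction of recruits from iodine-deficient regions
    X:     pre-treatment mean IQ in deficient regions
    psi:   target fraction of AAF recruits with scores above the mean
    theta: fraction of all recruits entering the AAF
    """
    rng = np.random.default_rng(seed)
    deficient = rng.random(n) < phi
    iq = np.where(deficient, rng.normal(X, 15, n), rng.normal(100, 15, n))
    mean_iq = phi * X + (1 - phi) * 100            # population mean, IQ-bar

    slots = int(theta * n)                         # total AAF slots
    below_quota = int((1 - psi) * slots)           # AAF slots to be held by below-mean recruits

    # Stage 1: draw recruits at random, keeping everyone drawn, until the
    # below-mean quota is filled (roughly a fraction 2(1 - psi) of the slots).
    order = rng.permutation(n)
    selected = np.zeros(n, dtype=bool)
    below_taken = taken = 0
    for i in order:
        if below_taken >= below_quota:
            break
        selected[i] = True
        taken += 1
        if iq[i] <= mean_iq:
            below_taken += 1

    # Stage 2: fill the remaining slots at random from above-mean recruits.
    above_pool = np.flatnonzero(~selected & (iq > mean_iq))
    selected[rng.choice(above_pool, size=slots - taken, replace=False)] = True

    return selected[deficient].mean(), selected[~deficient].mean()

# Parameter values as used later in this Section (psi = 0.70, theta = 0.14,
# phi = 0.25), with X = 85 as an illustrative pre-treatment mean.
p_def, p_non = simulate_aaf_selection(phi=0.25, X=85, psi=0.70, theta=0.14)
print(f"P(AAF | deficient) ~ {p_def:.3f}, P(AAF | non-deficient) ~ {p_non:.3f}, "
      f"difference ~ {p_non - p_def:.3f}")
```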

Under such an algorithm, we can calculate the probability of someone from a low-iodine region entering the AAF as the sum of the initial draw from the entire distribution and the secondary draw where only the top half of the distribution is considered.

$$P(\mathrm{AAF} \mid \text{iodine deficient}) \;=\; 2(1-\psi)\,\theta \;+\; (2\psi - 1)\,\theta \,\frac{P(IQ_d > \overline{IQ})}{(1-\phi)\,P(IQ > \overline{IQ}) + \phi\, P(IQ_d > \overline{IQ})}$$

Similarly, the probability of someone from a non-deficient region entering the AAF is:

$$P(\mathrm{AAF} \mid \text{non-deficient}) \;=\; 2(1-\psi)\,\theta \;+\; (2\psi - 1)\,\theta \,\frac{P(IQ > \overline{IQ})}{(1-\phi)\,P(IQ > \overline{IQ}) + \phi\, P(IQ_d > \overline{IQ})}$$

The difference between these two probabilities corresponds to the regression coefficient in our specification; it is the increase in the probability of entering the AAF resulting from iodization39:

$$(2\psi - 1)\,\theta \,\frac{P(IQ > \overline{IQ}) - P(IQ_d > \overline{IQ})}{(1-\phi)\,P(IQ > \overline{IQ}) + \phi\, P(IQ_d > \overline{IQ})}$$

We apply this model to our setting as follows. First, Figure 4 suggests that, on average, across all the different recruitment periods (the 75% rule period, the period with no rule, and the period with two rules), about 70% of Air Force recruits had an AGCT score above the median. Thus we set ψ = 0.70. Second, according to our data the fraction of all recruits going to the Air Force was 14%; we use this value for θ. Finally, we need a measure of the fraction of recruits coming from iodine-deficient regions. In practice, there was a range of iodine deficiency as recorded in the Defects data. In column (2) of Table 4 we defined our "high-goiter" dummy as corresponding to the one-quarter of sections with the highest prevalence of goiter. Thus we use a value of ϕ = 0.25, and evaluate the coefficient from the regression that used this dummy variable.
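As a check on this mapping, the implied coefficients in Table 7 below can be reproduced directly from the closed-form difference above. The short sketch that follows is hypothetical code, not from the paper; it simply evaluates that expression at ψ = 0.70, θ = 0.14 and ϕ = 0.25 for the values of X shown in the table.

```python
from scipy.stats import norm

def implied_coefficient(X, phi=0.25, psi=0.70, theta=0.14, sd=15.0):
    """Closed-form difference in AAF entry probabilities between non-deficient
    and deficient recruits, for a pre-treatment deficient-region mean IQ of X."""
    mean_iq = phi * X + (1 - phi) * 100              # population mean, IQ-bar
    p_d = 1 - norm.cdf(mean_iq, loc=X, scale=sd)     # P(IQ_d > IQ-bar)
    p_n = 1 - norm.cdf(mean_iq, loc=100, scale=sd)   # P(IQ   > IQ-bar)
    denom = (1 - phi) * p_n + phi * p_d
    return (2 * psi - 1) * theta * (p_n - p_d) / denom

for X in (100, 95, 90, 85, 80, 75, 70):
    print(X, round(implied_coefficient(X), 3))
# Yields 0.0, 0.015, 0.029, 0.041, 0.052, 0.060, 0.065, matching the entries in
# Table 7 (expressed as probabilities; multiply by 100 for percentage points).
```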

Using these parameters, Table 7 shows the implied effect of iodization on the probability of entry into the AAF for a range of possible values of X, the mean level of IQ in the iodine-deficient regions. The estimated coefficient on the high-goiter dummy in Table 4 is in the range of 3.8–10 percentage points. Comparing these estimates to Table 7, we see that the lower end of this range is consistent with iodization raising IQ by 15 points (that is, X = 85), which is a reasonable expectation given the work of Bleichrodt and Born (1994). However, the upper end of the range is larger than any reasonable estimate of the increase in IQ that would have resulted from iodization. Given the uncertainty surrounding the selection process and the prior literature on iodization, we do not consider the larger estimates plausible. In any case, our results are consistent with a substantial effect, in line with the existing literature.40

Table 7:

Interpretation of coefficients

X (Mean IQ)    Regression Coefficient
100            0
95             0.015
90             0.029
85             0.041
80             0.052
75             0.060
70             0.065

Our estimates and their interpretation are based on a sample of male recruits. However, there is evidence suggesting that females are more prone to the development of iodine deficiency disorders than males. Field et al. (2009) discuss experimental evidence that, in animals, females are more likely to suffer cognitive damage in utero as a result of iodine deficiency. Both Field et al. (2009) and Politi (2010) find larger effects of iodization programs on graduation rates for females than for males. Therefore, the effect of iodization on the general population might well be larger than our estimates suggest.

10. Conclusion

Iodization of salt in the United States was one of the first instances of broad-based food fortification, a practice that continues in wealthy countries to this day and is spreading rapidly to the developing world. The experience of the US offers a useful natural experiment for identifying the long-term effects of an important micronutrient deficiency that many developing countries still struggle with today. Prior to iodization, there was significant geographic variation in iodine deficiency, resulting from identifiable soil and water conditions. This variation can be measured using data on the prevalence of goiter, allowing us to clearly distinguish between treatment and control groups. Iodization of salt was national in scope and implemented rapidly. The key period in which iodine deficiency affects cognitive development is a narrow window during gestation. Thus we are able to easily differentiate treated from non-treated cohorts.

We find that the cognitive abilities of men born in iodine-deficient regions rose relative to those of men born in non-deficient regions for cohorts in utero after the advent of iodization. Our measure of cognitive ability is the probability of a man being assigned to the Army Air Forces, entry into which was partially based on performance on a standardized intelligence test. Interpreting our measure in terms of IQ, our finding is that in iodine-deficient regions, iodization raised IQ scores by roughly one standard deviation, or 15 points. Given that one-quarter of the population lived in such regions, this implies a nationwide increase in average IQ of 3.5 points. In developed countries, average IQ rose at a rate of roughly three points per decade for much of the twentieth century. This is the Flynn Effect, which is thought to be largely attributable to reduced health and nutrition insults in utero and among children. Thus our results are consistent with the elimination of iodine deficiency in the United States accounting for roughly one decade's worth of the Flynn Effect.

From the few pre-iodization surveys of goiter prevalence in the general population,41 we know that iodine deficiency in the worst-afflicted parts of the United States was of a magnitude similar to that encountered in goitrous areas of developing countries and considered a serious public health concern many decades later.42 Therefore, our results are relevant for current iodine supplementation efforts around the world.

Supplementary Material

Appendix

Acknowledgments

We would like to thank Hoyt Bleakley, Kenneth Chay, Andrew Clausen, Joseph Ferrie, seminar participants at Tel Aviv University, Boston College, Brown University, Northwestern University, and conference participants at the NBER’s Cohort Studies Meeting, AEA Meetings and SIRE’s Young Researchers Forum for helpful comments. Desislava Byanova, Federico Droller, Bryce Millett Steinberg, and Young Min Kim provided excellent research assistance.

Footnotes

1

In 2005, 667,000 deaths of children under five were estimated to be attributable to vitamin A deficiency. The mortality burden of zinc deficiency was roughly two-thirds as large (Black et al. (2008)). Anemia caused by iron deficiency affects 1.6 billion people around the globe (de Benoist et al., eds (2008)).

2

The 2008 Copenhagen Consensus ranked zinc and vitamin A supplementation for children as the most cost-effective potential intervention for global welfare. Iron and iodine supplementation was the third most cost-effective potential intervention (Copenhagen Consensus Center (2008)). In a notable recent paper, Field et al. (2009) exploit delays and gaps in the implementation of the iodization program in Tanzania, and find that treatment of mothers with iodated oil resulted in a rise in schooling of 0.33 years among children, with a larger effect for girls.

3

In China, the fraction of salt iodized rose from 30% to 96% between 1990 and 2000. India banned the sale of non-iodized salt for human consumption in 2006 (Mannar and Bohac (2009) and Micronutrient Initiative (2009)).

4

Food fortification in the US began in 1924 with iodized salt, followed by vitamin D fortified milk in 1932, and vitamin B fortified flour in 1941. These interventions largely eliminated micronutrient deficiencies in the US.

5

Generally speaking, population surveys record much higher rates of goiter than military data. This is because females are much more prone to goiter (and thyroid disorders in general) than males. A second reason is that, in deficient populations, goiters grow with age, so recorded prevalence rises when one includes older cohorts, rather than just men of military age. In this paper we rely primarily on military data for two reasons: first, other markers of iodine deficiency used today, such as urinary iodine concentrations, are not available for that time period. Second, population surveys of goiter prevalence prior to iodization are rare, they tend to focus on a few localities, and they suffer from lack of uniformity in measurement.

6

See, for example, Almond (2006), and Behrman and Rosenzweig (2004). For an excellent review of the literature, see Almond and Currie (2011).

7

Goiter and hyperthyroidism can also result from Graves disease (also called Basedow disease), an immune condition in which the thyroid is stimulated to produce excess thyroxine. Iodine-induced thyrotoxicosis is also called Jod-Basedow disease; "Jod" is German for iodine, and the name indicates that iodine consumption is producing the symptoms of Basedow disease. Examining the rise in thyroid disease that followed the introduction of iodized bread in Tasmania, Connolly (1971) found that most patients with iodine-induced thyrotoxicosis had pre-existing goiter, and few had Graves disease.

8

Scrimshaw (1998, p.351).

9

Scrimshaw (1998) provides a list of studies and experiments documenting the hindering effects of iodine deficiency in utero on mental development.

10

Alternatives have included the iodization of water supplies and bread, as well as the provision of iodine-enriched chocolates or milk to babies and schoolchildren, and injections of slow-releasing iodated oil. Iodization of water supplies proved wasteful since only a small proportion of water is used for drinking and cooking purposes. Bread iodization was used in the Netherlands as a wartime measure (Matovinovic and Ramalingaswami (1960)). Ancient civilizations treated goiter with burnt sponge or seaweed, which are rich in iodine (Curtis and Fertman (1951), Langer (1960)).

11

Experiments with schoolchildren confirmed that the size of goiters decreased after the children received iodine. The first such experiment took place in Akron, Ohio in 1917, under the direction of David Marine and O.P. Kimball. For details see Marine and Kimball (1921), and Carpenter (2005).

12

Salt production takes three forms: evaporated salt, rock salt, and liquid brine. In 1924 the quantities produced by these three methods were 2.22, 2.06, and 2.51 million short tons, respectively. Brine was used exclusively as a feedstock by the chemical industry. According to the Salt Institute (http://www.saltinstitute.org), as of today virtually all food-grade salt sold or used in the United States is produced by evaporation. This was the case in 1924 as well (personal communication from Richard Hanneman, President, Salt Institute, March 6, 2008). In 1924, Michigan was the largest producer of evaporated salt in the country, accounting for 36% of total salt production. The next largest producers were New York (18%) and Ohio (14%) (Katz (1927)).

13

Markel (1987). Collusion in the evaporated salt industry was widespread, and Morton acted as the price setter. Many companies literally made copies of Morton's price schedule, simply substituting their own letterheads for Morton's (Fost (1970)). Morton's decision to iodize salt in 1924 would thus likely have affected a large percentage of households, both directly and through Morton's influence on smaller companies.

14

Many thanks to Hoyt Bleakley for making us aware of this marvelous book.

15

Since county borders in the US are relatively static, it is straightforward to map the Defects sections to present-day US counties.

16

Classical measurement error in the county of birth will introduce attenuation bias in our estimates, which should therefore be seen as lower bounds.

17

The AGCT was a predecessor to the Armed Forces Qualification Test (AFQT) that is currently given to enlistees.

18

See Ferrie et al. (2012) for a more detailed description of AGCT scores in the NARA data.

19

During World War II military aviation was part of the Army and not a separate branch. The Army Air Force was established in June of 1941 as a semi-autonomous group within the Army. Prior to this reorganization the aviation wing of the Army was known as the Army Air Corps. We will refer throughout to the Army Air Force even though our data span this renaming.

20

US Air Force Historical Study #76, Classification and Assignment of Enlisted Men in the Air Arm 1917–1945, p. 44.

21

ibid, p.46.

22

ibid, p.56.

23

Many thanks to Joe Ferrie for pointing this out to us.

24

We must also note that the correlation of the "weight" field with education is much stronger for this limited time period, at about 65%, compared with no correlation for the rest of the sample. This strong correlation, however, holds only for cohorts born up to 1925. It drops to 35% for the 1926 cohort, which consists of only 28 observations. There are no observations for the 1927 and 1928 cohorts.

25

We only include values between 20 and 180, which are realistic test score values. The vast majority of observations (96.3% of the sample) are in this range. In addition, because the recording of AGCT scores in this field was for such a short period of time, there is some question as to whether all enlistment places coded this field the same way. For this reason we also drew the histograms using only data from enlistment places with over 500 recruits and where the mean of the weight field is between 80 and 120 within the enlistment place. This eliminated less than 11,000 observations and did not substantially change the distributions.

26

Out of the 541 recruits who were assigned to the Air Forces over that period, none of them was born in 1926 or later, and only 37 individuals were born in 1925.

27

The new classification procedure was based on "The Physical Profile System", which became operative in 1944 and classified recruits into three profiles according to their ability to withstand strenuous combat conditions. 80% of men assigned to the AGF had to belong to the top profile, whereas only 10% of the AAF recruits came from the top group.

28

Source of data: Grove and Hetzel (1968). Exophthalmic goiter is an enlargement of the thyroid accompanied by bulging of the eyes, which is a sign of hyperthyroidism. While some medical dictionaries use the definition stated here, others define exophthalmic goiter as being synonymous with Graves disease, which is the dominant cause of the condition today. It is clear that the definition used in the text was being applied in the vital statistics data.

29

1.1 per 100,000 for men, 7.0 per 100,000 for women.

30

An exception is McClure (1934), who notes that goiter deaths in Detroit spiked in the second year following iodization.

31

High-goiter states are: Idaho, Oregon, Washington, Montana, Utah, Wyoming, Wisconsin, Michigan, North Dakota, Minnesota, West Virginia, Illinois, Iowa, Indiana, Nevada, Ohio, and Colorado.

32

We also estimated a Logit model, with similar results (available upon request).

33

We drop recruits from Alaska, because there was no variation in their enlistment branch (only one individual is registered in the Air Force throughout our enlistment period).

34

This means that enlistment for the 1923 birth cohort starts in January 1941, for the 1924 cohort it starts in January 1942, and for the 1927 cohort it starts in January 1945.

35

We ran similar specifications with the same datasets using education as an outcome variable, but found no effect of iodization on education. We also tried using Census data to identify an effect of iodization on education, using the goiter level at the state of birth, rather than the section of birth. Again, we found no effect of iodization on education levels. While surprising, this finding is consistent with the fact that in our data we find a very weak correlation between AGCT scores and education. Specifically, the correlation between AGCT scores and graduation from high school is 14%, while the correlation between AGCT scores and having one year of college or more is just 9%.

36

The high-goiter cutoff for each percentile of the distribution is 6.65 goiter cases per 1000 for the top 20th percentile, 5.4 goiter cases per 1000 for the top 25th percentile, and 5.21 for the top 30th percentile.

37

The median demand for Air Force recruits across the sample is 10.6%.

38

The 10th percentile of the distribution of demand is 1.5%. The 90th percentile is 22.6%, the 25th percentile is 5.4% and the 75th percentile is 17.8%.

39

There is one subtle issue, which is how the cutoff for entry into the AAF would have changed when treated cohorts from iodine-deficient regions began to enter the recruit pool. At this time, the average level of AGCT scores would have gone up, and if the test were continuously re-normed, the cutoff for entry into the AAF would have gone up as well. We do not have any information about whether this took place. In the calculations presented here, we make the assumption that the cutoff did not change.

40

As mentioned above, we do not know the algorithm that was applied in order for the AAF to hit the target number of recruits with AGCT scores above the average. An alternative to the method presented here would be for the AAF to set a minimum AGCT score such that, if it took all recruits with scores above this minimum, it would hit the target percentage above the median. For a given value of X (the average IQ in iodine-deficient areas), such a method would produce a larger effect of iodization on the probability of entering the AAF, and thus imply a lower rise in IQ, than the method that we consider.

41

See, for example, Cowie (1937) for a survey of a few counties in Michigan.

42

To list a few examples from Kelly and Snedden (1960): in surveys conducted in 1954, Nicaragua had an average goiter rate of 22.6% among children, with some departments recording rates over 50%. In the late 1940s, Colombia had rates reaching 81% (in Caldas) among schoolchildren. In the mid-1950s many endemic regions in Sierra Leone and Sudan had population goiter rates exceeding 40%. Similar rates were recorded around the same time in Malaysia, Indonesia, and other countries.

References

1. Almond Douglas (2006) 'Is the 1918 Influenza Pandemic Over? Long-Term Effects of In Utero Influenza Exposure in the Post-1940 U.S. Population.' Journal of Political Economy 114, 672–712
2. Almond Douglas, and Currie Janet (2011) 'Killing Me Softly: The Fetal Origins Hypothesis.' Journal of Economic Perspectives 25, 153–72
3. Andersson Maria, Karumbunathan Vallikkannu, and Zimmermann Michael B (2012) 'Global iodine status in 2011 and trends over the past decade.' The Journal of Nutrition 142, 744–750
4. Behrman Jere R., and Rosenzweig Mark R. (2004) 'Returns to birthweight.' Review of Economics and Statistics 86, 586–601
5. Black Robert E, Allen Lindsay H, Bhutta Zulfiqar A, Caulfield Laura E, De Onis Mercedes, Ezzati Majid, Mathers Colin, and Rivera Juan (2008) 'Maternal and child undernutrition: global and regional exposures and health consequences.' The Lancet 371, 243–260
6. Bleichrodt Nico, and Born Marise Ph. (1994) 'A meta-analysis of research on iodine and its relationship to cognitive development.' In The damaged brain of iodine deficiency, ed. Stanbury John B. (New York: Cognizant Communication) pp. 195–200
7. Carpenter Kenneth J. (2005) 'David Marine and the Problem of Goiter.' Journal of Nutrition 135, 675–680
8. Connolly RJ (1971) 'An increase in thyrotoxicosis in southern Tasmania after an increase in dietary iodine.' The Medical Journal of Australia 1, 1268–1271
9. Copenhagen Consensus Center (2008) 'Copenhagen Consensus 2008.' Frederiksberg, Denmark: Copenhagen Consensus Center
10. Cowie David Murray (1937) 'A Study of the Effect of the Use of Iodized Salt on the Incidence of Goiter.' Journal of the Michigan Medical Association 36, 647–655
11. Curtis George M., and Fertman M. Been (1951) 'Iodine in Nutrition.' In Handbook of Nutrition, second ed. (American Medical Association) chapter 6, pp. 111–135
12. de Benoist Bruno, Mclean Erin, Egli Ines, and Cogswell Mary, eds (2008) Worldwide prevalence of anemia 1993–2005: WHO Global Database on Anemia (Geneva, Switzerland: World Health Organization)
13. de Benoist Bruno, Andersson Maria, Egli Ines, Takkouche Bahi, and Allen Henrietta, eds (2004) Iodine status worldwide: WHO Global Database on Iodine Deficiency (Geneva, Switzerland: World Health Organization)
14. Ferrie Joseph P, Rolf Karen, and Troesken Werner (2012) 'Cognitive disparities, lead plumbing, and water chemistry: Prior exposure to water-borne lead and intelligence test scores among World War Two US Army enlistees.' Economics & Human Biology 10, 98–111
15. Field Erica, Robles Omar, and Torero Maximo (2009) 'Iodine deficiency and schooling attainment in Tanzania.' American Economic Journal: Applied Economics 1(4), 140–169
16. Fleischer Michael, Forbes Richard M., Harriss Robert C., Krook Lennart, and Kubota Joe (1974) 'Iodine.' In The relation of selected trace elements to health and disease, vol. 1 of Geochemistry and the Environment (Washington, D.C.: National Academy of Sciences) chapter 3, pp. 26–28
17. Fost Carolyn Ann (1970) 'The salt industry: A case study in the evaluation of public policy.' Unpublished Ph.D. dissertation, Southern Illinois University, pp. 49–50
18. Grove Robert D., and Hetzel Alice M. (1968) Vital Statistics Rates in the United States, 1940–1960 (U.S. Department of Health, Education, and Welfare, Public Health Service, National Center for Health Statistics)
19. Katz Frank J. (1927) Mineral Resources of the United States 1924, Department of Commerce (Washington: US Government Printing Office)
20. Kelly FC, and Snedden WW (1960) 'Prevalence and geographical distribution of endemic goitre.' Endemic Goitre, WHO Monograph Series No. 44
21. Koutras Demetrios A., Matovinovic Josip, and Vought Robert (1980) 'The Ecology of Iodine.' In Endemic Goiter and Endemic Cretinism, ed. Stanbury John B. and Hetzel Basil S. (John Wiley and Sons) chapter 9, pp. 185–195
22. Langer P (1960) 'History of Goitre.' Endemic Goitre, WHO Monograph Series No. 44, pp. 9–25
23. Love Albert G., and Davenport Charles B. (1920) Defects Found in Drafted Men. Statistical Information Compiled from the Draft Records Showing the Physical Condition of the Men Registered and Examined in Pursuance of the Requirements of the Selective-Service Act (Washington, D.C.: Government Printing Office)
24. Mannar MG Venkatesh, and Bohac Lucie (2009) Achieving Universal Salt Iodization: Lessons Learned and Emerging Issues (Ottawa, Canada: Micronutrient Initiative)
25. Marine David, and Kimball OP (1921) 'The prevention of simple goiter in man.' Journal of the American Medical Association 77, 1068–1070
26. Markel Howard (1987) '"When it rains, it pours": Endemic Goiter, Iodized Salt and David Murray Cowie, MD.' American Journal of Public Health 77, 219–229
27. Matovinovic J, and Ramalingaswami V (1960) 'Therapy and Prophylaxis of Endemic Goitre.' Endemic Goitre, WHO Monograph Series No. 44, pp. 385–410
28. McClendon Jesse F. (1924) 'Inverse relation between Iodin in food and drink and Goiter, Simple and Exophthalmic.' Journal of the American Medical Association 82, 1668–1672
29. McClure Roy D. (1934) 'Thyroid surgery as affected by the generalized use of iodized salt in an Endemic Goitre Region - Preventive Surgery.' Annals of Surgery 100, 924–932
30. Micronutrient Initiative (2009) Investing in the Future: A United Call to Action on Vitamin and Mineral Deficiencies: Global Report
31. National Archives and Records Administration (2002) 'World War II Army Enlistment Records, 1938–1946'
32. Palmer Robert R., Wiley Bell I., and Keast William R. (1948) The Procurement and Training of Ground Combat Troops. United States Army in World War II: The Army Ground Forces (Washington, D.C.: Center of Military History, United States Army)
33. Politi Dimitra (2010) 'The impact of iodine deficiency eradication on schooling: evidence from the introduction of iodized salt in Switzerland.' School of Economics Discussion Paper, University of Edinburgh
34. Scrimshaw Nevin S. (1998) 'Malnutrition, Brain Development, Learning and Behavior.' Nutrition Research 18, 351–379
35. U.S. Department of Commerce, Bureau of the Census (various years) Mortality Statistics
