The last step in the risk assessment process is risk characterization. Its intent is to highlight the nature of the “risks” or public health problems that are relevant to the use of Dietary Reference Intakes (DRIs) and to alert users of the DRI reference values to the implications of the assessors' work and to related special issues. This chapter reflects the risk characterization step of the risk assessment approach and is organized to provide a brief summary of the assessment, discussions of the implications of the committee's work for stakeholders, and discussions highlighting population segments and conditions of interest relative to calcium and vitamin D nutriture.
The new DRIs establish, for the first time, an Estimated Average Requirement (EAR) and a Recommended Dietary Allowance (RDA) for calcium and vitamin D. Previously, the DRIs for these nutrients reflected Adequate Intakes (AIs). The ability to set EARs and RDAs rather than AIs enhances the utility of the reference values for national planning and assessment activities. It is important to recognize that these values are intended for the North American population, and also that the requirement for each nutrient is based on the assumption that the requirement for the other nutrient is being met.
Considerable effort was made to ensure that an array of indicators was examined as a possible basis for setting requirements, as well as upper levels of intake. The intent was to examine fully and objectively the scientific basis for each suggested benefit before drawing conclusions. Despite the many claims of benefit surrounding vitamin D in particular, the evidence did not support a causal relationship between vitamin D and many of the numerous health outcomes purported to be affected by vitamin D intake. Although the current interest in vitamin D as a nutrient with broad and expanded benefits is understandable, it is not supported by the available evidence. The established function of vitamin D remains that of ensuring bone health, for which causal evidence across the life stages exists and has grown since the 1997 DRIs were established (IOM, 1997). The conclusion that the evidence is not sufficient to establish a relationship between vitamin D and health outcomes other than bone health does not mean that future research will not reveal a compelling relationship; the question remains open.
Of great concern recently have been reports of widespread vitamin D deficiency in the North American population. Based on this committee's work, and as discussed below, this concern is not well founded. In fact, the cut-point values used to define deficiency, or, as some have suggested, “insufficiency,” have not been established systematically using data from studies of good quality, nor have the values to be used for such determinations been agreed upon by consensus within the scientific community. Higher cut-point values than those used in the past necessarily result in a larger proportion of the population falling below the cut-point and thereby being classified as deficient. This, in turn, leads to higher estimates of the prevalence of deficiency in the population and possibly to unnecessary intervention, including high-dose supplementation, in the health care of individuals. National survey data suggest that serum 25-hydroxyvitamin D (25OHD) levels in the North American population generally exceed the levels identified in this report as sufficient for bone health, underscoring the inability to conclude that there are significant levels of deficiency in the population.
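The mechanical effect of raising a cut-point can be illustrated with a short calculation. The sketch below is purely illustrative: it assumes a hypothetical, roughly normal distribution of serum 25OHD (the mean and standard deviation are placeholders, not survey estimates) and shows how the estimated prevalence of “deficiency” grows as the cut-point is moved upward.

```python
from statistics import NormalDist

# Illustrative sketch only: how the choice of cut-point drives the estimated
# "deficiency" prevalence. The mean and SD below are hypothetical placeholders,
# NOT estimates from NHANES or the Canadian surveys.
serum_25ohd = NormalDist(mu=65.0, sigma=20.0)  # serum 25OHD, nmol/L (assumed)

for cut_point in (30, 40, 50, 75):  # candidate cut-points, nmol/L
    prevalence = serum_25ohd.cdf(cut_point)  # fraction falling below the cut-point
    print(f"cut-point {cut_point:>3} nmol/L -> {prevalence:6.1%} classified 'deficient'")
```

Under these assumed parameters, moving the cut-point from 30 to 75 nmol/L raises the apparent prevalence from roughly 4 percent to nearly 70 percent without any change in the underlying population.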
Specifically in terms of the new DRIs and challenges for calcium and vitamin D nutriture, several points can be highlighted, within the context of the limitations of estimates of dietary intake, which tend to underestimate actual consumption. First, for calcium, adolescent girls continue to be a group at risk for low intakes from food sources. A greater proportion of older women use calcium supplements, and some may be at risk for excess intake as a result of the use of high-dose supplements. If supplements are needed to ensure adequate calcium intake, lower-dose supplements should be considered. Many older women have baseline calcium intakes that are close to or just below requirements, and therefore the practice of calcium supplementation at high levels may be unnecessary. This is a special concern for calcium supplement use given the possibility that total intakes (diet plus supplements) above 2,000 mg/day may increase the risk for kidney stones while demonstrating no increase in benefits relative to bone health. There is also some limited evidence that the long-term use of calcium supplements may increase the risk for cardiovascular disease. Although no attempt was made to compare systematically the data used for the North American population that is the subject of this report with data from other countries focused on persons who are genetically and environmentally different from those in the United States and Canada, it should be recognized that calcium requirements may be subject to a variety of factors that have not yet been fully elucidated and therefore cannot yet be integrated into DRI reviews.
For vitamin D, the challenges introduced by issues of sun exposure cannot be ignored. This nutrient is unique in that it functions as a prohormone, and the body has the capacity to synthesize the nutrient if sun exposure is adequate. However, concerns about skin cancer risk preclude making recommendations about sun exposure; in any case, there are a number of unknowns surrounding the effects of sun exposure on vitamin D synthesis. At this time, the only solution when DRIs are to be set for vitamin D is to proceed on the basis of an assumption of minimal sun exposure and set a reference value assuming that all of the vitamin D must come from the diet. Moreover, the possibility of risk for persons typically of concern because of reduced synthesis of vitamin D, such as persons with dark skin or older persons in institutions, is minimized given the assumption of minimal sun exposure for the DRIs.
One unknown in the process of DRI development for vitamin D is the degree to which waning kidney function with aging may be relevant. It appears that increasing serum 25OHD levels do not typically increase calcitriol levels in aging persons with mild renal insufficiency, and a dietary strategy to address the concern is not evident.
Although ensuring adequacy is important, there is now an emerging issue of excess vitamin D intakes. A congruence of diverse data on health outcomes ranging from all-cause mortality to cardiovascular risk suggests that adverse health outcomes may be associated with vitamin D intakes that are much lower than those classically associated with hypervitaminosis D and that appear to occur at serum 25OHD levels achievable through current levels of supplement use.
The extensive review of the data required to conduct this study and to determine DRIs for calcium and vitamin D that are consistent with existing scientific understandings has answered many questions. But the process has also identified, or left unanswered, other questions because of the limitations of the available evidence. Because uncertainties exist in the knowledge base related to the role of vitamin D and calcium in health outcomes, it is important to acknowledge that uncertainties also surround these reference values. The development of any reference value should be viewed as a work in progress, subject to change if there are significant changes in the science base.
Further, an important aspect of DRI development is its grounding in public health applications and the concept of distributions of risk. This approach may appear strange to some and may be disconcerting to those with a clinical orientation who are familiar with the medical model, in which the goal is to treat the patient in the most efficacious manner to enhance a positive outcome. The interpretation and use of data in DRI development occur within the context of the relevant probability distributions of risk; the DRI task focuses on median requirements and the description of risk, whereas the medical model is based on maximizing effects that ensure beneficial outcomes for all persons. This report, therefore, in contrast to a medical model approach, determines dose–response relationships by assessing the level at which 50 percent of the population's needs are met (the EAR) and the level at which approximately 97.5 percent of the population are likely to have their needs met (the RDA). The distribution of dose–response effects is highly relevant to DRI development, in contrast to information about a maximizing effect for benefit. A difficulty the committee too often faced was studies that included only a placebo or a low baseline dose coupled with a relatively large, single supplemental dose; such designs are relatively uninformative for DRI development.
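The EAR/RDA relationship lends itself to a brief worked example. Under the DRI convention, if individual requirements are approximately normally distributed, the RDA sits about two standard deviations above the median requirement, covering roughly 97.5 percent of individuals. The sketch below uses arbitrary illustrative numbers, not the values set in this report.

```python
from statistics import NormalDist

# Worked example of the EAR/RDA convention described above, assuming a
# roughly normal distribution of individual requirements. The numbers are
# arbitrary placeholders for illustration, not this report's DRI values.
ear = 100.0  # median requirement (50th percentile), arbitrary units
sd = 15.0    # assumed standard deviation of individual requirements

rda = ear + 2 * sd  # approximately the 97.5th percentile under normality
coverage = NormalDist(mu=ear, sigma=sd).cdf(rda)
print(f"EAR = {ear:.0f}, RDA = {rda:.0f}, covering {coverage:.1%} of individuals")
```

This also clarifies why single-large-dose trials are uninformative for DRI purposes: they provide no information about the shape of the requirement distribution between the baseline and the tested dose.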
Discussions below call attention to the uncertainties surrounding the DRI values for calcium and vitamin D and also highlight important conclusions that stem from the process of developing these DRIs. In addition, given that this report is the first effort to develop DRIs since the 2007 IOM workshop that explored lessons learned and new challenges and outlined the risk assessment approach for DRI development (IOM, 2008; Taylor, 2008), comments are offered about the process. Specific research recommendations for the future development of DRIs related to calcium and vitamin D are presented in Chapter 9.
The committee's assumption of minimal sun exposure is a markedly cautious approach given that the vast majority of North Americans appear to obtain at least some vitamin D from inadvertent or deliberate sun exposure. Currently, there is a lack of information about whether certain levels of sun exposure may be experienced without increased risk of cancer and whether such exposure would be consistent with a contribution of vitamin D useful to the body. Therefore, at this time, recommendations concerning sun exposure relative to vitamin D requirements cannot and should not be offered; there are no options other than to base dietary recommendations on the assumption of minimal sun exposure. The evidence indicating that the synthesis of vitamin D from sun exposure is subject to a feedback loop that precludes toxicity from sun exposure is reassuring and, when coupled with the checks and balances introduced into the DRI development process, makes it very unlikely that consumption of the DRI levels of vitamin D, even if combined with high levels of sun exposure, will be problematic for the general population.
However, given that many North Americans appear to obtain at least some vitamin D from inadvertent or deliberate sun exposure, there are implications for the interpretation of intake levels of the vitamin. In short, the intake data for vitamin D cannot stand alone as a basis for public health action on a national population level. Such considerations are consistent with the 2000 IOM report on applications of DRIs in dietary assessment (IOM, 2000), which states: “Whenever possible, the assessment of apparent dietary adequacy should consider biological parameters such as anthropometry, … biochemical indices, … diagnoses, … clinical status, and other factors as well as diet. Dietary adequacy should be assessed and diet plans formulated based on the totality of the evidence, not on dietary intake data alone.” For policy making and decisions about the adequacy of the food supply for the general population at the national level, vitamin D must therefore be considered in the context of measures of serum 25OHD, an established biomarker of exposure from endogenous synthesis as well as diet, including supplements. Although the reported estimates of vitamin D intake appear to be less than needed to meet requirements, the serum 25OHD data available—when coupled with the committee's assessment of serum 25OHD levels consistent with EAR and RDA values—suggest that average requirements are being met for the DRI age groups nationally in both countries. That is, although mean total intakes of vitamin D generally are lower than the estimated median requirement (the EAR), the available clinical measures do not suggest widespread deficiency states. This underscores the possibility that sun exposure is contributing generally to the maintenance of adequate serum 25OHD concentrations.
As discussed in the preceding chapters, there are limited data for many topics of interest in setting DRI values for calcium and vitamin D. Overall, the uncertainties surrounding the DRI values for calcium are less than those for vitamin D, because the evidence base is considerably larger for calcium, and the physiology and metabolism of calcium are better understood. The following key issues were identified by the committee as introducing uncertainty into the DRI values for calcium and vitamin D, as based on bone health outcomes:
An important question that will undoubtedly be asked, given this committee's report, is: Why is it that so much information about the positive effects of vitamin D on outcomes such as cancer, diabetes, and immunity is said to exist and is reported almost daily in the press, yet this committee found no basis to support these causal relationships? The short answer is that a systematic examination of the evidence, using established guidelines for measuring the strength and quality of studies, revealed that the claimed benefits based on associations between low or high intakes of vitamin D and non-skeletal health outcomes could not be supported by the studies; the evidence was inconsistent and/or conflicting or did not demonstrate causality. In addition, some effects were not relevant to setting nutritional requirements for vitamin D. This conclusion, however, does not preclude further investigation of causal relationships.
Moreover, a related question that will be asked is: With the advent of newer studies, why is there still so much uncertainty? At least one reason is that most studies were not designed to yield data maximally useful for DRI development, as has been well described by others (Yetley et al., 2009). DRI development fundamentally requires elucidation of dose–response relationships and benefits from high-quality data obtained in randomized controlled trials. In making its conclusions about potential indicators other than bone health, the committee noted the findings previously specified by an IOM committee tasked with examining the evolution of evidence for nutrient and disease relationships (IOM, 2002). That committee concluded that evidence about relationships between specific nutrients and a disease or health outcome typically remains elusive for a number of reasons (IOM, 2002). These include the following:
The committee found all of the above findings to be the case for non-skeletal health outcomes for vitamin D, as the discussions of the strength, consistency, and causality of the evidence demonstrate in Chapter 4.
Finally, an important uncertainty focuses on the issue of excess intake. This is particularly true for vitamin D, which has been hypothesized to confer health benefits at relatively high levels of intake. Although the committee's decisions for the ULs made use of emerging data concerning a U-shaped (or perhaps reverse-J-shaped) risk curve, which suggested adverse effects at levels much lower than those associated with hypervitaminosis D, the lack of data on the safety of chronic higher intakes of vitamin D is very concerning. Byers (2010), in a recent editorial commenting on the outcomes of a pooling study of vitamin D and six types of cancer, in which the only association observed was a doubling of the risk for pancreatic cancer among those in the highest quintile of circulating serum 25OHD levels, offered the following observation: “We have learned some hard lessons…. and we now know that taking vitamins in supernutritional doses can cause serious harm.”
Serum 25OHD levels have been used as a “measure of adequacy” for vitamin D, as they reflect intake from the diet coupled with the amount contributed by cutaneous synthesis. The cut-point levels of serum 25OHD intended to specify deficiency and sufficiency for the purposes of interpreting laboratory analyses and for use in clinical practice are not specifically within the charge to this committee. However, the committee notes with some concern that the serum 25OHD cut-points defined as indicative of deficiency (or, as reported by some, “insufficiency”) have been subject to wide variation in specification without a systematic, evidence-based consensus development process. To ensure clarity, the discussion in this section expresses serum 25OHD levels in both nmol/L and ng/mL.
From this committee's perspective, a considerable overestimation of the level of vitamin D deficiency in the North American population now exists because some use cut-points for serum 25OHD levels that greatly exceed the levels identified in this report as consistent with the available data. The 1997 IOM report (IOM, 1997) specified a serum 25OHD concentration of 27.5 nmol/L (11 ng/mL) and above as an indicator of vitamin D adequacy from birth through 18 years of age, and a concentration of 30 nmol/L (12 ng/mL) and above as an indicator of vitamin D adequacy for adults. These concentrations (27.5 nmol/L for children and 30.0 nmol/L for adults) remain levels below which frank deficiency, including rickets and osteomalacia, may be expected to occur. In recent years, others have suggested different cut-point values as determinants of deficiency (or “insufficiency”), ranging from less than 50 nmol/L (20 ng/mL) to values above 125 nmol/L (50 ng/mL). Based on this committee's deliberations, the vitamin D–related bone health needs of approximately one-half of the population may be expected to be met at serum 25OHD concentrations between 30 and 40 nmol/L (12 and 16 ng/mL); most of the remaining members of the population are likely to have their vitamin D needs met when serum concentrations between 40 and 50 nmol/L (16 and 20 ng/mL) are achieved. Failure to achieve such serum concentrations places persons at greater risk for less than desirable bone health, as manifested, depending upon age, by reduced rates of bone accretion, lower bone mineral density, and increased rates of fracture.
Use of higher-than-appropriate cut-points for serum 25OHD levels would be expected to artificially inflate estimates of the prevalence of vitamin D deficiency. The specification of cut-point values for serum 25OHD levels has serious ramifications not only for conclusions about vitamin D nutriture and nutrition public policy, but also for clinical practice. At this time, no central body is responsible for establishing such values for clinical use. This committee's review of the data suggests that persons are at risk of deficiency at serum 25OHD levels below 30 nmol/L (12 ng/mL). Some, but not all, persons are potentially at risk for inadequacy at serum 25OHD levels from 30 up to 50 nmol/L (12 to < 20 ng/mL). Practically all persons are sufficient at levels of 50 nmol/L (20 ng/mL) and above. Serum 25OHD concentrations above 75 nmol/L (30 ng/mL) are not associated with increased benefit, and there may be reason for concern at levels above 125 nmol/L (50 ng/mL). Given the concern about high serum 25OHD levels as well as the desirability of avoiding misclassification of vitamin D deficiency, there is a critical public health and clinical practice need for consensus cut-points for serum 25OHD. The current lack of evidence-based consensus guidelines is problematic because individuals with serum 25OHD levels above 50 nmol/L (20 ng/mL) may at times be diagnosed as deficient and treated with high-dose vitamin D supplements containing many times the levels of intake outlined in this report.
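These categories, together with the unit conversion used throughout this section (1 ng/mL corresponds to 2.5 nmol/L), can be summarized in a short sketch. This is an illustrative restatement of the committee's discussion, not a clinical decision tool.

```python
# Minimal sketch of the serum 25OHD categories described in this chapter,
# with the nmol/L-to-ng/mL conversion used in the text (1 ng/mL = 2.5 nmol/L).
# Boundaries follow the committee's discussion; illustrative only.
NMOL_PER_NGML = 2.5

def ngml_to_nmol(ng_ml: float) -> float:
    """Convert a serum 25OHD value from ng/mL to nmol/L."""
    return ng_ml * NMOL_PER_NGML

def classify_25ohd(nmol_l: float) -> str:
    if nmol_l < 30:    # < 12 ng/mL
        return "at risk of deficiency"
    if nmol_l < 50:    # 12 to < 20 ng/mL: some, but not all, at risk of inadequacy
        return "possible inadequacy for some"
    if nmol_l <= 125:  # 20 to 50 ng/mL: practically all sufficient
        return "sufficient"
    return "potential reason for concern (> 50 ng/mL)"

for level in (25, 45, 60, 130):  # example serum values, nmol/L
    print(f"{level} nmol/L ({level / NMOL_PER_NGML:.0f} ng/mL): {classify_25ohd(level)}")
```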
Although this report identifies upper levels of intake below which adverse effects are not expected to arise, the ULs are intended to serve as a lifetime public health measure for a free-living, unmonitored population. Those responsible for determining the dosages of nutrients to be studied in carefully controlled experimental trials, conducted with appropriate adverse event and safety monitoring, may bring other considerations into play when deciding what nutrient levels are acceptable and appropriate for subjects taking part in, and being monitored during, such studies. Research using intakes higher than the ULs can be justified under a number of circumstances, after careful review of the literature and through the use of appropriate study protocols. Indeed, such studies are likely to inform the understanding of dose–response relationships and the health benefits or risks associated with calcium and vitamin D intakes.
As described in Chapter 1, the DRI development process has recently been subjected to a review as well as targeted discussions about the process and ways to enhance it (IOM, 2008). As an overall result of these discussions, DRI development is now placed more clearly in the context of the risk assessment approach—that is, an organizing framework for conducting evaluations with public health implications, often made in the face of evidentiary uncertainties. There is also a series of existing “gap issues”—specifically, needed methodologies and guidelines—that have been identified as important to improving and enhancing the process for developing DRIs and that would benefit from targeted efforts to resolve them (Taylor, 2008).
The report of this committee is the first DRI report to be completed subsequent to the 2004 to 2008 evaluation of the DRI development process. It has been structured to be consistent with the risk assessment process, with the intent of enhancing its transparency, especially in the face of uncertainties. Although this committee was mindful of the identified methodological gaps for enhancing the DRI process, it was not tasked with addressing them; in any case, virtually all of the relevant issues are complex and suggest a need to convene groups of individuals with specific expertise germane to the question at hand. Because this DRI report is an initial effort to set DRI development on the path of a risk assessment approach, its experience points to the importance of addressing several gap issues.
Specifically:
As highlighted in Chapter 3, excess adiposity or obesity—defined as a body mass index (BMI) of 30 kg/m2 or higher—is associated with lower serum 25OHD concentrations (and higher parathyroid hormone levels) than found in non-obese counterparts. This appears to be due to sequestration of 25OHD by adipose tissue, given that supplementation of obese and lean persons with vitamin D appears to result in no significant difference in response between the two groups (Jones, 2008). Moreover, a few studies of modest weight loss have found circulating 25OHD levels to increase despite no increased intake of vitamin D from diet or sun exposure (Riedt et al., 2005; Reinehr et al., 2007; Zittermann et al., 2009; Tzotzas et al., 2010), suggesting release from adipose stores with adipose depletion. Further, neither season nor ethnicity appears to influence these biochemical parameters (Alemzadeh et al., 2008).
An important concern is whether the lower serum 25OHD levels associated with obesity have meaningful consequences for the DRI indicator of bone health. Evidence for effects of obesity on bone density is mixed. The combined influence of increased weight-bearing activity and the endogenous estrogen synthesis associated with increased adiposity has long been linked with higher bone density (Reid, 2008). In a population-based study of perimenopausal and early postmenopausal women in Finland, Pesonen et al. (2005) found that increased body weight was a strong predictor of high bone density. Likewise, Morin and Leslie (2009), in a retrospective cohort study, found a strong correlation between higher BMI category and high bone density in postmenopausal women.
Although these and other studies have suggested that total body mass contributes to bone density and would appear to support the role of increased weight-bearing activity as a factor positively influencing bone density (Prentice et al., 1991; Khosla et al., 1996; Wortsman et al., 2000; Finkelstein et al., 2002, 2008), more recent studies raise further questions. The distribution of body fat may influence bone mass, such that excess intra-abdominal fat could adversely affect bone remodeling and even contribute to greater fracture risk (Premaor et al., 2010; Sukumar et al., 2011). One possibility is that intra-abdominal adipose tissue is more biologically active than subcutaneous fat, secreting cytokines and adipokines that negatively affect osteoblast and osteoclast activity (Kawai and Rosen, 2010). Moreover, both lean and fat mass contribute to weight-bearing effects. Because obesity is accompanied by increases in both lean mass and fat mass, at least in younger individuals, it is difficult to attribute the effect on bone density to fat mass as opposed to lean mass. Further, body composition changes with age, even in the obese; in turn, there may be less lean body mass in older individuals.
This complicates efforts to clarify how adiposity may affect bone health. As noted, some studies have suggested that adiposity or increased fat mass itself may be a factor in the development rather than the prevention of osteoporosis, particularly in the elderly. Zhao et al. (2007) observed that when the effect of mechanical loading from high body weight on bone density was statistically controlled, fat mass was inversely correlated with bone mineral content. Further investigation by Zhao et al. (2008) suggested that molecular signaling pathways involved in osteoblast differentiation may contribute to the previously identified effect of increased adiposity on decreased bone mineral content, although a mechanism has not been elucidated. However, this science is just emerging, and it is premature to speculate on its significance or relevance to bone health and bone density.
At this time, there is the possibility that obesity, at least in older persons, may not be beneficial for bone health and may be demonstrated to be a risk factor, not an advantage, for decreased bone density and, in turn, reduced bone health. There is no evidence that increases in calcium or vitamin D nutriture beyond the requirements specified for non-obese persons can affect this purported outcome.
The question of the impact of latitude on vitamin D nutriture is often a topic of concern or, at least, interest. The issue, however, is set in the context of the inability to specify a safe dose of sunlight that could contribute to vitamin D synthesis while avoiding the risk of skin cancer. There are also recognized challenges associated with quantifying the contributions from sun exposure, coupled with the limited information on the role of stored vitamin D during seasonal changes. The prevailing assumption about the effect of latitude is that ultraviolet B (UVB) penetration decreases with increasing latitude (i.e., distance from the equator), causing persons living at higher latitudes in North America to experience little or no UVB exposure and placing them at risk for vitamin D deficiency. This assumption may not be entirely accurate. Latitude may, however, work in tandem with other factors discussed below, such as limited sun exposure overall or cultural and dietary practices. This section focuses only on the issue of latitude per se.
The relationship between UVB penetration and latitude is complex and not merely a function of distance from the equator. Other factors come into play, including the reduced atmosphere at the poles (about 50 percent less than at the equator), more cloud cover at the equator than at the poles, differences in ozone cover, and the duration of sunlight in summer versus winter. Geophysical surveys have indicated that UVB penetration over 24 hours during the summer months at Canadian north latitudes equals or exceeds UVB penetration at the equator (Lubin et al., 1998), suggesting that persons living in the northern latitudes are not necessarily receiving notably less total sunlight during the year. Rather, there may be considerable opportunity during the spring, summer, and fall months in the far north for humans to form vitamin D and store it in liver and fat. Likewise, animals living in the same region that are consumed as part of the traditional diet are rich sources of vitamin D (Keiver et al., 1988; Kenny et al., 2004; Brunborg et al., 2006; Kuhnlein et al., 2006).
These factors help to explain why latitude alone does not appear to predict serum 25OHD concentrations in humans. In a Finnish study, healthy subjects living above the Arctic Circle (latitude 66°N) did not have lower serum 25OHD levels than subjects living in southern Finland; in fact, the group living above the Arctic Circle had higher levels. Both groups achieved mean serum 25OHD levels above 90 nmol/L during the summer, whereas the mean serum 25OHD level at the winter nadir was 56 nmol/L in the south and 68 nmol/L in those living above the Arctic Circle (Lamberg-Allardt et al., 1983).
The DRIs for vitamin D established in this report are based on the assumption of minimal sun exposure. Therefore, they are regarded as adequate for persons who may experience reduced synthesis of vitamin D from sun exposure. Assuming that some population groups may be consuming less than the current DRI values for vitamin D, the question is to what extent these persons are at risk for vitamin D deficiency or, conversely, to what extent inadvertent sun exposure can be expected to compensate for their lower intakes.
As described in Chapter 3, skin pigmentation—due to melanin in the epidermal layer—can reduce the amount of vitamin D synthesized by the human body. The amount of UVB required for changes in serum 25OHD levels is partly related to the degree of skin pigmentation. Further, a number of reports through the years have indicated consistently lower serum 25OHD levels in persons identified as black compared with those identified as white (Specker et al., 1985; Harkness and Cromer, 2005; Stein et al., 2006; Armas et al., 2007; Basile et al., 2007; Bodnar et al., 2007). Looker et al. (2008), using the National Health and Nutrition Examination Surveys (NHANES) 2000 to 2004, reported lower serum 25OHD levels for non-Hispanic blacks compared with Mexican Americans and whites; Mexican Americans had serum 25OHD concentrations intermediate between those of non-Hispanic blacks and whites.
The question is whether the consistently lower serum 25OHD levels in persons with dark skin pigmentation have significant health consequences. Based on the data of Looker et al. (2008), non-Hispanic blacks in the NHANES had an average serum 25OHD concentration of 40.14 nmol/L (± 0.88 nmol/L [standard error of the mean]). Given that 40 nmol/L may reflect an acceptable median serum 25OHD level based on this committee's work, it is difficult to suggest that this average is indicative of widespread deficiency, although such conclusions cannot be based solely on mean values. Moreover, at least for those of African American ancestry, corollary data suggest that rates of osteoporosis and bone disease are not higher; in fact, African Americans have reduced rates of fracture and osteoporosis compared with whites (see Chapter 4). There are no comparable data for other ethnic groups with dark skin, such as South Asians, so firm conclusions about their risk related to bone health cannot be drawn. Furthermore, it is possible that risk may be introduced or modulated by an array of variables, including cultural and ethnic practices.
Given the unknowns, dark-skinned immigrant groups who now reside in North America may present a concern, as described below. There is also a concern for dark-skinned infants and children whose overall diet may be low in calcium and who may have low serum 25OHD levels, especially if exclusively breast-fed and not otherwise supplemented (see below). The vitamin D and calcium issues related specifically to African Americans have been described earlier in Chapter 4.
South Asian and Middle Eastern immigrant groups South Asians (e.g., Indians, Pakistanis, Sri Lankans) now reside in greater numbers in North America and are reported to be at increased risk for vitamin D deficiency. This group has a significant presence in Canada and is growing in number (Statistics Canada, 2010). A recent study by Wu et al. (2009) measured vitamin D intakes and serum 25OHD levels in three different ethnic groups in southern Ontario and found that levels were significantly lower in South Asians than in East Asian or European groups. Over the past few decades, there have been sporadic reports of vitamin D–deficiency rickets in Canadians, almost always in breast-fed, dark-skinned Canadians of African or Asian descent, but the total number of cases, even in a major metropolitan area like Toronto, is small (17 over a 5-year period from 1988 to 1993) (Binet and Kooh, 1996). As with African Americans (see Chapter 3), the lower serum 25OHD levels observed are not associated with significant increases in the rates of bone disease (osteomalacia or rickets) in the Canadian South Asian cohort. In other South Asian communities living at relatively high latitudes (> 50°N) in Europe (e.g., Scotland), there have been reports of rickets and osteomalacia dating back to the early 1970s (Ford et al., 1976; Goel et al., 1976) and suggestions that vitamin D deficiency might also be associated with higher rates of tuberculosis (Yesudian et al., 2008). Although ensuring that the DRIs are met should reduce the risk for deficiency states in these groups to the extent that their conditions mimic minimal sun exposure, vigilance for this growing group is considered advisable.
Some immigrant populations and religious groups adhere to cultural practices regarding clothing that can greatly reduce exposure to sunlight and exacerbate the effects of low vitamin D intake. It has been suggested that at least 20 percent of the body's surface must be exposed to UVB for serum 25OHD levels to increase (Specker et al., 1985; Hollis, 2005). The issue is not whether such sun exposure is a wise public health practice for any group, but rather the need for awareness when cultural practices limit sun exposure.
Dark-skinned, exclusively breast-fed infants In 2000, a report was published concerning rickets among nine children from various areas of the United States (Shah et al., 2000). Eight children were described as African American and one as Hispanic. All had been primarily breast-fed for more than 11 months, with minimal intake of dairy products and without vitamin D supplementation. Breast milk is, of course, not a source of vitamin D for infants. This report had been preceded by a 1979 report from Bachrach et al. (1979), who noted 24 cases of vitamin D–deficiency rickets in black, breast-fed infants who were otherwise healthy and had no underlying malabsorptive or renal diseases, but whose parents belonged to groups that subscribed to dietary restrictions and clothing habits that minimized exposure to sunlight. Later, a 2001 report described a black infant who was breast-fed until 10 months of age and then weaned to a soy beverage that was not fortified with vitamin D or calcium (Carvalho et al., 2001). The infant developed normally until about 9 months of age, when the child's height and weight gains became severely arrested. In 2003, DeLucia et al. (2003) commented on 43 children with nutritional rickets reported from 1986 through 2002 in the New Haven, Connecticut, area. Approximately 86 percent were of African American, Hispanic, or Middle Eastern descent, and more than 93 percent had been breast-fed. In this case, the authors implicated both low calcium intake and marginal vitamin D nutriture in the rickets.
A recent 2-year survey of Canadian pediatricians found the incidence of rickets in their patients to be 2.9 per 100,000; the mean age at diagnosis was 1.4 years (range of 2 weeks to 6.3 years). Ninety-four percent of the children with rickets had been breast-fed. Additional risk factors included dark skin, living in the far north, being born to a mother who took no vitamin supplements, limited sun exposure, emigration from a region where vitamin D deficiency is endemic, and delayed introduction of solid foods (Ward et al., 2007).
Vitamin D supplementation of partially or fully breast-fed infants should begin in the first week of life and provide approximately 400 IU/day, because breast milk is not a source of this nutrient for infants, and sun exposure, whose compensating contribution cannot be adequately characterized, cannot in any case be recommended given the concerns about skin cancer. Special vigilance regarding supplementation is warranted for exclusively breast-fed, dark-skinned infants, who appear to be at higher risk than lighter-skinned infants.
Sunscreen absorbs ultraviolet light and prevents it from reaching the skin. It has been reported that sunscreen with a sun protection factor (SPF) of 8, based on the UVB spectrum, can decrease vitamin D synthetic capacity by 95 percent, whereas sunscreen with an SPF of 15 can reduce synthetic capacity by 98 percent (Matsuoka et al., 1987). The extent and frequency of sunscreen use are unknown, and therefore the significance of the role that sunscreen may play in reducing the opportunity to synthesize vitamin D is unclear. Increases in serum 25OHD levels seen in summer months in national surveys conducted in both the United States and Canada suggest either that sunscreen is not used consistently by the population as a whole or that the actual decrease in serum 25OHD level due to appropriate use of sunscreen has been overstated. One report indicated that there is adequate vitamin D production when the hands, face, arms, and legs are exposed to sunlight for about 25 percent of the time it would take to develop a “mild sunburn,” after which a sunscreen should be applied to prevent damage (Holick, 2003); this advice is inconsistent with that provided by the American Academy of Dermatology and the National Council on Skin Cancer Prevention for skin cancer protection, given the carcinogenic potential of UVB light, and it also contrasts with a recent report on mathematical models of observational data regarding the impact of seasonal sun exposure (Diffey, 2010). The effect of the use of sunscreen, as with other factors that may limit exposure, warrants vigilance. However, its use should not constitute a concern, given that the DRI values assume minimal sun exposure.
Increased urbanization and the norm among North Americans of working and recreating indoors cannot be quantified or addressed in terms of increased risk for vitamin D deficiency. The newly established DRI values assume minimal sun exposure, and therefore vitamin D intake need not be increased above this level for normal persons living in urban settings and spending time primarily indoors.
However, data for institutionalized, frail older persons suggest a propensity for lower serum 25OHD levels generally. Causation, however, is uncertain; many factors likely contribute, such as restriction to primarily indoor environments, often coupled with inadequate total intake overall. Further, aging skin is less effective in synthesizing vitamin D, in part because of a decrease in the skin's provitamin D (7-dehydrocholesterol) levels and in part because of alterations in skin morphology (MacLaughlin and Holick, 1985). The EAR and RDA values have taken this group into consideration to the extent possible and allowed by the data. Given the unknowns, monitoring institutionalized elderly people for vitamin D (and calcium) nutriture is appropriate. Supplementation, however, should not be undertaken randomly and without cause, because excess intakes of these nutrients may have adverse consequences for this frail sub-population.
Exclusion of dairy products occurs therapeutically in those with lactose intolerance or cow's milk allergy, and voluntarily in those who are vegans or non-lacto vegetarians. As noted in the recent National Institutes of Health (NIH) Consensus Statement on Lactose Intolerance (Brannon et al., 2010; Suchy et al., 2010), exclusion of dairy products, all of which are rich sources of calcium and some of which are fortified with vitamin D (e.g., fluid milks, some yogurts, and limited other dairy products [Yetley, 2008]), can be a risk factor for inadequate intakes of calcium and vitamin D. This is also true for vegans (Craig, 2009) and likely for others who systematically exclude dairy foods as well as other animal products from their diets. However, as pointed out by the American Dietetic Association and the Dietitians of Canada (American Dietetic Association and Dietitians of Canada, 2003; Craig and Mangels, 2009), appropriately planned vegetarian diets, including total vegetarian or vegan diets, are healthful and nutritionally adequate.
The North American prevalence of lactose intolerance, a clinical syndrome characterized by diarrhea, bloating, and/or flatulence following consumption of lactose, is challenging to determine because the parameters surrounding lactose intolerance, lactose malabsorption, and lactase non-persistence are not well defined, and frequent self-diagnosis occurs (Brannon et al., 2010; Suchy et al., 2010). The prevalence of cow's milk allergy reported in a systematic evidence review (Rona et al., 2007) was 0.6 to 0.9 percent by skin test, specific immunoglobulin E measurement, or food challenge test, lower than the self-reported prevalence of 3 percent. As with lactose intolerance, individuals may perceive that they have cow's milk allergy when they do not. With respect to vegetarians, in 2006, approximately 1.4 percent of U.S. adults and nearly 1 percent of children and adolescents 8 to 18 years of age self-reported that they were vegans, and 2.3 to 3 percent reported themselves to be vegetarians. In a 2002 survey, about 4 percent of Canadian adults reported being vegetarians (American Dietetic Association and Dietitians of Canada, 2003). Although there are few data documenting the consequences of poorly planned diets that exclude dairy or animal products—Craig (2009) reported a 30 percent increased risk of fracture for vegans—it is best to assume that persons who have chosen or must follow such diets should make special efforts to ensure nutritional adequacy.
Strategies for ensuring adequate intakes of calcium and vitamin D vary depending on the reason for dietary exclusion. Using an Agency for Healthcare Research and Quality systematic evidence review as a basis (Shaukat et al., 2010; Wilt et al., 2010), an NIH Consensus Panel found that individuals with lactose intolerance or lactose malabsorption are able to tolerate up to 12 g of lactose, the equivalent of one cup of milk, in a single dose, and may be able to tolerate larger amounts if these are consumed in smaller doses spread over the day and taken with other foods. Larger amounts of reduced-lactose dairy products, such as certain yogurts and fluid milks, as well as virtually unrestricted amounts of reduced-fat hard cheeses with very low amounts of lactose, may be ingested to ensure adequate calcium intakes. For those who avoid all dairy because of allergy or personal choice, consumption of non-dairy sources of calcium, such as low-oxalate vegetables (e.g., kale, bok choy, Chinese cabbage, broccoli, and collards), calcium-containing tofu, or fortified plant-based foods such as cereals or fruit juice, is a feasible strategy for ensuring adequate intakes of highly bioavailable calcium (Weaver et al., 1999). Finally, calcium supplements are also a strategy, although care should be taken not to over-supplement.
Meeting vitamin D needs is more challenging in the absence of sun exposure. Plant foods are not natural sources of vitamin D, but the marketplace in the United States is increasingly offering plant-based fortified alternatives such as cereals and juices. In addition, the Canadian food supply includes margarines fortified with vitamin D and plant-based beverages fortified with vitamin D and calcium. Such fortified foods can be helpful in meeting the DRIs across age groups. As with calcium, a dietary supplement of vitamin D is also an option, but total intake (foods plus supplements) should not exceed the Tolerable Upper Intake Level (UL).
Among the indigenous Canadian populations, switching from a traditional diet that contains vitamin D–rich foods to a westernized diet may increase the likelihood of vitamin D deficiency, especially if UVB exposure is limited or avoided. This has been underscored by a survey of Inuit living in Greenland, which reported that those consuming a westernized diet had lower serum 25OHD levels than those consuming a traditional diet (32 vs. 53 nmol/L in summer; 29 vs. 41 nmol/L in winter) (Rejnmark et al., 2004). As noted above, there is ample opportunity during the spring, summer, and fall months in the far north for the animals that commonly comprise the traditional diet of indigenous groups to form vitamin D and store it in liver and fat. In turn, the blubber and liver of various arctic marine mammals (e.g., seal, narwhal, beluga, walrus) and fish (e.g., char, cisco, lake trout, loche, sculpin, whitefish) are sources of vitamin D for those who consume a traditional diet (Keiver et al., 1988; Kenny et al., 2004; Brunborg et al., 2006; Kuhnlein et al., 2006).
The Canadian Health Measures Survey does not collect data on these indigenous populations living at upper northern latitudes, and overall dietary and health data for them are limited. One recent survey (Kuhnlein et al., 2008) in northern Canada found that vitamin D intakes differed by ethnic group. The median vitamin D intake was 200 IU/day in both Yukon First Nations and Dene/Métis. However, much higher median intakes were found in older (over age 40) Inuit who consumed a traditional diet (1,000 IU/day and 680 IU/day in men and women, respectively), whereas younger Inuit had much lower intakes (328 IU/day and 372 IU/day in men and women, respectively). This research group also surveyed indigenous women of reproductive age from various communities in the Canadian Arctic and found mean daily vitamin D intakes of 456 IU/day in Inuit from Qikiqtarjuaq, 364 IU/day in Inuit from 18 other communities, and 228 IU/day in a combined data set of Dene, Métis, and Yukon First Nations. Pregnant and lactating women had higher vitamin D intakes, with the highest mean intake being 816 IU/day in lactating Inuit from Qikiqtarjuaq (Berti et al., 2008). Neither of these surveys measured serum 25OHD levels.
A 1999 survey (Smith, 1999) estimated vitamin D intakes and measured serum 25OHD levels in 121 pregnant women living in the Inuvik region of the Northwest Territories. The sample included 33 whites, 51 Inuit, and 37 First Nations people. The investigator did not report whether the First Nations and Inuit mothers were consuming a traditional or a western diet; moreover, the accuracy of the measures of the vitamin D content of traditional foods is unclear. The estimated mean daily vitamin D intake of Inuit and First Nations people was 324 IU/day with supplements (136 IU/day without), compared with 532 IU/day with supplements (232 IU/day without) for whites. At delivery, plasma 25OHD levels were lower in the First Nations and Inuit mothers and their babies than in their white counterparts. Not quite as far north, a survey of 104 pregnant women from three First Nations communities in northern Manitoba found serum 25OHD levels ranging from < 15 nmol/L (undetectable) to 63 nmol/L, with mean values of 18, 21, and 24 nmol/L in the three communities (Smith, 1999). No information was provided in that report as to whether the women were consuming a traditional or western diet. A chart review of all babies born in 1993 and 1994 was conducted to determine how many had been diagnosed with rickets, and a high prevalence was found. Despite similar serum 25OHD levels in all three communities, there was a marked difference in the prevalence of rickets: 85/1,000 and 55/1,000 in two communities, but none in the third. No clear explanation for the differing prevalence was obtained by the investigator (Smith, 1999).
Taken as a whole, the limited data on indigenous Canadian populations suggest a basis for concern regarding vitamin D nutriture, most notably given the likelihood that typical diets are changing from traditional foods to more westernized foods. Although the assumption of minimal sun exposure underpinning the DRI values may not entirely align with this group, whose members may experience considerable sun exposure in the summer, ensuring that the diet meets the DRI values should provide assurance that the risk of vitamin D deficiency has been greatly reduced.
The forms and nature of calcium supplements have been discussed in Chapter 2, and their possible role in kidney stone formation, as well as the emerging data regarding possible adverse cardiovascular effects, have been outlined in Chapter 6. The mechanisms for the differential effects of food sources and supplement forms of calcium on kidney stone formation are complex and may relate to the timing of calcium administration. Approximately 80 percent of kidney stones contain calcium combined with oxalate or, less often, phosphate (Park and Pearle, 2007). Calcium in food, or in supplements taken with food, is believed to bind dietary oxalate in the digestive tract, reducing the absorption and subsequent urinary excretion of oxalate and thus the risk for kidney stones (urinary oxalate may be more critical than urinary calcium with respect to calcium oxalate crystallization) (Curhan et al., 1997). When calcium supplements are not taken with food, dietary oxalate is absorbed unopposed and thus is more available for stone formation. Although dairy foods, which are the major source of calcium in much of North America, have been suggested to contain an unidentified protective compound not found in supplements (Curhan et al., 1997), this possibility has not been well studied. Obtaining sufficient calcium from dietary sources is the preferred strategy, and it remains uncertain whether taking calcium supplements with food reduces the likelihood of stone formation associated with supplement use. Head-to-head comparisons of different calcium supplement formulations with respect to risk for kidney stone formation are also lacking. In any case, given the desirability of not surpassing the UL for calcium intake, and given that even those not meeting their calcium requirement are nonetheless consuming some calcium from dietary sources ranging from breads to dairy products, care must be taken to select a calcium supplement that, when combined with dietary intake, does not result in a total intake above the UL. The UL for a sizable proportion of the population, including groups that commonly consume calcium supplements, is 2,000 mg/day, which is relatively close to the EAR and RDA values. For these more vulnerable groups, supplements containing amounts less than the RDA may be appropriate, given that the diet is likely to contain at least some calcium. Further, until better information is available to clarify the possible link between supplement use and kidney stone formation, taking calcium supplements with food is advisable.
Moreover, for persons prone to developing kidney stones who cannot obtain adequate calcium from the diet (e.g., due to lactose intolerance), there is limited evidence from small, short-term trials suggesting that supplemental calcium in moderate doses may not increase the risk for stone recurrence (Levine et al., 1994; Williams et al., 2001; Lewandowski and Rodgers, 2004). Again, taking supplements with food is desirable.
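The arithmetic implied by this guidance is simple: the appropriate supplemental dose is bounded below by the gap between dietary intake and the RDA, and above by the headroom remaining under the UL. A minimal sketch follows, using the UL of 2,000 mg/day discussed in this section; the RDA and dietary intake figures are illustrative assumptions.

```python
# Minimal sketch of the supplement-selection arithmetic discussed above.
# The UL of 2,000 mg/day appears in this section; the RDA and dietary
# intake figures below are illustrative assumptions, not survey data.
UL = 2000   # mg/day, calcium UL for many adult groups per this section
RDA = 1200  # mg/day, assumed RDA for the group in question
diet = 900  # mg/day of calcium from food (hypothetical example)

gap = max(0, RDA - diet)  # supplemental calcium needed to reach the RDA
headroom = UL - diet      # supplemental calcium that keeps total intake below the UL

print(f"Supplement about {gap} mg/day; keep supplemental calcium under {headroom} mg/day")
```

For this hypothetical intake, a low-dose supplement of roughly 300 mg/day closes the gap, while anything approaching 1,100 mg/day would push total intake toward the UL, which is consistent with the preference for lower-dose supplements noted earlier in this chapter.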
The ULs are defined for the healthy, general population. Nonetheless, gray areas are acknowledged to exist between healthy people and those with medical conditions; for some persons in these gray areas, a calcium intake as high as the UL may no longer be considered free of risk. The effect of calcium intake in situations of hypercalciuria is not fully understood, but conditions leading to hypercalciuria (which may be exacerbated by adding extra vitamin D to an already high calcium intake) may warrant a more cautious approach to ULs for calcium in the future. In older adults experiencing illness or decline, hypercalciuria may develop. For pregnant women experiencing absorptive hypercalciuria, and therefore at higher risk of renal stone formation, keeping calcium intake below the UL may also be most appropriate. Similarly, as lactation drives bone resorption, urinary calcium excretion decreases, the ionized serum calcium concentration rises slightly, intravascular volume contracts, and, occasionally, women become hypercalcemic. Under these and similar conditions, ensuring a calcium intake below the UL may be most appropriate. Greater surveillance of urinary calcium excretion in future studies may shed more light on the relationship between higher levels of total calcium intake and the risk of hypercalciuria or hypercalcemia under special conditions.
The use of ethinyl estradiol oral contraceptives (OCs) has been hypothesized to reduce bone resorption and preserve bone density in premenopausal and postmenopausal women. This concept was based on clinical and observational evidence that estrogen-based hormone replacement therapy reduced the risk for osteoporosis in postmenopausal women (Zittermann, 2000). A non-systematic review of clinical trials carried out before 1994 indicated that the evidence at that time largely supported positive effects of OCs on bone density in postmenopausal women, although a number of trials in the review showed no effects (DeCherney, 1996). Among clinical trials and observational studies examining the effects of OCs on bone density over the past two decades, results have been mixed and, considered in total, are inconclusive. A systematic review of 75 studies of varied design, including 11 randomized controlled trials, examined outcomes of OC use and bone density in healthy premenopausal, amenorrheic premenopausal, anorexic premenopausal, and perimenopausal women (Liu and Lebrun, 2006). A meta-analysis was not done; however, the review found good evidence for a positive effect of OCs on bone density in perimenopausal women, fair evidence for an effect in amenorrheic premenopausal women, and limited evidence for an effect in anorexic and healthy premenopausal women.
Observational studies published since Liu and Lebrun (2006) also suggest mixed results on OC use and bone density, possibly related to the population group studied. A small study of OC use and bone density and bone size in a young white female cohort found that OC use had a significant negative effect on bone density at the spine and heel and resulted in a non-significant decrease in hip bone density (Ruffing et al., 2007). Similarly, Hartard et al. (2007), in a cross-sectional analysis of young white women taking OCs, also suggested a negative effect of OCs on bone density: women who had ever used OCs had significantly lower bone densities at the tibial shaft and femoral neck compared with those who had never used OCs. In contrast, another cross-sectional study of the effects of OCs on bone density and bone markers found no significant difference between OC users and never users among premenopausal and postmenopausal women (Allali et al., 2009).
Randomized trials of estrogen treatment with and without vitamin D and calcium supplementation suggest a positive effect on bone density in postmenopausal women. Recker et al. (1999) tested vitamin D and calcium supplementation with and without low-dose hormone replacement therapy for effectiveness in maintaining bone density in postmenopausal women more than 65 years of age. Although this study did not differentiate between hormone replacement therapy alone and therapy combined with vitamin D and calcium supplementation, it did suggest an increase in bone density and bone markers in older women who received the combination therapy compared with those who received vitamin D and calcium supplementation alone. A randomized, double-blind, placebo-controlled trial of OC therapy, either alone or combined with calcitriol, in postmenopausal women (ages 65 to 77 years) with normal bone density for their age found a significant increase in bone density and a reduction in bone resorption at the hip with the combined therapy compared with OC therapy alone (Gallagher et al., 2001). Another prospective randomized trial in postmenopausal women (ages 53 to 79 years) treated with hormone replacement therapy alone or with calcitriol also found a significant increase in bone density, at multiple sites and for the total body, with the combined therapy compared with hormone replacement alone (Gutteridge et al., 2003).
Given the variability in the study outcomes reviewed by the committee, and the unresolved question of how age and endogenous estrogen status affect the ability of OCs to preserve bone density or prevent bone resorption, specific recommendations addressing the impact of OCs, with or without vitamin D and calcium supplementation, for premenopausal and postmenopausal women cannot be offered at this time.
Premature infants are a clinical population and thus outside the scope of this committee's task, which is focused on the normal, healthy population. However, because premature infants are a highly vulnerable group and do raise special concerns relative to calcium and vitamin D nutriture, this group is discussed here briefly.
The minerals in human milk, especially calcium and phosphorus, do not fully meet the needs of rapidly growing premature infants, who rely primarily on passive intestinal absorption of calcium; thus, “this and other factors place premature infants at high risk for nutritional rickets” (Abrams, 2005). “The recent addition of various forms of mineral salts and/or mineral fortifiers to human milk and the use of specialized preterm infant formulas with high calcium content have been reported to enhance the amount of calcium and other minerals retained from the diet, to increase the bone mineral content of the infants and to decrease the incidence of osteopenia and frank rickets in preterm infants (Schanler et al., 1988; Schanler and Abrams, 1995; Schanler, 1998)... The bioavailability of the calcium in these fortifiers may be a key aspect of their adequacy. Using a commercially available human milk fortifier, Schanler and Abrams (1995) reported that net calcium retention was 104 ± 36 mg/kg body weight per day in premature infants, a value approximating the in utero accretion rate during the third trimester. These retention values are well above those achieved using earlier human milk fortifiers (Schanler et al., 1988)” (Abrams, 2005).
“Of interest is that calcium absorption from both fortified human milk and specialized preterm formula averages 50 to 65 percent in many studies (Abrams et al., 1991; Bronner et al., 1992). This constancy of absorptive fraction in premature infants suggests that much of the calcium absorption by premature infants and newborn full-term infants is not vitamin D dependent…” (Abrams, 2005), which is the conclusion of a review of more than 100 balance studies by Bronner et al. (1992).
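As a rough, purely illustrative check on how these two figures relate (neglecting endogenous fecal and urinary calcium losses, so the result is a lower bound), the reported net retention and absorptive fraction imply a calcium intake of roughly

\[
\text{intake} \gtrsim \frac{\text{net retention}}{\text{fractional absorption}} = \frac{104 \text{ mg/kg/day}}{0.50 \text{ to } 0.65} \approx 160 \text{ to } 210 \text{ mg/kg/day}.
\]

That is, achieving the third-trimester-like retention reported by Schanler and Abrams (1995) at a 50 to 65 percent absorptive fraction implies calcium intakes on the order of 160 to 210 mg/kg per day; the actual intake needed would be somewhat higher once endogenous losses are accounted for.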
How much vitamin D is needed by premature infants is more difficult to determine. Unfortunately, there are no studies using modern isotope techniques of the effects of vitamin D on calcium absorption in premature infants, nor would such studies be practical or ethical. One study with oral vitamin D intakes as low as 160 IU/day (Koo et al., 1995) and multiple studies with intakes of 200 to 400 IU/day (Cooke et al., 1990; Pittard et al., 1991; Backstrom et al., 1999a) “demonstrated adequate serum 25OHD concentrations and clinical outcomes with oral vitamin D intakes as low as 160 IU/day (Koo et al., 1995). In addition, studies have generally failed to show any clinical benefit to increasing vitamin D intake above 400 IU/day in preterm infants (Backstrom et al., 1999b)” (Abrams, 2005).
Routine measurement of serum 25OHD levels in premature infants is not supported by currently available clinical research. No studies have related serum 25OHD levels in these infants to specific clinical outcomes, and very few data suggest a dose–response relationship between serum 25OHD levels and other outcomes. No normal serum 25OHD level at different gestational or postnatal ages has been established based on end-points such as calcium absorption or bone mineral content. However, in the presence of a likely impairment of 25-hydroxylation, such as might occur in an infant with cholestasis, measurement of the serum 25OHD level might be considered, especially to ensure a level at or above 50 nmol/L (20 ng/mL). “The effects of other formula components on mineral absorption have also been considered. A study using a triple lumen perfusion technique demonstrated that calcium absorption was greater using a solution that included a glucose polymer rather than lactose (Stathos et al., 1996). As glucose polymers are widely used in preterm formulas, this effect may be clinically important. Altering the fat blend of infant formula to more closely resemble that of human milk may also enhance mineral absorption in premature infants (Carnielli et al., 1995; Lucas et al., 1997)” (Abrams, 2005).
Although clinical practice and related guidelines are outside this committee's purview, it is useful to acknowledge that measures of the various forms of vitamin D can be affected by prescription drugs and related medications. A brief listing of key interactions can be found in Table 8-1.
Table 8-1. Drugs and Their Effect on Vitamin D Metabolism.
“New methodologies—many from other fields of study—are emerging and can be useful for examining and approximating dose–response relationships when available data are limited. These should be more closely examined and incorporated into the DRI process as appropriate” (Taylor, 2008).
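To make the idea of approximating a dose–response relationship from limited data concrete, the sketch below fits a simple saturating curve to a handful of points and then inverts it to estimate the dose associated with a target serum level. It is purely illustrative: the intake and serum 25OHD values are invented placeholders, the saturating functional form is one of many possible choices, and nothing here reproduces the committee's actual modeling.

# Illustrative sketch only: approximating a dose-response relationship from
# sparse data. The intake/serum values are HYPOTHETICAL placeholders, not
# data from this report or the studies it cites.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical vitamin D intakes (IU/day) and serum 25OHD responses (nmol/L).
intake = np.array([0.0, 200.0, 400.0, 800.0, 1600.0])
serum = np.array([30.0, 45.0, 55.0, 63.0, 70.0])

def saturating(dose, baseline, smax, half_dose):
    """Baseline serum level plus a saturating increment: the increment
    approaches smax as dose grows; half_dose yields half of smax."""
    return baseline + smax * dose / (half_dose + dose)

# Fit the three parameters to the five (dose, response) points.
params, _ = curve_fit(saturating, intake, serum, p0=[30.0, 50.0, 400.0])
baseline, smax, half_dose = params

# Invert the fitted curve to estimate the dose reaching a target serum level
# (50 nmol/L is the bone-health sufficiency level cited in this report).
target = 50.0
rise = target - baseline
dose_needed = half_dose * rise / (smax - rise)

print(f"baseline={baseline:.1f} nmol/L, smax={smax:.1f} nmol/L, "
      f"half_dose={half_dose:.0f} IU/day")
print(f"hypothetical dose reaching {target:.0f} nmol/L: {dose_needed:.0f} IU/day")

With richer data, one would also propagate the parameter covariance returned by curve_fit into an uncertainty band around the fitted curve; the point here is only the mechanics of fitting and inverting a dose–response function when observations are few.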
“There is considerable interest—as well as more than 10 years of experience—surrounding the inclusion of chronic disease indicators within DRI development. A variety of perspectives were put forward. There is a need for focused discussions about how to include chronic disease indicators in the DRI process, including specific approaches for addressing their confounders, identification of appropriate biomarkers, and quantifying their effects” (Taylor, 2008).
“There is broad interest in addressing the AIs as a component of the DRI values, but no clear path has emerged in terms of clarifying, adapting or eliminating AIs. Nor is there agreement about directions to be taken in the future for AI development” (Taylor, 2008).
Some algal supplements and mushrooms that have been processed with irradiation contain vitamin D, but not in significant amounts. Available online at http://ods
First Nation: A term that came into common usage in the 1970s to replace the word “Indian.” Among its uses, the term “First Nations peoples” refers to the Indian peoples in Canada, both Status and non-Status. Definitions available online at http://www