Observations that dual-energy x-ray absorptiometry (DXA) measures of areal bone mineral density cannot completely explain fracture incidence after anti-resorptive treatment have led to renewed interest in bone quality. Bone quality is a vague term, but generally refers to the effects of skeletal factors that contribute to bone strength but are not accounted for by measures of bone mass. Since a clinical fracture is ultimately a mechanical event, it follows then that any clinically relevant modification of bone quality must change bone biomechanical performance relative to bone mass. In this perspective, we discuss a framework for assessing the clinically relevant effects of bone quality based on two general concepts: 1) the biomechanical effects of bone quality can be quantified from analysis of the relationship between bone mechanical performance and bone density; and 2) because of its hierarchical nature, biomechanical testing of bone at different physical scales (<1 mm, 1 mm, 1 cm, etc.) can be used to isolate the scale at which the most clinically relevant changes in bone quality occur. As an example, we review data regarding the relationship between strength and density in excised specimens of trabecular bone and highlight the fact that it is not yet clear how this relationship changes during aging, osteoporosis development, and anti-resorptive treatment. Further study of new and existing data using this framework should provide insight into the role of bone quality in osteoporotic fracture risk.
It is well established that bone strength is determined by a combination of bone size, shape, and material properties (Figure 1). Measures of bone mass and density, such as dual-energy x-ray absorptiometry (DXA) measures of bone mineral content (BMC, g) and areal bone mineral density (aBMD, g/cm2), explain a substantial portion of the effects of bone size, shape and material properties and are strongly correlated with bone mechanical performance and fracture risk [2, 3]. However, these measures do not completely explain fracture incidence. It has been reported that over half of those who experience fragility fractures do not have aBMD t-scores below the threshold used to identify osteoporosis. It has also become clear that the modest average increases in aBMD of 5–8% caused by anti-resorptive treatments cannot explain the associated 50–60% reductions in fracture incidence [5–7]. As a result, there has been increased interest in aspects of bone size, shape and material properties that influence bone’s ability to resist fracture but are not explained by aBMD. The term “bone quality” is commonly used in relation to these characteristics and their effects. A number of characteristics of bone have been implicated as important aspects of bone quality [8–10] (Table 1), leading to a proliferation of studies seeking to determine how these characteristics change during aging, disease development and treatment. However, because these characteristics are often related to each other or to bone mass, it is not always clear how, or whether, they influence whole bone mechanical properties and fracture. Thus, there remains a need to assess changes in bone quality in a more clinically meaningful manner.
Although the term “bone quality” has been used in the literature for more than 15 years, its meaning remains vague and elusive [11, 12]. What is clear is that if bone quality is to be important in determining fracture risk it must play a role in determining bone mechanical properties [1, 13, 14]. The term “bone quality” is used in two ways in the literature. In one usage, bone quality represents the sum of all characteristics of bone that affect the ability of bone to resist fracture (i.e. all aspects of bone size, shape, and material properties) [9, 10]. In another usage, bone quality refers to the influence of factors that affect fracture but are not accounted for by bone mass or quantity (Figure 1) [8, 12]. Given the clinical interest in bone quality, we will use the latter of these two definitions of the term, although we emphasize that there is no current consensus in the field. Regardless of one’s preference as to a general definition of bone quality, bone quality remains a skeletal trait and therefore cannot account for any non-skeletal factors that might also contribute to fracture incidence such as risk of falling or limitations of commonly used measurements of bone mass. It has been suggested, for example, that limitations of DXA measurements (repeatability, inability to differentiate cortical and trabecular bone, inaccuracies due to local soft tissue, etc.) are partially responsible for the discrepancies between treatment-induced changes in aBMD and fracture incidence. The contributions of such inaccuracies do not represent bone quality because they are not inherent to the bone and would not be observed using other measures of bone mass. Thus, while discrepancies between changes in aBMD and fracture risk have formed the clinical motivation for the study of bone quality, that is not to say that bone quality as a concept should be defined in terms of DXA measures of areal BMD or any other specific measures of bone mass.
Rather than attempt to resolve the controversy of precisely defining bone quality, it may be more relevant to focus on quantifying the biomechanical effects of changes in bone quality. If differences in bone quality are to account for a portion of bone fragility, as shown in Figure 1, then bone quality must influence bone mechanical properties in ways that are not accounted for by bone mass. Since a clinical fracture is ultimately a biomechanical event, it follows then that any clinically relevant modification of bone quality must change bone biomechanical performance relative to bone mass. This is a key, but often overlooked biomechanical consequence of changes in bone quality.
Evaluation of relations between biomechanical performance and bone mass will, of course, depend on the nature of the specific measures of bone biomechanical performance and bone mass that are used. With regard to measures of biomechanical performance, there are a number of different assays that can be used to indicate bone fragility, including bone stiffness, strength, toughness, post-yield deformation, fatigue, and creep properties. In addition, these assays can be performed under a number of different loading conditions such as compression, tension, shear, or bending, alone or in combination, and can be applied either cyclically or monotonically, short- or long-term, and at different loading rates. At present, it is not yet clear which of these assays or loading modes is most closely related to fracture incidence, although strength is most intuitive since it relates directly to the force capacity of a bone for a single event.
Although bone mass is a combination of bone size, shape and tissue material properties, it has been common practice to normalize bone mass by bone size and report measures of bone density. Evaluation of bone density rather than mass removes some of the effects of whole bone size on bone fragility, but has nevertheless been useful clinically because of relationships between bone size, body weight and typical mechanical loads. Bone density is expressed in a number of different ways, including areal bone mineral density from DXA and volumetric bone mineral density from QCT (commonly measured non-invasively) as well as ash density, apparent density and tissue density or degree of mineralization (commonly measured directly in excised bone specimens). Since these measures of density differ in terms of units and measurement accuracy, they are not all equivalent. For example, clinical aBMD measures are not true measures of density as they are normalized by area instead of volume and can be biased by bone size and orientation. Tissue degree of mineralization is likewise not a true measure of density, as it expresses mineral mass relative to tissue mass. Another commonly used measure related to bone density and mass is bone volume fraction (BV/TV). Bone volume fraction is directly proportional to apparent density and can be used as a surrogate measure of apparent density if one assumes variations in tissue density are small. Assumptions made when using a specific measure are important to keep in mind because bone density measures are often limited by the circumstances of a study, a fact that can limit resulting conclusions made regarding bone quality. In the remainder of our discussion we will concentrate on evaluations of bone mechanical performance relative to density because density measurements are most common clinically and experimentally.
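The proportionality between bone volume fraction and apparent density can be sketched as follows. This is a minimal illustration with helper names of our own choosing; the nominal tissue density of 2.0 g/cm^3 is an assumption for illustration, since real tissue density varies with degree of mineralization.

```python
# Sketch: apparent density as a surrogate for bone volume fraction.
# TISSUE_DENSITY is an illustrative assumption, not a measured value.

TISSUE_DENSITY = 2.0  # g/cm^3, assumed constant across specimens

def apparent_density(bv_tv, tissue_density=TISSUE_DENSITY):
    """Apparent density (g/cm^3) = bone volume fraction * tissue density,
    neglecting the mass of marrow within the total specimen volume."""
    return bv_tv * tissue_density

def bone_volume_fraction(rho_app, tissue_density=TISSUE_DENSITY):
    """Invert the relation to recover BV/TV from apparent density."""
    return rho_app / tissue_density

# Example: a specimen with BV/TV = 0.10 has apparent density ~0.20 g/cm^3
rho = apparent_density(0.10)
```

The assumption of constant tissue density is exactly the caveat noted above: if degree of mineralization differs between groups, BV/TV and apparent density are no longer interchangeable.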
A number of approaches can be used to analyze bone biomechanical performance relative to bone density. One approach is to examine the relationship between measures of bone biomechanical performance and bone density. As an example, consider a hypothetical study comparing two treatments that increase bone strength as compared to an untreated control (Figure 2A). Compared to the untreated bone (the solid line), bone exposed to Treatment 1 (the dashed line) shows increased bone strength for any given value of density. We would interpret this to indicate a difference in bone quality. By contrast, bone from the group exposed to Treatment 2 (the dotted line) displays a similar strength-density relationship as in the untreated group. Thus, although Treatment 2 has increased bone strength just as much as Treatment 1, we would conclude that it has not altered bone quality in a clinically relevant fashion. Although we have illustrated this concept using a linear relationship between strength and density, such comparisons are also valid for non-linear relationships, as proposed previously for interpretation of the effects of sodium fluoride treatment.
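The hypothetical comparison above can be sketched numerically: fit a separate strength-density line to each group and compare the fitted lines. All specimen data below are hypothetical, constructed only to mimic the scenario of Figure 2A.

```python
# Sketch of the first approach: per-group linear strength-density fits.
# Data are hypothetical; in practice the comparison of regressions
# would be done with appropriate statistics (e.g. ANCOVA).

def fit_line(density, strength):
    """Ordinary least-squares fit: strength = intercept + slope * density."""
    n = len(density)
    mx = sum(density) / n
    my = sum(strength) / n
    sxx = sum((x - mx) ** 2 for x in density)
    sxy = sum((x - mx) * (y - my) for x, y in zip(density, strength))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

# Hypothetical specimens: (apparent densities g/cm^3, strengths MPa)
control   = ([0.10, 0.15, 0.20, 0.25], [1.0, 1.5, 2.0, 2.5])
treated_1 = ([0.12, 0.17, 0.22, 0.27], [1.8, 2.5, 3.2, 3.9])  # steeper line
treated_2 = ([0.15, 0.20, 0.25, 0.30], [1.5, 2.0, 2.5, 3.0])  # same line, higher density

b0_c, b1_c = fit_line(*control)
b0_t1, b1_t1 = fit_line(*treated_1)
b0_t2, b1_t2 = fit_line(*treated_2)

# Treatment 1 changes the strength-density relationship (altered quality);
# Treatment 2 increases strength only by increasing density (same line).
```

Treatment 2's fitted line coincides with the control line, so its strength gain is fully explained by density, mirroring the interpretation given in the text.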
A second approach is to normalize measures of mechanical performance by bone mass or density on a per-specimen basis, for example, calculating a ratio of strength to density for each individual specimen. Ratios between mechanical properties and density are frequently used in engineering to identify the most efficient materials and structures for design. For example, commonly used steel alloys are much stronger than aluminum alloys, but the ratio of strength to density for aluminum alloys is greater, which is one reason why aluminum alloys have traditionally been more common in aircraft construction. The concept also applies to structures, in which a lighter structure is considered more structurally efficient than a heavier structure having similar strength. The arrangement of material within a structure may also contribute to such structural efficiency, e.g. beams with an I-shaped cross-section are widely used because of their great structural efficiency compared to rectangular cross-sections. By analogy, bones with higher values of the strength:density ratio are more biomechanically efficient. Intuitively, such bone would be considered to be of better quality than less structurally efficient bone. In our example (Figure 2B), bone exposed to Treatment 1 shows an increased strength:density ratio as compared to the two other groups, suggesting that it is different in terms of bone quality.
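The per-specimen ratio approach can be sketched in the same way. The numbers below are hypothetical, chosen only to show a treated group with higher biomechanical efficiency.

```python
# Sketch of the second approach: a per-specimen strength:density ratio,
# analogous to the specific strength used in engineering materials
# selection. All values are hypothetical.

def strength_density_ratios(density, strength):
    """Return the strength:density ratio for each specimen."""
    return [s / d for d, s in zip(density, strength)]

def mean(values):
    return sum(values) / len(values)

control_ratios = strength_density_ratios([0.10, 0.20], [1.0, 2.0])
treated_ratios = strength_density_ratios([0.12, 0.22], [1.8, 3.2])

# A higher mean ratio in the treated group would suggest more
# biomechanically efficient (higher quality) bone.
```

Because the ratio is computed per specimen, groups can be compared with simple univariate statistics rather than regression comparisons.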
These two approaches to evaluating bone biomechanical performance relative to bone density are not mutually exclusive. If the relationship between bone biomechanical performance and bone density is linear with a non-zero intercept, or is non-linear, then the ratios of biomechanical performance to density among groups at opposite ends of the density range can differ even when the data follow the same underlying relationship. We therefore consider examination of the relationship between biomechanical performance and mass to be a more general method of detecting differences in bone quality, as it can be used in any situation. However, comparisons between regression models can be difficult to achieve statistically and often require large sample sizes. The ratio of bone mechanical performance to bone density is much simpler to compare between individuals and groups and will yield similar conclusions when compared over similar skeletal regions.
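The non-zero-intercept caveat can be made concrete with a short worked example. The intercept and slope below are hypothetical coefficients, not fitted values from any study.

```python
# Worked illustration of why the two approaches can disagree: with a
# common linear relationship strength = a + b * density and a non-zero
# intercept a, the ratio strength/density = a/density + b varies with
# density even though every specimen lies on the same line.

a, b = 0.5, 8.0  # hypothetical intercept (MPa) and slope (MPa per g/cm^3)

def strength(density):
    return a + b * density

low, high = 0.10, 0.30  # specimens at opposite ends of the density range
ratio_low = strength(low) / low     # ~13.0
ratio_high = strength(high) / high  # ~9.7

# Same underlying relationship, different ratios: the regression-based
# comparison is therefore the more general test of bone quality.
```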
In proposing these two approaches to evaluating the biomechanical effects of bone quality we have not specified what type of bone specimen is being studied. This is because the two approaches can be applied to any bone specimen at any physical scale feasible for mechanical testing. Obvious examples would be whole bones, excised specimens of cortical bone or trabecular bone, or even microscopic specimens such as individual osteons or trabeculae.
The sheer number and range in scale of proposed aspects of bone quality (Table 1) presents a challenge because rarely is one characteristic changed in isolation, and some characteristics are themselves associated with bone density. However, the fact that bone is a hierarchical structure (Figure 3) can be quite useful for reducing the number of characteristics that must be considered when assessing the biomechanical effects of bone quality. Because bone is hierarchical, its biomechanical performance at a specific physical scale represents the net influence of all factors acting at lower physical scales. For example, if one performs biomechanical tests at a particular physical scale and no differences in bone quality are detected (using the above methods), one can conclude that there are no net effects on bone quality originating at lower scales – either because the lower scale characteristics of bone do not appreciably influence biomechanical performance or because their effects are counteracted by compensatory mechanisms. Furthermore, by performing tests at different scales, it becomes possible to isolate the physical scale at which the most clinically relevant changes in bone quality originate. For example, if testing of whole bones suggests that a treatment changes bone quality yet testing of excised trabecular and cortical bone specimens at the scale of 5–8 mm does not concur, then one can conclude that the clinically relevant changes in bone quality originate at a larger scale than 5–8 mm, implicating changes in internal organization and whole bone morphology (Figure 3). If instead biomechanical testing of 5–8 mm samples did imply changes in bone quality, then we would conclude that at least some of the clinically relevant changes in whole bone quality must originate at that scale or below, implicating such potential factors as microarchitecture; degree, type, and distribution of mineralization; and collagen biochemistry.
What we have presented so far is a general framework for quantifying the biomechanical effects of clinically relevant changes in bone quality and a strategy for identifying the physical scale at which such changes originate. This framework should prove insightful when applied to animal and cadaver studies, and could also be applied to clinical studies if appropriately validated non-invasive measures of bone biomechanics and density or mass are used. Such analyses can be performed retrospectively on pre-existing data that have not yet been analyzed according to this framework. With that in mind, we now illustrate the use of this framework by revisiting previously reported studies focusing on excised specimens of trabecular bone at the 5–8 mm scale.
The biomechanical performance of excised samples of trabecular bone on the order of 5–8 mm in smallest dimension has been studied for some time [20–22], and reflects the net effects of differences in microarchitecture, bone volume fraction, and tissue material properties (Figure 3). Here we discuss strength and density as specific measures of bone biomechanical performance and mass, respectively. A comparison of healthy trabecular bone in different regions of the human skeleton suggests that there are substantial variations in trabecular bone compressive strength relative to apparent density (Figures 4 and 5) [23–25]. For example, compared to trabecular bone from the vertebral body, trabecular bone from the proximal tibia is, on average, denser and stronger (Figure 4 A,B). This of itself is not indicative of a difference in bone quality. However, the relationship between strength and density in the proximal tibia has a greater slope (p < 0.01) and the strength:density ratio is also greater than that in the vertebral body (Figure 5). This indicates that bone from the proximal tibia is much more efficient at resisting loads and therefore has improved bone quality (as evaluated by bone strength). Other differences in strength-density characteristics exist among other regions of the skeleton (Figure 5).
While a number of factors may cause these variations in bone quality across skeletal regions (Table 1), the fact that the strength:density ratio tends to be positively correlated with density (Figure 4 C-D) suggests that these differences in bone quality may be a result of interactions between bone density and other characteristics of bone. One possible explanation is microarchitecture. Micro-mechanical analyses of trabecular bone have demonstrated that plate-type trabecular bone is much more mechanically efficient than rod-type trabecular bone [26–28], which would explain why the strength:density ratio is higher for the human proximal tibial bone than the human greater trochanter although the densities in these two sites are similar (see Figure 5). In addition, at lower densities, changes in trabecular failure mechanisms associated with thinning and loss of trabeculae [29–31] — from microdamage and yielding to non-linear deformation effects such as buckling and excessive bending — may also contribute to the observed differences in biomechanical performance relative to density. Another possibility is variations in tissue material properties. Variations in tissue material properties (often associated with degree of mineralization) have been shown to influence trabecular bone biomechanics [18, 32–35] and have been noted in humans and rodents.
With regard to age-, disease-, or treatment-related changes in trabecular bone there are very few data examining trabecular bone mechanical performance relative to density. Regarding aging, data from distal femoral trabecular bone show a linear relationship between compressive strength and apparent density in a cohort that varied greatly in age (20–102 years) (Figure 4, distal femur data). Similar linear relationships between strength and apparent density were observed in both males and females for this cohort even though the study presumably included both pre- and postmenopausal women. This suggests that the large increase in whole body bone turnover experienced by females at menopause [37–39] may not result in clinically relevant changes in trabecular bone quality, at least in the distal femur. Data from vertebral trabecular bone suggest that the ratio of compressive strength to apparent density does decline during aging. Analysis of data from Mosekilde and colleagues as well as from our laboratory shows a significant decline in the strength:density ratio with age (Figure 6). As yet, the causes of this trend are not well understood, but they are likely associated with the lower density of vertebral trabecular bone and its propensity to undergo large-deformation failure mechanisms (such as excessive bending or buckling) that would not occur in higher density bone. Clearly, more data are required to address this important issue and its specific causes.
Regarding the effects of osteoporosis, we are aware of only one study that directly addressed differences in trabecular bone mechanical properties relative to density in healthy and osteoporotic individuals. That study, which compared retrieved specimens from the femoral heads of patients with hip fractures against a control group, did not find a difference in the strength-bone volume fraction relation but did find differences in the elastic modulus-bone volume fraction relation. A subsequent analysis using micro-CT-based finite element models of the specimens concluded that the main changes in the elastic behavior (strength behavior was not analyzed) were in the transverse properties of the bone, not those along the main habitual loading direction. While further study is required to better explain these intriguing findings they indicate the need to investigate bone biomechanical properties and bone quality not only in the main habitual loading direction, but also along directions and loading modes associated with falls and trauma.
The effects of drug treatment on the relationship between mechanical performance and density in excised (5–8 mm sized) specimens of trabecular bone are not well understood, again due to a lack of data. One reason for the limited data is that specimens of trabecular bone of this size cannot be obtained from small animals, limiting analysis to larger animals (dogs, mini-pigs, sheep, primates). Even then, the relatively high bone volume fraction in most of these animals compared to humans presents a confounding factor in interpretation of the results since, as discussed above, changes in bone quality associated with the micro-architecture may well depend on the initial density of the bone. Biomechanical testing of iliac crest biopsies is another possibility for analysis, but since the ilium is not a common site of fragility fracture, the fact that bone quality can vary between sites (Figures 4 and 5) raises questions about how well changes in bone quality of iliac crest biopsies are related to changes in clinical fracture sites.
Consistent with the clinical experience from treatment with sodium fluoride [45–47], it has been observed in large animal models that the relationship between trabecular bone strength and apparent density is compromised by sodium fluoride treatment [16, 48, 49]. Although a number of studies have looked at the effects of bisphosphonates on trabecular bone biomechanics in large animals [50–57], we could find only two that looked specifically for differences in mechanical properties relative to density [49, 58]. Neither of the studies observed significant changes in the relationships as a result of treatment. Although these studies are not conclusive due to the small sample sizes, they do not support the idea that alendronate (the bisphosphonate used) causes clinically relevant changes in bone quality at a scale of 5 mm or less (as measured by monotonic strength or elastic modulus relative to apparent density). Recent computational work from our laboratory has found that the relationship between strength and bone volume fraction in canine vertebrae is not appreciably modified by risedronate treatment-induced changes in microarchitecture.
An alternative approach to assessing the biomechanical effects of microarchitecture on bone quality has been to use multiple regression analysis in which bone volume fraction and a variety of microarchitecture parameters are treated as explanatory variables. This approach, while simple to implement, is confounded by the correlations between micro-architecture parameters and bone volume fraction [66–68]. Recent studies in large animals investigating the effects of bisphosphonates (alendronate, ibandronate, and risedronate) on microarchitecture [53, 55, 57] have found that some currently used microarchitectural measures do indeed improve predictions of mechanical properties beyond what can be achieved using bone mass, density or volume fraction alone. None of these studies directly compared the strength-density relationships between treated and untreated groups, however. Furthermore, since the studies do not agree on a microarchitectural parameter that both changes in response to treatment and contributes to the prediction of mechanical properties, the findings are difficult to interpret with regard to the mechanisms behind discrepancies between aBMD and fracture risk during anti-resorptive therapy.
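The confounding described above can be illustrated with a toy calculation. The BV/TV and trabecular thickness (Tb.Th) values below are hypothetical, constructed only to show the kind of strong correlation that makes multiple-regression coefficients hard to attribute.

```python
# Sketch of the confounding problem: microarchitectural parameters are
# often strongly correlated with bone volume fraction, so their separate
# contributions in a multiple regression are difficult to attribute.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

bv_tv = [0.08, 0.12, 0.15, 0.20, 0.25]          # hypothetical specimens
tb_th = [0.10, 0.13, 0.14, 0.17, 0.21]          # mm; tracks BV/TV closely

r = pearson_r(bv_tv, tb_th)
# With r near 1, a "significant" Tb.Th coefficient in a multiple
# regression may simply restate the effect of bone volume fraction.
```

This collinearity is one reason the studies cited above are difficult to reconcile: which microarchitectural parameter "adds" predictive power depends heavily on what else is in the model.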
Given the small amount of data and the controversy over the causes of discrepancies between aBMD and fracture risk, there is a critical need for more comprehensive analyses of changes in biomechanical performance relative to bone density during anti-resorptive therapy. In particular, studies at the scale of 5–8 mm could be particularly useful for testing the idea that changes in bone microarchitecture and/or tissue material properties are responsible for any clinically relevant changes in bone biomechanical performance relative to density. For example, if it turns out that a particular treatment does not change bone quality at the scale of 5–8 mm, then attention can be focused on analysis of bone quality at higher physical scales, which should be feasible using current radiological techniques combined with finite element analysis [44, 60–65] or with whole bone mechanical testing. In this way a more complete picture of how characteristics of bone might explain discrepancies between aBMD and fracture incidence can be achieved.
Since a clinical fracture is ultimately a biomechanical event, any clinically relevant modification of bone quality must change bone biomechanical performance relative to bone mass. Here we have discussed a framework for quantifying the biomechanical effects of bone quality based on two general concepts: 1) the biomechanical effects of bone quality can be quantified from analysis of the relationship between bone biomechanical performance and bone density; and 2) because of its hierarchical nature, biomechanical testing of bone at different physical scales (<1 mm, 1 mm, 1 cm, etc.) can isolate the scale at which the most clinically relevant changes in bone quality occur. Analysis of existing data from our laboratory as well as others’ revealed that it is still not clear whether there are changes in bone biomechanical performance relative to bone density with aging, osteoporosis, or treatment with anti-resorptive agents. We suggest that use of the framework presented here, which represents well-established principles of bone biomechanics, will provide new insight into the conditions and mechanisms through which aspects of bone quality influence fracture.
This work was supported by NIH grants AR49828, AR43784. The authors thank Tony S. Keller for providing data from one of his studies.
Conflict of Interest Statement: Dr. Keaveny has a financial interest in O.N. Diagnostics and both he and the company may benefit from the results of this research. Dr. Hernandez has no potential conflicts of interest.