Press-fit acetabular components are susceptible to deformation in an underreamed socket, with excessive deformation of metal-on-metal (MOM) components potentially leading to increased torsional friction and micromotion. However, it remains unclear how cup diameter, design, and time from implantation affect shell deformation.
We asked whether (1) changes in component geometry and material altered maximum shell deformation and (2) time-dependent deformational relaxation processes occurred.
Diametral deformation was quantified after press-fit implantation of metal shells into a previously validated polyurethane model. Experimental groups (n = 6–8) consisted of 48-, 54-, 60-, and 66-mm MOM cups of 6-mm wall thickness, 58-mm cups of 10-mm wall thickness, and CoCrMo and Ti6Al4V 58-mm modular cups.
Greater cup diameter, thinner wall construction, and Ti6Al4V modular designs generated conditions for maximum shell deformation ranging from 0.047 to 0.267 mm. Relaxation (18%–32%) was observed 120 hours postimplantation in thin-walled and modular designs.
Our findings demonstrate a reduction of shell deformation over time and suggest, under physiologic loading, early component deformation varies with design.
Component deformation should be a design consideration regardless of bearing surface. Designs neglecting to adequately address deformational changes in vivo could be susceptible to diminished cup survival, increased wear, and premature revision.
Thin-walled acetabular cup designs are advantageous in that they offer the ability to preserve bone stock while accommodating larger femoral head sizes for increased ROM, lower volumetric wear rates [1, 9], and reduced likelihood of femoral head dislocation [2, 12, 15]. Thin-walled press-fit acetabular cups, however, are susceptible to substantial deformation after implantation. Clinical and cadaveric investigations have described component pinching between the ischial and ilial columns after press-fit implantation of thin-walled acetabular cups [5, 16, 17], resulting in deformation of the acetabular cup in underreamed specimens. This deformation could be accompanied by increased micromotion at the bone-implant interface or lead to an increase in wear due to induced changes in component sphericity, degradation of fluid-film lubrication, or induced equatorial contact [5, 6, 8, 11, 14, 17].
The role of diametral clearance in THA lies in the establishment of elastohydrodynamic lubrication between the femoral head and acetabular bearing surface in metal-on-metal (MOM) device designs [3, 13, 18]. In a tribologic study of MOM performance in a hip simulator, Dowson et al. recommended femoral heads be as large as possible with minimal diametral clearance to establish a scenario of mixed-film lubrication. Similarly, in an in vitro study, Rieker et al. showed maintaining a lower diametral clearance produced lower amounts of run-in wear compared with cups having relatively high diametral clearances. However, clearance that is too tight, paired with large component deformation from press-fit implantation and gait loading, could potentially increase frictional torque, resulting in component loosening.
In a clinical study, Squire et al. described diametral deformation as great as 0.570 mm in Dorr Type A bone stock implanted with 50- to 60-mm-diameter modular cups while noting increased component stiffness with increasing cup diameter in that design. Other studies [4, 5, 16] have observed deformations between 0.015 and 0.300 mm among varied designs under controlled pinch loads up to 1500 N, with many designs exhibiting deformation exceeding manufacturer-reported diametral clearances in larger-diameter cups. Recent studies of modular designs have utilized a cadaverically validated foam model, reporting deformations up to 0.280 mm after liner seating, leading to increased frictional torques in metal-on-polyethylene articulations and up to 5.5 times greater volumetric wear in finite element models. Understanding the overall effect of implantation on acetabular shell deformation is critical to the prevention of such deformation-related clinical complications as increased wear, loosening, and premature failure.
We therefore asked, within the validated model, whether (1) a change in component size, wall thickness, or material resulted in altered maximum shell deformation of MOM and metal-backed modular cups and (2) time-dependent deformational relaxation processes occurred.
We used solid rigid polyurethane biomechanical testing blocks (Pacific Research Laboratories, Inc, Vashon, WA, USA) as the testing medium for the implantation and deformation of press-fit acetabular shells. As previously established in the validation study by Jin et al., a two-point relief geometry was manufactured from a 0.48-g/cm3 polyurethane foam to replicate a worst-case scenario of component pinching (Fig. 1). Four acetabular component designs were included in this study: two distinct nonmodular CoCrMo MOM cups and two distinct modular Ti6Al4V shells (all manufactured by Biomet, Inc, Warsaw, IN, USA) (Table 1). Within the MOM experimental groups, thin-walled, hemispherical MOM shells with a constant 6-mm wall thickness were tested with nominal outer diameters of 48, 54, 60, and 66 mm, in addition to a thick-walled MOM 58-mm-diameter nonmodular cup design with a 10-mm wall thickness throughout. Furthermore, two designs of 58-mm Ti6Al4V modular cups were also tested, one having a porous titanium ingrowth surface and the other integrating a porous plasma spray coating. Six shells were tested in each size cohort of the thin-walled MOM design, while eight shells were tested in each of the other 58-mm cup designs. Thus, a total of 48 shells were implanted.
Using an expected SD of 0.010 mm and an expected cup count of n = 6/group, our study was adequately powered (with power = 0.80, two-sided alpha = 0.05, and beta [probability of Type II error] = 0.20) to detect differences of 0.018 mm between means. The expected SD takes into account a manufacturer specification of ± 0.003 mm uncertainty in digital image correlation (DIC) displacement measurement in this field of view. Furthermore, a threshold of 0.018 mm was considered the targeted difference of means, as it reflected a change similar to the total deformation in the smallest M2a-Magnum™ cups (Biomet, Inc) observed by Springer et al. at 400-N compression, the approximate average compressive force after press-fit implantation, as measured in vivo by Squire et al.
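The stated power calculation can be checked with a noncentral-t sketch. This is an illustration based only on the parameters reported above (SD = 0.010 mm, n = 6/group, two-sided alpha = 0.05, difference = 0.018 mm), not the authors' actual statistical software:

```python
import numpy as np
from scipy import stats

def two_sample_power(delta, sd, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample t-test with equal
    group sizes, using the noncentral t distribution."""
    df = 2 * (n_per_group - 1)
    ncp = (delta / sd) * np.sqrt(n_per_group / 2.0)   # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)           # rejection threshold
    # power = P(|T'| > t_crit) under the noncentral t distribution
    return (1 - stats.nct.cdf(t_crit, df, ncp)) + stats.nct.cdf(-t_crit, df, ncp)

# parameters reported in the study: detect a 0.018-mm difference
# with SD = 0.010 mm and n = 6 cups per group
power = two_sample_power(delta=0.018, sd=0.010, n_per_group=6)
```

With these inputs the computed power lands close to the stated 0.80, consistent with the study being adequately powered at n = 6 per group.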
Foam block geometry as originally defined by Jin et al. [5, 6] was linearly scaled according to the nominal diameter of each shell set to maintain a uniform percentage of pinch and relief areas along the component perimeter across each size group (Table 2). Foam cancellous bone models were machined utilizing an automated computer numerical control mill to precisely reproduce test specimen geometry for all tests. Replicate cancellous blocks were underreamed by 1 mm utilizing standard acetabular reamers as supplied by the component manufacturer. Shells were then implanted into reamed blocks using a uniaxial servohydraulic materials testing system (Model 858; MTS Corp, Eden Prairie, MN, USA) in a randomized order. During implantation, design-specific inserters were placed at the top surface of the component rim and a compressive load was applied to the inserter at a displacement-controlled rate of 0.5 mm/second until a peak compressive load of 6 kN, a point at which consistent shell seating was verified in preliminary testing.
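The linear scaling of block geometry by nominal shell diameter can be sketched as follows; the dimension names and reference values here are hypothetical placeholders, since the actual block dimensions appear in Table 2:

```python
def scale_block_dims(reference_dims_mm, reference_diameter_mm, shell_diameter_mm):
    """Scale every planar block dimension by the ratio of nominal shell
    diameters so that the pinch and relief regions remain a constant
    percentage of the component perimeter across size groups."""
    ratio = shell_diameter_mm / reference_diameter_mm
    return {name: round(value * ratio, 2) for name, value in reference_dims_mm.items()}

# hypothetical reference dimensions for the 48-mm block (names illustrative)
reference = {"relief_arc_width": 18.0, "relief_depth": 8.0, "block_width": 76.0}
dims_for_66mm = scale_block_dims(reference, 48.0, 66.0)
```

Because every dimension is multiplied by the same ratio, the relief arcs subtend the same fraction of the rim perimeter for a 48-mm and a 66-mm shell, which is the stated design intent.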
We used the ARAMIS 5M (GOM mbH, Braunschweig, Germany) DIC system for deformational measurement across the perimeter of the acetabular shell. DIC is a process that tracks the relative motion of points on a test specimen across multiple frames in a sequence of high-resolution images to calculate surface displacement and strain on the specimen of interest. In this study, acetabular shells were prepared for DIC measurement through the application of a black and white stochastic speckle pattern on the component rim and the top surface of the foam block (Fig. 2A). Data images were collected in a 105- × 85- × 65-mm three-dimensional viewing window, yielding a measurement accuracy of ± 0.002 mm, per manufacturer specifications. Sets of five images were taken with two 5.0-megapixel cameras immediately before shell implantation, immediately after implantation, 48 hours after implantation, and 120 hours after implantation of each shell specimen. Within the ARAMIS image analysis software, a set of 32 points radially distributed 11.25° apart was established along the rim of each acetabular shell to create 16 diametral measurement lines to track the change in shell shape at each of the followup time periods (Fig. 2B). Thus, a total of 48 foam blocks were reamed, each shell was inserted once, and deformation was measured immediately after insertion, 48 hours after insertion, and 120 hours after insertion.
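The 16 diametral measurements described above pair each of the 32 rim points with the point directly opposite it. A small geometric sketch illustrates the idea; the rim coordinates and deformation magnitudes below are synthetic, and this is not the ARAMIS software's internal computation:

```python
import numpy as np

def diametral_changes(rim_before, rim_after):
    """Change in each of the 16 diameters formed by pairing each of the
    32 equally spaced rim points (11.25 degrees apart) with the point
    directly opposite it (index offset 16)."""
    b = np.asarray(rim_before, dtype=float)
    a = np.asarray(rim_after, dtype=float)
    d_before = np.linalg.norm(b[:16] - b[16:], axis=1)
    d_after = np.linalg.norm(a[:16] - a[16:], axis=1)
    return d_after - d_before  # negative = compression, positive = expansion

# synthetic example: a 58-mm rim pinched 0.10 mm along x, expanded 0.05 mm along y
theta = np.deg2rad(np.arange(32) * 11.25)
r = 29.0
rim0 = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
rim1 = np.column_stack([(r - 0.05) * np.cos(theta), (r + 0.025) * np.sin(theta)])
delta = diametral_changes(rim0, rim1)
max_compression = delta.min()  # most negative diametral change
max_expansion = delta.max()    # most positive diametral change
```

Tracking the signed minimum and maximum over the 16 diameters at each followup image set yields the maximum compression (pinch region) and maximum expansion (relief region) values reported in the Results.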
We recorded points of maximum cup compression and expansion for each observation. We compared maximum compression and expansion between each separate cup design and between each cup size within the thin-walled MOM design utilizing a linear regression ANOVA using SAS® software (SAS Institute Inc, Cary, NC, USA). Least-square means were calculated as a part of the linear regression model and utilized as the mechanism to identify and compensate for methodologic variances among specimens.
Changes in cup deformation due to changing component size, wall thickness, and shell material were each observed in this study, with greater deformation measured in larger as opposed to smaller shell sizes, thin- as opposed to thick-walled components, and Ti6Al4V as opposed to CoCrMo acetabular components. The maximum shell compression and expansion, averaged across each shell size and design, are shown (Table 3). We observed no difference in maximum shell compression between the 48- and 54-mm thin-walled MOM shell designs. The 60- and 66-mm shells each exhibited increases (p < 0.001) in overall maximum shell deformation with respect to the smaller shell diameters. We observed a nonlinear trend in deformation with increased shell size in the thin-walled MOM shells (Fig. 3), with the largest-diameter shell tested exhibiting a 2.2 times greater deformation than the smallest MOM shell. The smallest overall shell deformation was observed in the 58-mm thick-walled MOM shell design, which incorporates a 10-mm-thick CoCrMo shell around a 38-mm-diameter inner bearing surface. The thick-walled MOM shell, stiffest of all shells tested, was compressed a maximum average of 0.008 mm and maintained that low level of component compression throughout the time intervals tested. Conversely, the greatest overall shell deformation was observed in the two 58-mm Ti6Al4V acetabular components designed for insertion of a modular polyethylene liner. The most flexible of those shells, the plasma-sprayed Ti6Al4V modular shell, exhibited greater (p < 0.001) deformation within both the pinch and relief regions than all other shells tested, while the porous Ti modular shell deformed similarly to the 66-mm thin-walled MOM component and more (p < 0.001) than each of the other MOM shells.
Excluding the thick-walled MOM shell, which showed consistently low maximum deformation throughout the testing period, an overall reduction in deformation (p < 0.001) was observed from initial implantation to 120 hours after implantation in both the pinch and relief regions of the model. An average of 18% reduction in compression over all shell designs was observed 48 hours after implantation, while an average of 7% additional pinch reduction was observed between 48 and 120 hours after implantation. Similarly, we observed a mean 13% reduction in diametral expansion in the relief regions 48 hours after implantation, with an additional 6% mean reduction through 120 hours after implantation.
Acetabular cup deformation, an inevitable mechanical consequence in uncemented press-fit components, can lead to a breakdown in lubrication mechanisms resulting in suboptimal conditions for long-term success for MOM and modular cup designs. Our study represents an effort to quantify deformation over a segment of the wide variety of design and material variations available in the current orthopaedic marketplace. Specifically, we investigated whether (1) a change in component geometry and design altered maximum shell deformation of MOM and metal-backed modular cups in a biomechanical model and (2) any time-dependent deformational relaxation processes were exhibited in the model.
We recognize limitations in our methodology and the clinical extrapolation of our findings. First, we used a static polyurethane model to represent a dynamic in vivo environment, so clinical projection of true component deformation in the native acetabulum may be limited; the model is validated for recreating and measuring initial cup deformation but not for measuring cup deformation beyond initial implantation. Second, a single material density and a single interference fit were tested, though clinically both vary widely. Interference fit is subjective, fluctuating from case to case as bone strength and surgeon preference dictate, and the variance that typically occurs clinically with 1 mm of underreaming is unknown but likely broad. Variation in reaming derives from variation in surgeon-applied force, instrument manufacturing, and bone quality; therefore, control of these variables within a representative density and underream setting provides a baseline for comparison in this study. Third, we used only a single-sized thick-walled MOM component; thus, conclusions based on altered component size from these data can only be confidently applied to thin-walled MOM designs. Fourth, this study incorporated a single loading condition while varied deformational response could be expected in dynamic loading or in conditions of rim loading and incomplete shell seating; however, this loading condition provides a relative baseline of comparison among varied cup designs.
Our results parallel those of Jin et al. [5, 6] and Everitt et al. regarding both the range of deformations observed and the tendency for higher deformations in cups with larger diameters and thinner walls. In the CoCrMo components tested, we observed a nonlinear trend of increasing deformation with increased cup diameter. An increase in wall thickness from 6 mm to 10 mm in MOM components yielded an order-of-magnitude lower shell deformation. Implant factors associated with less wear in MOM articulations and increased survival include surface finish, clearance (affected by head size tolerance, sphericity, and cup deformation), carbon content, and casting process. It is noteworthy that the largest deformation remained less than the smallest radial clearance (0.150 mm) according to the manufacturer's stated tolerances. Thus, peripheral seizing leading to a disruption of fluid-film lubrication would potentially be minimized.
Recent studies have begun to examine the effect of polyethylene liner deformation in regard to frictional torque, liner fracture, and volumetric wear in foam block and finite element analyses [8, 11, 14]. Schmidig et al. observed higher magnitudes of deformation in the polyethylene liner than in the shell into which it was inserted. Because maximal deformation occurs at the periphery of the cup, this change could potentially lead to adverse consequences with respect to the peripheral locking mechanisms in modular cups. This finding may be of particular concern with respect to the reduced fracture toughness (increased stress and crack initiation) of the newer polyethylenes. However, with stress relaxation and diminished deformation of the polyethylene liner over time, a reduced effect on fracture and wear would be expected. Likewise, the negative clinical impact of this polyethylene deformation may be ameliorated through modern polyethylene processing techniques. It is important to note these potential adverse consequences of liner deformation have yet to be demonstrated clinically.
The viscoelasticity of bone lends support to the hypothesis of diminished cup deformation over time in both MOM and metal-backed polyethylene acetabular cups. However, to our knowledge, no extensive long-term in vivo cup deformation data are available in the current scientific literature. In this study, a reduction in initial deformation magnitude was observed across all but the stiffest MOM design in this nonphysiologic model. Our data would seem to support the hypothesis of a long-term reduction in initial cup deformation in the dynamic in vivo environment. More extensive evaluation in vivo should be performed to fully characterize the time dependence of cup deformation in press-fit implantation.
In summary, our findings agree with those of previously published experimental and finite element studies maintaining that decreased wall thickness and increased cup diameter lead to higher initial cup deformations in a worst-case-scenario pinch-relief cancellous foam model. Thick-walled MOM acetabular shells exhibited small overall shell deformations, while thin-walled components were susceptible to larger deformations. Furthermore, conventional metal-backed shells for modular polyethylene liners exhibited higher deformations than the majority of the MOM designs and sizes tested. This study identifies component material and design as factors in initial acetabular cup deformation in these specific devices. Further study is needed to assess the effects of pelvic viscoelasticity, bony remodeling, and polyethylene liner insertion on long-term component deformation and its impact on implant survivability.
The authors thank Michael Volitich BS, Ron Hofmann BS, and Gary Burgess BS, for their assistance in experimental setup. We additionally thank Kenneth Davis MS, and Matthew Brunsman MS, for their assistance with statistical analysis.
The institution of one or more of the authors (JBM, SRS, MEB, MAR) has received funding, during the study period, from St Francis Hospital (Mooresville, IN, USA), ERMI, Inc (Atlanta, GA, USA), DePuy Orthopedics, Inc (Warsaw, IN, USA), and Stryker Orthopaedics (Mahwah, NJ, USA). A partnering institution (Rose-Hulman) received funding from the National Science Foundation for the research instrumentation utilized in this study through Major Research Instrumentation Awards 0923135 and 1039716. One of the authors (JBM) certifies that he, or a member of his immediate family, has received or may receive payments or benefits, during the study period, an amount in excess of $100,000, from Biomet, Inc (Warsaw, IN, USA); one of the authors (MEB) certifies that he, or a member of his immediate family, has received or may receive payments or benefits, during the study period, an amount in excess of $100,000, from Biomet, Inc, and OrthAlign, Inc (Aliso Viejo, CA, USA).
All ICMJE Conflict of Interest Forms for authors and Clinical Orthopaedics and Related Research editors and board members are on file with the publication and can be viewed on request.
Clinical Orthopaedics and Related Research neither advocates nor endorses the use of any treatment, drug, or device. Readers are encouraged to always seek additional information, including FDA approval status, of any drug or device before clinical use.
Specimen preparation and data collection were conducted at Rose-Hulman Institute of Technology (Terre Haute, IN, USA). Statistical analysis was performed at Joint Replacement Surgeons of Indiana Foundation, Inc (Mooresville, IN, USA).