What's Important: Research and Surgery: Finding Balance and Building Community
J Bone Joint Surg Am. 2025 Feb 20. doi: 10.2106/JBJS.24.01336. Online ahead of print.
NO ABSTRACT
PMID:39977550 | DOI:10.2106/JBJS.24.01336
J Bone Joint Surg Am. 2025 Feb 20. doi: 10.2106/JBJS.24.00371. Online ahead of print.
ABSTRACT
BACKGROUND: Nerve injuries in pediatric supracondylar humeral (SCH) fractures occur in 2% to 35% of patients. Previous research has suggested that isolated anterior interosseous nerve injuries are not influenced by the time to surgery; however, little is known about other nerve injuries or mixed, motor, and sensory injuries. With this study, we aimed to examine the impact of time to surgery on nerve recovery in patients with traumatic nerve injuries associated with SCH fractures.
METHODS: Patients <18 years of age with SCH fractures stabilized using percutaneous pins during the period of January 2009 to June 2022 were retrospectively reviewed. Patients presenting with any traumatic nerve injury noted preoperatively were included, while those with iatrogenic or postoperative nerve injuries and incomplete documentation were excluded. Demographic data, injury characteristics, time to surgery, and number of days to nerve recovery were collected. Comparisons of nerve recovery time by anatomic distribution and functional deficit using an 8-hour time-to-surgery cutoff were made in bivariate and multivariate analyses.
RESULTS: A total of 2,753 patients with SCH fractures were identified, with 214 of the patients having an associated nerve injury. Documentation of nerve recovery was available for 197 patients (180 patients with complete recovery) with an overall mean age of 6.8 ± 2.1 years. Time to recovery differed significantly when comparing the motor, sensory, and mixed-deficit cohorts (p < 0.001). Early surgery (≤8 hours from injury to surgery) was significantly associated with shorter overall time to nerve recovery (p = 0.002), recovery of multiple nerve distributions (p = 0.011), and recovery of mixed motor and sensory deficits (p = 0.007). On multivariable analysis, mixed nerve deficits (hazard ratio [HR], 0.537 [95% CI, 0.396 to 0.728]; p < 0.001) and time from injury to treatment of >8 hours (HR, 0.542 [95% CI, 0.373 to 0.786]; p = 0.001) were significantly associated with delayed nerve recovery.
CONCLUSIONS: Surgical timing impacts the time to recovery of complex nerve injuries. Early surgical management of patients with mixed motor-sensory deficits may help to reduce the time to complete nerve recovery.
LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39977536 | DOI:10.2106/JBJS.24.00371
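A note on the analysis described above: the abstract reports hazard ratios from a multivariable model of time to nerve recovery, with mixed deficits and an injury-to-surgery interval of >8 hours as covariates. The authors' code is not given; the snippet below is only a minimal sketch of how such a Cox proportional hazards model could be fit in Python with the lifelines package, using an entirely synthetic dataset (all variable names, values, and coding are assumptions for illustration, not the study data).

```python
# Hypothetical sketch of a Cox proportional hazards model for time to complete
# nerve recovery, in the spirit of the multivariable analysis described above.
# The data are synthetic; column names and coding are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "days_to_recovery": rng.exponential(scale=60, size=n) + 1,  # follow-up time in days
    "recovered": rng.integers(0, 2, size=n),                    # 1 = complete recovery observed
    "mixed_deficit": rng.integers(0, 2, size=n),                # 1 = mixed motor-sensory deficit
    "surgery_after_8h": rng.integers(0, 2, size=n),             # 1 = injury-to-surgery interval > 8 hours
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_recovery", event_col="recovered")
cph.print_summary()        # coefficients, hazard ratios, 95% CIs, p values
print(cph.hazard_ratios_)  # in the real data, HRs < 1 would correspond to slower recovery
```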
J Bone Joint Surg Am. 2025 Feb 20. doi: 10.2106/JBJS.24.00108. Online ahead of print.
ABSTRACT
BACKGROUND: Changing from standing to sitting positions requires rotation of the femur from an almost vertical plane to the horizontal plane. Osteoarthritis of the hip limits hip extension, resulting in less ability to recruit spinopelvic tilt (SPT) while standing and requiring increased SPT while sitting to compensate for the loss of hip range of motion. To date, the effect of total hip arthroplasty (THA) on spinopelvic sitting and standing mechanics has not been reported, particularly in the setting of patients with coexistent sagittal plane spinal deformity.
METHODS: A retrospective review was performed of patients ≥18 years of age undergoing unilateral THA for hip osteoarthritis with sitting and standing radiographs made before and after THA. Alignment was analyzed at baseline and follow-up after THA in both standing and sitting positions in a relaxed posture with the fingers resting on top of the clavicles. Patients were grouped according to the presence or absence of sagittal plane deformity preoperatively into 3 groups: no sagittal plane deformity (normal), thoracolumbar (TL) deformity (pelvic incidence-lumbar lordosis [PI-LL] mismatch > 10° and/or T1-pelvic angle [TPA] > 20°), or apparent deformity (PI-LL ≤ 10° and TPA ≤ 20°, but sagittal vertical axis [SVA] > 50 mm).
RESULTS: In this study, 192 patients were assessed: 64 had TL deformity, 39 had apparent deformity, and 89 had normal alignment. Overall, patients demonstrated a reduction in standing SVA (45 to 34.1 mm; p < 0.001) and an increase in SPT (14.6° to 15.7°; p = 0.03) after THA. There was a greater change in standing SVA (p < 0.001) among patients with apparent deformity (-29.0 mm) compared with patients with normal alignment (0.9 mm) and patients with TL deformity (-16.3 mm). Those with apparent deformity also experienced the greatest difference (p = 0.03) in postural SPT change (moving from standing to sitting) (-10.1°) from before to after THA when compared with those with normal alignment (-3.6°) and TL deformity (-1.2°). The difference in postural SVA change from before to after THA was also greatest (p < 0.001) in those with apparent deformity (32.1 mm) compared with those with normal alignment (6.5 mm) and TL deformity (17.3 mm).
CONCLUSIONS: Postural changes in spinopelvic alignment vary after THA depending on the presence of TL deformity or apparent deformity due to hip flexion contracture. Patients with apparent deformity had larger changes in standing and sitting alignment than patients with TL deformity or patients with normal alignment. The assessment of global sagittal alignment findings can be used to predict the likelihood of improvement in sagittal alignment after THA.
LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39977534 | DOI:10.2106/JBJS.24.00108
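The grouping rule in the methods above is fully specified by three radiographic thresholds (TL deformity: PI-LL > 10° and/or TPA > 20°; apparent deformity: PI-LL ≤ 10° and TPA ≤ 20° but SVA > 50 mm; otherwise normal). The following is a small sketch of that triage expressed in code; the function name, argument names, and example values are mine, not the authors'.

```python
# Minimal sketch of the preoperative sagittal-alignment grouping stated in the
# abstract above. Thresholds come from the abstract; the function name, units
# handling, and example values are assumptions for illustration only.
def classify_sagittal_alignment(pi_ll_deg: float, tpa_deg: float, sva_mm: float) -> str:
    """Classify preoperative sagittal alignment before THA.

    pi_ll_deg : pelvic incidence minus lumbar lordosis (PI-LL) mismatch, degrees
    tpa_deg   : T1-pelvic angle (TPA), degrees
    sva_mm    : sagittal vertical axis (SVA), millimeters
    """
    if pi_ll_deg > 10 or tpa_deg > 20:
        return "thoracolumbar (TL) deformity"
    if sva_mm > 50:  # PI-LL <= 10 and TPA <= 20, but forward global malalignment
        return "apparent deformity"
    return "normal alignment"


if __name__ == "__main__":
    # Hypothetical example values, not patient data from the study.
    print(classify_sagittal_alignment(pi_ll_deg=5, tpa_deg=15, sva_mm=70))   # apparent deformity
    print(classify_sagittal_alignment(pi_ll_deg=15, tpa_deg=25, sva_mm=30))  # TL deformity
    print(classify_sagittal_alignment(pi_ll_deg=5, tpa_deg=10, sva_mm=20))   # normal alignment
```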
J Bone Joint Surg Am. 2025 Feb 20. doi: 10.2106/JBJS.24.01339. Online ahead of print.
NO ABSTRACT
PMID:39977532 | DOI:10.2106/JBJS.24.01339
J Bone Joint Surg Am. 2025 Feb 20. doi: 10.2106/JBJS.24.00908. Online ahead of print.
ABSTRACT
BACKGROUND: Minimally invasive techniques such as percutaneous screw fixation have previously been shown to be mostly successful for pain relief and functional improvement in patients with pelvic metastases. In this study, we retrospectively reviewed the largest single-center cohort to date to further characterize the impact of this treatment on pain palliation, ambulation, and function; the predictors of suboptimal outcomes; and complications.
METHODS: Electronic medical records were reviewed. The primary outcome measures were pain, as assessed with use of the visual analog scale (VAS) score; functional status, as assessed with use of the Eastern Cooperative Oncology Group (ECOG) score; and ambulation, as assessed with use of the Combined Pain and Ambulatory Function Score (CPAFS); each measure was assessed preoperatively and postoperatively. Secondary outcome measures included radiographic evidence of fracture healing and the need for narcotics.
RESULTS: The study included 103 consecutive patients (42 men, 61 women) with a mean age of 64.1 years (range, 34 to 93 years) and a median follow-up of 14.4 months (range, 3 to 64 months) who underwent 107 procedures (bilateral in 4 patients). Sixty-nine had periacetabular lesions, whereas 38 had non-periacetabular lesions. VAS, ECOG, and CPAFS values improved from preoperatively at all time points (p < 0.001). Fifty-seven (85.1%) of the 67 patients presenting with a pathologic fracture demonstrated radiographic healing. A lack of radiographic healing was associated with a prolonged need for narcotics (p < 0.001). Six hips were converted to total hip arthroplasties, and 1 underwent a Girdlestone procedure. Complications were observed in 3 cases (2.8%).
CONCLUSIONS: Percutaneous screw fixation provided sustained benefits of pain relief and functional improvement in the treatment of metastatic pelvic lesions, with a low rate of complications. Bone healing after fixation was common. The risk of prolonged narcotic usage was higher in patients without evidence of bone healing.
LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39977531 | DOI:10.2106/JBJS.24.00908
J Bone Joint Surg Am. 2025 Feb 20. doi: 10.2106/JBJS.24.00888. Online ahead of print.
ABSTRACT
BACKGROUND: This study compared the functional outcomes of patients with open tibial shaft fractures who were randomized to either modern external ring fixation (EF) or internal fixation (IF). We hypothesized that there would be differences in patient-reported function between the treatment groups.
METHODS: This preplanned analysis of secondary outcomes from the FIXIT study, a multicenter randomized clinical trial, included patients 18 to 64 years of age with a Gustilo-Anderson Type-IIIB or severe-Type IIIA diaphyseal or metaphyseal tibial fracture who were randomly assigned to either IF (n = 132) or EF (n = 122). Follow-up visits occurred at 6 weeks and 3, 6, and 12 months after randomization. Outcomes included Short Musculoskeletal Function Assessment (SMFA) scores, the Veterans RAND 12-Item Health Survey (VR-12) physical component score (PCS), use of ambulatory assistive devices, and ability to ambulate.
RESULTS: The mean VR-12 PCS was slightly higher (better) for IF (24.8) than for EF (22.6) at 3 months (mean difference, 2.2 [95% confidence interval (CI): 0.2, 4.3]; p = 0.03) and trended higher for IF (27.0) compared with EF (25.3) at 6 months (mean difference, 1.8 [95% CI: -0.9, 4.4]; p = 0.19). However, there was no difference between the groups at 12 months. There were no clinically important or significant differences in SMFA Dysfunction and Bother scores between the treatment groups at any time point. EF was associated with a higher risk of using any ambulatory assistive device at 6 months (relative risk, 1.5 [95% CI: 1.21, 1.82]; p < 0.0001). The absolute percentage of patients using any ambulatory device was 37.6% for IF and 45.4% for EF at 1 year. There was no difference in ambulatory status between the treatment groups at any time point.
CONCLUSIONS: We found no difference in physical function between patients with severe tibial fractures treated with IF versus EF. There was a high rate of impairment overall. Assistive devices for walking were more often utilized in the EF group at 6 months, and both treatment groups demonstrated similar overall impairment.
LEVEL OF EVIDENCE: Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.
PMID:39977529 | DOI:10.2106/JBJS.24.00888
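The FIXIT analysis above reports a relative risk of 1.5 (95% CI: 1.21, 1.82) for use of an ambulatory assistive device with EF at 6 months. As a reminder of how such an estimate is constructed, the sketch below computes a relative risk with a log-scale Wald 95% CI from a 2×2 table; the counts are hypothetical placeholders, since the abstract reports only percentages and the pooled estimate.

```python
# Generic relative-risk calculation with a log-scale Wald 95% CI.
# The counts below are hypothetical placeholders; the abstract reports only
# the resulting relative risk and percentages, not the raw 2x2 table.
import math

def relative_risk(events_exposed, n_exposed, events_control, n_control, z=1.96):
    risk_exposed = events_exposed / n_exposed
    risk_control = events_control / n_control
    rr = risk_exposed / risk_control
    # Standard error of log(RR) for independent binomial samples
    se_log_rr = math.sqrt(
        1 / events_exposed - 1 / n_exposed + 1 / events_control - 1 / n_control
    )
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, (lower, upper)

# Hypothetical counts for illustration only (EF vs. IF device use at 6 months).
rr, ci = relative_risk(events_exposed=66, n_exposed=110, events_control=48, n_control=120)
print(f"RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```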
J Bone Joint Surg Am. 2025 Feb 20. doi: 10.2106/JBJS.24.00044. Online ahead of print.
ABSTRACT
BACKGROUND: The purpose of this study was to determine the incidence of postoperative ileus (POI) after spine surgery and to identify risk factors for its development.
METHODS: A retrospective database study was performed between 2019 and 2021. A database of all patients who underwent spine surgery was searched, and patients who developed clinical and radiographic evidence of POI were identified. Demographic characteristics, perioperative data including opioid consumption, ambulation through postoperative day 1, surgical positioning, medical history, and surgical history were obtained and compared to examine risk factors for developing POI.
RESULTS: A total of 10,666 consecutive patients were identified who underwent cervical, thoracic, thoracolumbar, lumbar, or lumbosacral surgery with or without fusion. No patients were excluded from this study. The overall incidence of POI after spine surgery was 1.63%. POI was associated with a significantly greater mean length of stay of 7.6 ± 5.0 days compared with 2.9 ± 2.9 days in the overall cohort (p < 0.001). A history of ileus (odds ratio [OR], 21.13; p < 0.001) and a history of constipation (OR, 33.19; p < 0.001) were also associated with an increased rate of POI compared with patients without these conditions. Postoperatively, patients who developed POI ambulated a shorter distance through postoperative day 1 than patients who did not develop POI (14.8 versus 31.4 m; p < 0.001). Total postoperative opioid consumption was significantly higher (p < 0.001) in the POI group (330.3 morphine equivalent dose [MED]) than in the group without POI (174.5 MED). Lastly, patients who underwent fusion (p < 0.001), were placed in a supine or lateral position (p = 0.03) (indicators of anterior or lateral approaches), had thoracolumbar or lumbar surgery (p = 0.01), or had multiple positions during the surgical procedure (p < 0.001) had a significantly higher risk of POI than those who did not.
CONCLUSIONS: The overall incidence of POI after all spine surgery is low. Several nonmodifiable predictors of POI include prior ileus, constipation, hepatitis, and prostatectomy. Multiple surgical factors increased the risk of POI, including supine positioning, surgery with the patient in multiple positions, and fusion. POI was associated with decreased early ambulation and increased opioid usage. Strategies should be implemented to maximize early ambulation and decrease opioid usage perioperatively.
LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39977528 | DOI:10.2106/JBJS.24.00044
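The odds ratios above (e.g., OR 21.13 for a history of ileus and OR 33.19 for a history of constipation) imply a logistic regression or similar comparison, although the abstract does not state the exact model. The snippet below is a hypothetical sketch of a logistic regression yielding odds ratios and 95% CIs in statsmodels, run on synthetic data with made-up predictor names.

```python
# Hypothetical sketch: logistic regression producing odds ratios for
# postoperative ileus (POI), in the spirit of the analysis described above.
# Data, effect sizes, and variable names are synthetic, not the study dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "prior_ileus": rng.binomial(1, 0.02, n),
    "constipation": rng.binomial(1, 0.05, n),
    "fusion": rng.binomial(1, 0.5, n),
})
# Synthetic outcome with elevated risk for the listed predictors.
logit_p = -4.0 + 2.5 * df["prior_ileus"] + 2.0 * df["constipation"] + 0.7 * df["fusion"]
df["poi"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("poi ~ prior_ileus + constipation + fusion", data=df).fit(disp=False)
odds_ratios = np.exp(fit.params).rename("OR")
conf_int = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios, conf_int], axis=1))
```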
J Bone Joint Surg Am. 2025 Feb 20. doi: 10.2106/JBJS.24.00313. Online ahead of print.
ABSTRACT
BACKGROUND: The late diagnosis rate of developmental dysplasia of the hip (DDH) with universal ultrasound screening is 0.2 per 1,000 children according to a recent meta-analysis, which is the same as in Japan where selective ultrasound screening is used. We hypothesized that Finland's current program of universal clinical screening complemented with targeted ultrasound is noninferior to universal and selective ultrasound screening programs.
METHODS: For this retrospective cohort study, we collected the number of children <15 years of age who were diagnosed with DDH (International Classification of Diseases, Tenth Revision [ICD-10] codes Q65.0-Q65.6 and Ninth Revision [ICD-9] code 7543) as their primary diagnosis after ≥3 visits to a physician. These data were obtained from the Finnish Care Register for Health Care, which collects the ICD-10 and ICD-9 codes from every medical appointment. We calculated the annual incidence of DDH diagnoses per 1,000 newborns between 2002 and 2021. Late diagnosis of DDH was defined as a finding of DDH in children aged 6 months through <15 years at the initial diagnosis who had undergone treatment under anesthesia (closed reduction and casting or surgery). We also registered the geographic, age, and sex distributions of the DDH diagnoses.
RESULTS: During the 20-year study period, 1,103,269 babies were born (median per year, 57,214 babies; range per year, 45,346 to 60,694 babies). A total of 6,421 children had a diagnosis of DDH (mean per year, 321 children; range per year, 193 to 405 children), with a mean calculated incidence of 5.8 per 1,000 newborns (95% confidence interval [CI], 5.7 to 6.0). Altogether, 120 children aged 6 months through <15 years were treated for DDH, with little annual variation (median, 6.5 children; range, 2 to 9 children). The mean national incidence of late-diagnosed cases was 0.11 per 1,000 newborns (95% CI, 0.09 to 0.13).
CONCLUSIONS: Finland's current DDH screening program, which includes universal clinical screening with targeted ultrasound, is noninferior when compared with other screening programs.
LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39977488 | DOI:10.2106/JBJS.24.00313
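The incidence figures above follow directly from the reported counts (6,421 DDH diagnoses and 120 late-diagnosed cases among 1,103,269 newborns). The short calculation below reproduces that arithmetic with a normal-approximation 95% CI; the interval method is my assumption, as the abstract does not state how its CIs were derived.

```python
# Worked arithmetic for the incidences reported above: cases per 1,000 newborns
# with a 95% confidence interval. The normal-approximation interval is an
# assumption; the abstract does not specify its CI method.
from statsmodels.stats.proportion import proportion_confint

def incidence_per_1000(cases, births):
    rate = 1000 * cases / births
    lower, upper = proportion_confint(cases, births, alpha=0.05, method="normal")
    return rate, 1000 * lower, 1000 * upper

births = 1_103_269
for label, cases in [("all DDH diagnoses", 6421), ("late-diagnosed DDH", 120)]:
    rate, lower, upper = incidence_per_1000(cases, births)
    print(f"{label}: {rate:.2f} per 1,000 (95% CI, {lower:.2f} to {upper:.2f})")
# This lands close to the abstract's 5.8 (5.7 to 6.0) and 0.11 (0.09 to 0.13) per 1,000.
```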
J Bone Joint Surg Am. 2025 Feb 19. doi: 10.2106/JBJS.24.00544. Online ahead of print.
ABSTRACT
BACKGROUND: Increasing U.S. health-care costs raise concerns regarding the sustainability of the U.S. health-care system, with the potential for negative effects on the mental and physical health of patients. Orthopaedic injuries often impose considerable financial burdens on patients and hospitals, but the trends in, and drivers of, costs remain unclear. This study evaluated the total expenditure and out-of-pocket (OOP) costs of patients with a lower-extremity (LE) fracture in the non-institutionalized U.S. population from 2010 to 2021.
METHODS: A total of 3,016 participants with an LE fracture from the Medical Expenditure Panel Survey (MEPS) were propensity score matched with 15,080 MEPS participants with no LE fracture. Patients with an LE fracture were predominantly between 40 and 64 years old (43.2%), female (66.0%), and White (78.8%). Total expenditure and OOP costs were compared between the groups. A multivariable regression analysis was performed to identify factors that were associated with costs. Outcomes were adjusted on the basis of the 2022 Consumer Price Index.
RESULTS: Patients with an LE fracture had greater total expenses than the control group ($20,230 [95% confidence interval (CI), $18,916 to $21,543] versus $10,678 [95% CI, $10,302 to $11,053]; p < 0.001) as well as greater OOP costs ($1,634 [95% CI, $1,516 to $1,753] versus $1,089 [95% CI, $1,050 to $1,128]; p < 0.001). Between 2010 and 2021, total expenses increased more for patients with an LE fracture than for the control group (101.2% versus 51.4%; p < 0.001), whereas OOP costs increased to a lesser degree in both groups (61.1% versus 44.5%; p = 0.17). In the LE fracture group, total expenditure was driven by inpatient care, office-based visits, and prescription costs, whereas OOP costs were driven by office-based visits, prescription costs, and "other" sources. Femoral fracture, hospitalization, and certain comorbidities were associated with higher total expenses. Hospitalization, uninsured status, and a higher income level were associated with increased OOP costs, whereas African American or Hispanic background and a lower educational level were associated with lower OOP costs.
CONCLUSIONS: An LE fracture was associated with considerable total expenditure and OOP costs, which increased disproportionately compared with general health-care costs over the past decade. Post-hospitalization care was the biggest driver of both total expenses and OOP costs. Due to limitations inherent to the MEPS database, the impact of financial burden on not only payers but also individuals and their medical decision-making remains unclear and requires further investigation.
LEVEL OF EVIDENCE: Economic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39970239 | DOI:10.2106/JBJS.24.00544
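The MEPS analysis above matches 3,016 LE-fracture participants to 15,080 controls (a 1:5 ratio) on propensity scores. The matching algorithm is not described in the abstract; the sketch below shows one generic way to do 1:5 nearest-neighbor propensity-score matching with scikit-learn, using synthetic covariates (matching here is with replacement, purely for illustration).

```python
# Hypothetical sketch of 1:5 nearest-neighbor propensity-score matching,
# analogous in spirit to the MEPS matching described above. Covariates and
# data are synthetic; this is not the authors' matching procedure.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 20_000
X = np.column_stack([
    rng.normal(55, 15, n),        # age (years)
    rng.binomial(1, 0.5, n),      # sex
    rng.binomial(1, 0.2, n),      # an example comorbidity flag
])
fracture = rng.binomial(1, 0.15, n).astype(bool)  # True = lower-extremity fracture

# 1) Estimate the propensity of being in the fracture group from the covariates.
ps = LogisticRegression(max_iter=1000).fit(X, fracture).predict_proba(X)[:, 1]

# 2) For each fracture case, take the 5 controls with the closest propensity score
#    (with replacement, for simplicity).
control_ps = ps[~fracture].reshape(-1, 1)
case_ps = ps[fracture].reshape(-1, 1)
nn = NearestNeighbors(n_neighbors=5).fit(control_ps)
_, matched_control_idx = nn.kneighbors(case_ps)   # row i = indices of 5 matched controls

print("cases matched:", case_ps.shape[0], "| controls drawn per case:", matched_control_idx.shape[1])
```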
J Bone Joint Surg Am. 2025 Feb 19. doi: 10.2106/JBJS.24.00589. Online ahead of print.
ABSTRACT
BACKGROUND: Computer navigation and patient-specific instrumentation have been in use over the past 2 decades for total knee replacement (TKR). However, their effects on implant survival and patient-reported outcomes remain under debate. We aimed to investigate their influence on implant survival, outcomes of the Oxford Knee Score (OKS) and health-related quality of life (EQ-5D-3L), intraoperative complications, and postoperative mortality compared with conventional instrumentation, across a real-world population.
METHODS: This observational study used National Joint Registry (NJR) data and included adult patients who underwent primary TKR for osteoarthritis between April 1, 2003, and December 31, 2020. The primary analysis evaluated revision for all causes, and secondary analyses evaluated differences in the OKS and EQ-5D-3L at 6 months postoperatively, and mortality within 1 year postoperatively. Weights based on propensity scores were generated, accounting for several covariates. A Cox proportional hazards model was used to assess revision and mortality outcomes. Generalized linear models were used to evaluate differences in the OKS and EQ-5D-3L. Effective sample sizes were computed and represent the statistical power comparable with an unweighted sample.
RESULTS: Compared to conventional instrumentation, the hazard ratios (HRs) for all-cause revision following TKR performed using computer navigation and patient-specific instrumentation were 0.937 (95% confidence interval [CI], 0.860 to 1.021; p = 0.136; effective sample size [ESS] = 91,607) and 0.960 (95% CI, 0.735 to 1.252; p = 0.761; ESS = 13,297), respectively. No differences were observed in the OKS and EQ-5D-3L between conventional and computer-navigated TKR (OKS, -0.134 [95% CI, -0.331 to 0.063]; p = 0.183; ESS = 29,135; and EQ-5D-3L, 0.000 [95% CI, -0.005 to 0.005]; p = 0.929; ESS = 28,396) and between conventional TKR and TKR with patient-specific instrumentation (OKS, 0.363 [95% CI, -0.104 to 0.830]; p = 0.127; ESS = 4,412; and EQ-5D-3L, 0.004 [95% CI, -0.009 to 0.018]; p = 0.511; ESS = 4,285). Mortality within 1 year postoperatively was similar between conventional instrumentation and either computer navigation or patient-specific instrumentation (HR, 1.020 [95% CI, 0.989 to 1.052]; p = 0.212; ESS = 110,125).
CONCLUSIONS: On the basis of this large registry study, we conclude that computer navigation and patient-specific instrumentation have no statistically or clinically meaningful effect on the risk of revision, patient-reported outcomes, or mortality following primary TKR.
LEVEL OF EVIDENCE: Therapeutic Level II. See Instructions for Authors for a complete description of levels of evidence.
PMID:39970237 | DOI:10.2106/JBJS.24.00589
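The registry analysis above combines propensity-score-based weights, weighted Cox models, and "effective sample sizes." None of these are spelled out in the abstract, so the sketch below illustrates one common combination on synthetic data: inverse-probability-of-treatment weights, Kish's effective sample size, and a weighted lifelines Cox fit. The weighting scheme, ESS formula, and data are assumptions, not the NJR study's actual methodology.

```python
# Hypothetical sketch: propensity-score weights, Kish's effective sample size,
# and a weighted Cox model for revision, illustrating the kind of analysis
# described above. All data and modeling choices here are assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 10_000
age = rng.normal(70, 9, n)
navigated = rng.binomial(1, 0.2, n)  # 1 = computer-navigated TKR (synthetic)

# Propensity of receiving navigation given covariates (only age, for brevity).
ps_model = LogisticRegression(max_iter=1000).fit(age.reshape(-1, 1), navigated)
ps = ps_model.predict_proba(age.reshape(-1, 1))[:, 1]
weights = np.where(navigated == 1, 1 / ps, 1 / (1 - ps))  # ATE-style IPTW weights (assumed)

# Kish's effective sample size: (sum of weights)^2 / sum of squared weights.
ess = weights.sum() ** 2 / (weights ** 2).sum()
print(f"effective sample size ~ {ess:.0f} of {n}")

df = pd.DataFrame({
    "years_to_revision_or_censor": rng.exponential(12, n) + 0.1,
    "revised": rng.binomial(1, 0.04, n),
    "navigated": navigated,
    "w": weights,
})
cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_revision_or_censor", event_col="revised",
        weights_col="w", robust=True)  # robust SEs are advisable with non-integer weights
cph.print_summary()
```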
J Bone Joint Surg Am. 2025 Feb 19;107(4):e12. doi: 10.2106/JBJS.24.01070. Epub 2025 Feb 19.
NO ABSTRACT
PMID:39969497 | DOI:10.2106/JBJS.24.01070
J Bone Joint Surg Am. 2025 Feb 19;107(4):e11. doi: 10.2106/JBJS.24.01050. Epub 2025 Feb 19.
NO ABSTRACT
PMID:39969496 | DOI:10.2106/JBJS.24.01050
J Bone Joint Surg Am. 2025 Feb 18. doi: 10.2106/JBJS.24.00433. Online ahead of print.
ABSTRACT
BACKGROUND: The impact of radiation exposure on cataracts and hand skin cancer in orthopaedic and spine surgeons remains understudied. This study aimed to investigate the prevalence of cataracts and chronic hand inflammation in orthopaedic and spine surgeons and to assess their association with radiation exposure.
METHODS: A cross-sectional analysis was conducted on orthopaedic and spine surgeons attending the 38th Annual Meeting of the Neurospinal Society of Japan or the 31st Annual Meeting of the Japanese Society for the Study of Low Back Pain. Cataractous changes were categorized into none, lens micro-opacity, or cataracts and were detailed alongside the prevalence of chronic hand inflammation, which included longitudinal melanonychia and hand eczema. Participants were divided into quartiles according to hand-exposure opportunities in the operating and fluoroscopy rooms in 2022. Prevalence ratios and 95% confidence intervals (CIs) of chronic hand inflammation in the upper quartiles relative to the first quartile were calculated using modified Poisson regression adjusted for potential confounders.
RESULTS: The median work experience of the 162 participants was 23 years, and the median number of hand-exposure opportunities was 70 (interquartile range [IQR], 20 to 123) in the operating room and 20 (IQR, 0 to 60) in the fluoroscopy room. The prevalence of cataracts was 20% (32 participants), and the prevalence of cataractous changes, including lens micro-opacity, was 40% (64 participants). Chronic hand inflammation was present in 62 participants (38%), of whom 52 had longitudinal melanonychia and 23 had hand eczema. The adjusted prevalence ratios of chronic hand inflammation relative to the lowest quartile of hand-exposure opportunities in the operating room were 0.91 (95% CI, 0.50 to 1.66) for quartile 2, 0.72 (95% CI, 0.41 to 1.25) for quartile 3, and 1.56 (95% CI, 0.97 to 2.50) for quartile 4. For fluoroscopy room exposure, the adjusted prevalence ratios were 2.31 (95% CI, 1.16 to 4.58) for quartile 2, 2.03 (95% CI, 1.00 to 4.09) for quartile 3, and 2.94 (95% CI, 1.51 to 5.75) for quartile 4.
CONCLUSIONS: This study highlighted substantial cataractous and chronic hand inflammatory changes in spine surgeons, indicating indirect and direct radiation exposure effects. Therefore, radiation safety and protective measures must be emphasized. Comparative studies with other populations and longitudinal observations are required to better understand the effects of radiation on health.
LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39965042 | DOI:10.2106/JBJS.24.00433
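The adjusted prevalence ratios above come from modified Poisson regression. A minimal sketch of that technique in Python (a Poisson GLM on a binary outcome with a robust sandwich covariance, following Zou's approach) is shown below on synthetic data; the quartile coding and the single confounder are placeholders, not the study's adjustment set.

```python
# Hypothetical sketch of modified Poisson regression (Poisson GLM on a binary
# outcome with robust/sandwich standard errors) to estimate prevalence ratios,
# as described in the methods above. Data and covariates are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 800
df = pd.DataFrame({
    "exposure_quartile": rng.integers(1, 5, n),        # Q1-Q4 of hand-exposure opportunities
    "years_experience": rng.normal(20, 8, n).clip(1),  # an example confounder
})
base_prevalence = 0.25 + 0.05 * (df["exposure_quartile"] - 1)  # rising across quartiles
df["hand_inflammation"] = rng.binomial(1, base_prevalence.clip(0.01, 0.95))

fit = smf.glm(
    "hand_inflammation ~ C(exposure_quartile) + years_experience",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")  # robust sandwich covariance

print(np.exp(fit.params))      # prevalence ratios vs. quartile 1 (reference)
print(np.exp(fit.conf_int()))  # 95% CIs on the ratio scale
```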
J Bone Joint Surg Am. 2025 Feb 18. doi: 10.2106/JBJS.24.00196. Online ahead of print.
ABSTRACT
BACKGROUND: Patient-reported outcome measures (PROMs) are used to evaluate the impact of musculoskeletal conditions and their treatment on patients' quality of life, but they have limitations, such as high responder burden and floor and ceiling effects. The Patient-Reported Outcomes Measurement Information System (PROMIS) was developed to address these issues but needs to be further evaluated in comparison with legacy PROMs. The goals of this study were to evaluate the floor and ceiling effects of, the correlations between, and the predictive ability of PROMIS scores compared with traditional legacy measures at 10-year follow-up in a cohort who underwent revision anterior cruciate ligament (ACL) reconstruction.
METHODS: A total of 203 patients (88.7% White; 51.7% female) who underwent revision ACL reconstruction completed the PROMIS via computer adaptive tests as well as legacy PROMs at the cross-sectional, 10-year follow-up of the longitudinal MARS cohort study (MARS cohort n = 1,234). Floor and ceiling effects and Spearman rho correlations between PROMIS and legacy measures are reported. Linear regression with quadratic terms was used to develop and evaluate conversion equations to predict legacy scores from the PROMIS.
RESULTS: No floor or ceiling effects were reported for the PROMIS Physical Function (PF) domain, whereas a floor effect was found for 37.9% of the participants for the PROMIS Pain Interference (PI) domain, and a ceiling effect was found for 34.0% of the participants for the PROMIS Physical Mobility (PM) domain. PROMIS domains correlated moderately with the International Knee Documentation Committee total subjective score (absolute value of rho [|ρ|] = 0.68 to 0.74), fairly to moderately with the Knee injury and Osteoarthritis Outcome Score and Western Ontario and McMaster Universities Osteoarthritis Index scores (|ρ| = 0.52 to 0.67), and fairly with the Marx Activity Rating Scale (|ρ| = 0.35 to 0.44). None of the legacy-measure scores were accurately predicted by the PROMIS scores.
CONCLUSIONS: The PROMIS PF domain has value in assessing patients 10 years after revision ACL reconstruction. Because of floor and ceiling effects, using the PI and PM domains may not allow for precision when measuring long-term changes in pain and mobility. Although the PROMIS measures correlated with the legacy measures, with effect sizes ranging from fair to moderate, the legacy scores were not accurately predicted by the PROMIS. The results suggest that knee-specific legacy measures should not be eliminated from long-term follow-up when the goal is to capture the specific knee-related information that they provide.
LEVEL OF EVIDENCE: Therapeutic Level II. See Instructions for Authors for a complete description of levels of evidence.
PMID:39965036 | DOI:10.2106/JBJS.24.00196
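Two of the analysis steps above, Spearman correlation between PROMIS and legacy scores and quadratic regression "conversion equations," have a simple generic form, sketched below on synthetic scores. The variable names and simulated values are placeholders, not MARS data, and the model shown is not the authors' fitted equation.

```python
# Hypothetical sketch of the two steps described above: Spearman correlation
# between a PROMIS domain and a legacy score, and a linear regression with a
# quadratic term used as a conversion equation. All values are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import spearmanr

rng = np.random.default_rng(5)
n = 203
promis_pf = rng.normal(45, 8, n)  # PROMIS Physical Function T-scores (synthetic)
ikdc = (1.2 * promis_pf + 0.01 * promis_pf**2 + rng.normal(0, 8, n)).clip(0, 100)
df = pd.DataFrame({"promis_pf": promis_pf, "ikdc": ikdc})

rho, p_value = spearmanr(df["promis_pf"], df["ikdc"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3g})")

# Conversion equation: legacy score ~ PROMIS + PROMIS^2.
conversion = smf.ols("ikdc ~ promis_pf + I(promis_pf**2)", data=df).fit()
df["ikdc_predicted"] = conversion.predict(df)
print(conversion.params)
```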
J Bone Joint Surg Am. 2025 Feb 17. doi: 10.2106/JBJS.24.00324. Online ahead of print.
ABSTRACT
BACKGROUND: Sport participation has been associated with favorable outcomes following hip arthroscopy (HA) for femoroacetabular impingement (FAI) at short- and mid-term follow-up; however, few studies have evaluated the 10-year outcomes in this population. The purpose of this study was to compare patient-reported outcome measures (PROMs), the achievement of clinically significant outcomes, and reoperation-free survivorship between patients with and without regular preoperative sport participation who underwent HA for FAI and had a minimum of 10 years of follow-up.
METHODS: Data were prospectively collected for patients who underwent primary HA for FAI between January 2012 and September 2013. Patients who reported weekly sport participation at the time of surgery ("athletes") were matched 1:1 to patients who denied sport participation ("nonathletes"), controlling for age, sex, and body mass index (BMI). Preoperative and 10-year postoperative PROMs were collected, including the Hip Outcome Score Activities of Daily Living (HOS-ADL) and Sports (HOS-Sports) subscales, the modified Harris hip score (mHHS), and the visual analog scale for pain (VAS Pain) and satisfaction (VAS Satisfaction). Patient acceptable symptom state (PASS) achievement and reoperation-free survivorship were compared between the groups.
RESULTS: Sixty-four athletes were matched to 64 nonathletes of similar age, sex, and BMI (p ≥ 0.411). In the athlete group, 85.9% were recreational-level athletes. The groups had similar preoperative PROMs, except for the HOS-ADL subscale, where the athlete group demonstrated a higher preoperative score (67.8 ± 16.7 versus 59.9 ± 21.1, p = 0.029). Both groups demonstrated a significant improvement in all PROMs (p < 0.001) at the minimum 10-year follow-up (10.3 ± 0.4 years). At the time of the final follow-up, the athlete group demonstrated significantly higher scores across all of the measured PROMs (p ≤ 0.036). Athletes showed a higher cumulative PASS achievement compared with nonathletes for the HOS-ADL subscale (73% versus 50%, p = 0.033), the HOS-Sports subscale (85% versus 61%, p = 0.010), the mHHS (69% versus 43%, p = 0.013), and the VAS Pain (78% versus 51%, p = 0.006). Reoperation-free survivorship frequencies were 87.5% for athletes and 82.8% for nonathletes (p = 0.504).
CONCLUSIONS: Athletes who underwent contemporary HA for FAI showed superior PROMs and PASS achievement compared with nonathletes at the 10-year follow-up. Athletes and nonathletes showed reoperation-free survivorship frequencies of 87.5% and 82.8%, respectively.
LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39960979 | DOI:10.2106/JBJS.24.00324
J Bone Joint Surg Am. 2025 Feb 14. doi: 10.2106/JBJS.24.00164. Online ahead of print.
ABSTRACT
BACKGROUND: Delayed-onset neurologic changes (DONCs) following spinal deformity surgery are poorly understood and are often devastating.
METHODS: A retrospective review of cases from 12 hospitals was performed. The clinical and radiographic parameters of patients who experienced a new DONC after spinal deformity correction were evaluated.
RESULTS: Eighteen patients, with a mean preoperative major Cobb angle of 75° ± 24°, were included. The mean age at surgery was 13 ± 2 years, and 6 patients (33%) were male. Seven patients had temporary intraoperative neuromonitoring changes. Fourteen patients (78%) had neurologic changes within 24 hours postoperatively (range, 3 to 24 hours). Of 16 patients with blood pressure data, 8 (50%) had at least 1 documented episode of hypotension surrounding the change in neurologic status. No misaligned implants were seen on axial imaging. Fourteen patients (78%) were treated with vasopressors. Sixteen patients (89%) returned to the operating room, and 11 patients (61%) underwent implant removal. Seven patients (39%) sustained a spinal cord infarct, with only 1 (6%) experiencing recovery beyond an ASIA (American Spinal Injury Association Impairment Scale) score of B. Ten (91%) of the 11 patients without an infarct demonstrated recovery (5 patients with an ASIA score of D and 5 with a score of E).
CONCLUSIONS: A DONC is a rare complication of spinal deformity surgery. This study represents the largest documented series of DONCs and highlights the multifactorial and still poorly understood nature of this condition. The primary modifiable risk factor may be hypotension in the postoperative period: 50% of patients had a mean arterial pressure below the 5th percentile for their height, as documented around the time of the neurologic change. Eighty-six percent of patients with a spinal cord infarct had minimal neurologic recovery, whereas nearly all of the patients without an infarct did recover function. The management of this condition may include elevation of blood pressure with temporary implant removal.
LEVEL OF EVIDENCE: Therapeutic Level IV. See Instructions for Authors for a complete description of levels of evidence.
PMID:39951514 | DOI:10.2106/JBJS.24.00164
J Bone Joint Surg Am. 2025 Feb 14. doi: 10.2106/JBJS.24.00582. Online ahead of print.
ABSTRACT
BACKGROUND: Tungsten carbide rings are increasingly popular modern jewelry items. Tungsten carbide is an extremely dense, hard metal. Previously described methods to remove tungsten carbide rings include using locking pliers to compress the ring or cutting the ring with a high-speed dental drill.
METHODS: A universal mechanical testing system (MTS) machine was used to diametrically compress tungsten carbide rings 9 mm in length and 2.4 mm in thickness with a 23.4-mm outer diameter and a 21.0-mm inner diameter while measuring the force required to fracture the rings. A computer numerical control (CNC) machine was used to cut tungsten carbide rings with a diamond grinding bit with and without a flow of normal saline solution. An infrared thermal camera was used to record the temperature at the site of contact between the ring and the grinding bit and at a site one-quarter of the circumference along the ring.
RESULTS: A mean force of 3.7 ± 1.2 kN (mean ± 95% confidence interval) was required to fracture the tungsten carbide rings via diametral compression (p = 0.05). The rings failed at a mean displacement of 0.32 ± 0.13 mm (mean ± 95% confidence interval; p = 0.05). The maximum temperature observed at the site of contact between the ring and grinding bit averaged 160.2°C across cases with and without coolant. The time to reach maximum temperature and the duration of maximum temperature differed significantly between the cases with and without coolant (p = 0.0007 and p = 0.0108, respectively).
CONCLUSIONS: Tungsten carbide rings fractured with minimal displacement using a small amount of force via diametral compression. The brittle fracture pattern of the rings produced minimal comminution. Tungsten carbide rings reached extreme temperatures when cut with a high-speed diamond grinding bit despite cooling with normal saline solution.
CLINICAL RELEVANCE: Previously documented methods to remove a tungsten carbide ring include breaking the ring by compressing it with pliers or cutting it off with a high-speed dental drill. Clinicians should be aware of potential complications of current methods to remove tungsten carbide rings.
PMID:39951513 | DOI:10.2106/JBJS.24.00582
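The biomechanical results above report means with 95% confidence intervals for fracture force and displacement. For readers reproducing this kind of bench testing, the sketch below shows the standard t-distribution construction of a mean and its 95% CI from repeated measurements; the force values are hypothetical, since the abstract lists only summary statistics.

```python
# Generic mean and 95% confidence interval from repeated mechanical tests, of
# the kind summarized above. The force values are hypothetical placeholders.
import numpy as np
from scipy import stats

forces_kN = np.array([3.1, 4.2, 2.9, 4.5, 3.6, 3.9])  # hypothetical fracture forces
mean_force = forces_kN.mean()
sem = stats.sem(forces_kN)  # standard error of the mean
lower, upper = stats.t.interval(0.95, df=len(forces_kN) - 1, loc=mean_force, scale=sem)

half_width = (upper - lower) / 2
print(f"mean force = {mean_force:.1f} kN, 95% CI half-width = ±{half_width:.1f} kN")
```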
J Bone Joint Surg Am. 2025 Feb 14. doi: 10.2106/JBJS.24.00766. Online ahead of print.
ABSTRACT
BACKGROUND: Osteoporosis continues to be underdiagnosed and inadequately treated in older hip-fracture patients. Our aim was to improve the rate of osteoporosis treatment with IV bisphosphonate therapy in eligible patients admitted for hip-fracture surgery.
METHODS: The present study was designed as a quality improvement initiative using Plan-Do-Study-Act (PDSA) cycles at an academic medical center in Portland, Oregon, over 2.5 years. A protocol was developed (1) to administer IV zoledronate on postoperative day 2 to inpatients aged ≥50 years who underwent surgery for a low-energy hip fracture and (2) to formally diagnose osteoporosis during admission. The protocol was introduced across 3 care settings in a stepped-wedge manner. Outcome measures were the percentage of eligible patients who received inpatient zoledronate and the formal documented diagnosis of osteoporosis. Balance measures included fever after administration and hospital length of stay (LOS). Measures were assessed through quarterly chart review and tracked via control charts.
RESULTS: The rate of zoledronate administration significantly increased from 34.5% (29 of 84) to 74.6% (53 of 71) following the second PDSA cycle (p < 0.001). Documented osteoporosis diagnosis also significantly improved from 51.0% (53 of 104) to 85.7% (96 of 112) following the second PDSA cycle (p < 0.001). No significant differences were shown for hospital LOS, and 1 of 82 patients had medical work-up for post-infusion acute phase reaction after administration.
CONCLUSIONS: This initiative was effective at improving osteoporosis diagnosis and treatment among older hip-fracture patients at our institution. Protocol development for administering inpatient zoledronate after hip fracture is a reliable way to offer bone health care and secondary-fracture prevention to hip-fracture patients and can be adapted and implemented at other institutions.
LEVEL OF EVIDENCE: Diagnostic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39951512 | DOI:10.2106/JBJS.24.00766
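The headline improvement above (zoledronate administration rising from 29 of 84 to 53 of 71 eligible patients after the second PDSA cycle) can be checked with a simple two-proportion comparison. The abstract does not name the test used, so the z-test below is an assumption for illustration; only the counts come from the abstract.

```python
# Two-proportion z-test on the zoledronate administration rates reported above
# (29/84 before vs. 53/71 after the second PDSA cycle). The choice of test is
# an assumption; the abstract does not state which test the authors used.
from statsmodels.stats.proportion import proportions_ztest

count = [29, 53]  # patients who received inpatient zoledronate
nobs = [84, 71]   # eligible patients in each period
stat, p_value = proportions_ztest(count, nobs)
print(f"rates: {count[0]/nobs[0]:.1%} vs {count[1]/nobs[1]:.1%}, z = {stat:.2f}, p = {p_value:.2g}")
```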
J Bone Joint Surg Am. 2025 Feb 13. doi: 10.2106/JBJS.24.00880. Online ahead of print.
NO ABSTRACT
PMID:39946500 | DOI:10.2106/JBJS.24.00880
J Bone Joint Surg Am. 2025 Feb 13. doi: 10.2106/JBJS.24.00657. Online ahead of print.
ABSTRACT
BACKGROUND: Total hip arthroplasty (THA) is rarely performed in pediatric patients and can be challenging in children because of smaller-sized joints, pathomorphological changes around the hip, residual growth, and higher physical demands. Literature on THA outcomes in this unique population is scarce. We aimed to describe characteristics of pediatric patients who underwent THA, 5- and 10-year implant survival, risk factors for revision, and results of patient-reported outcome measures (PROMs) in a large national cohort.
METHODS: Primary THAs (from 2007 through 2022) in pediatric patients (11 to 18 years of age) for non-oncological indications were extracted from the Dutch Arthroplasty Register (LROI). Implant survival was assessed using Kaplan-Meier survival analyses. Functional, quality-of-life, and pain-related PROM scores were described preoperatively and at 3 and 12 months postoperatively and compared using paired t tests.
RESULTS: We included 283 THAs (161 [57%] in female patients) performed in 253 patients. The mean age at surgery (and standard deviation) was 16 ± 1.6 years. The most common indications were osteonecrosis (90 [32%] of the THA procedures), dysplasia (82 [29%]), and osteoarthritis (54 [19%]). The median follow-up was 7 years (range, 2 to 16 years). Fourteen (6% of 234) THAs were revised. The 5- and 10-year cumulative survival rates were 95% (95% confidence interval [CI], 91% to 97%) and 91% (95% CI, 84% to 95%), respectively. There was an insufficient number of events to allow for statistical analyses of potential risk factors for revision. All PROMs had improved significantly at 12 months postoperatively versus preoperatively (p < 0.001).
CONCLUSIONS: This study, the largest to date on THA in children (≤18 years of age), showed good short- and mid-term THA survival, approaching that among adults. Combined with the positive PROM results, THA appears to be an effective and satisfactory intervention in cases of debilitating pediatric hip disease. Further studies should focus on long-term survival and risk factors for implant failure.
LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.
PMID:39946439 | DOI:10.2106/JBJS.24.00657
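The 5- and 10-year implant survival figures above come from Kaplan-Meier analyses. A minimal sketch of that estimator with lifelines is shown below on synthetic follow-up data; the durations and revision indicators are hypothetical, not LROI data.

```python
# Hypothetical sketch of Kaplan-Meier implant survival, in the spirit of the
# register analysis above. Durations and revision indicators are synthetic.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(6)
n = 283
years_followed = rng.uniform(2, 16, n)  # follow-up per THA, in years
revised = rng.binomial(1, 0.06, n)      # 1 = revision observed during follow-up

kmf = KaplanMeierFitter()
kmf.fit(durations=years_followed, event_observed=revised, label="pediatric THA")

print(kmf.survival_function_.head())
print("estimated survival at 5 years: ", float(kmf.predict(5.0)))
print("estimated survival at 10 years:", float(kmf.predict(10.0)))
# kmf.confidence_interval_ holds the 95% band around the survival curve.
```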