JBJS

Radiation Shielding Effect of Surgical Loupes Compared with Lead-Lined Glasses and Plastic Face Shields

J Bone Joint Surg Am. 2025 Mar 20. doi: 10.2106/JBJS.24.00642. Online ahead of print.

ABSTRACT

BACKGROUND: Fluoroscopy plays a crucial role in various medical procedures, especially in orthopaedic and spinal surgery. However, concerns have arisen regarding ocular radiation exposure given its association with posterior lens opacities and cataracts. Protective measures are essential to mitigate ocular radiation exposure. During spine surgery, loupes are frequently used but often lack lead lining. The purpose of the present study was to assess the effect of surgical loupes, as compared with lead glasses and plastic face shields, on ocular radiation exposure.

METHODS: Dosimeters were positioned anterior (unshielded) and posterior (shielded) to the lens of each type of eyewear: lead glasses, surgical loupes, and plastic face shields. Eyewear/dosimeters were exposed directly to the horizontal beam of a C-arm for 2 minutes of continuous fluoroscopy. This was repeated 20 times for each type of eyewear (40 dosimeter readings per eyewear type; 120 readings overall). Radiation doses were modeled with use of generalized estimating equations with a Gaussian distribution and identity link function. Separate models were employed for each outcome and included eyewear category (lead glasses, loupes, plastic shield) and dosimeter position (anterior/unshielded versus posterior/shielded) as predictors.
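
The abstract describes the dose model only at a high level. As a minimal sketch of how a generalized estimating equation with a Gaussian distribution and identity link could be fit in Python, assuming a hypothetical long-format table of dosimeter readings (column names dose_mrem, eyewear, position, and trial_id are illustrative, not from the study):

```python
# Sketch only: GEE with Gaussian family (identity link), as described above,
# fit to hypothetical repeated dosimeter readings clustered by exposure trial.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("dosimeter_readings.csv")  # hypothetical file, not study data

model = smf.gee(
    "dose_mrem ~ C(eyewear) * C(position)",  # eyewear category x dosimeter position
    groups="trial_id",                       # readings clustered within each 2-minute exposure
    data=df,
    family=sm.families.Gaussian(),           # Gaussian distribution, identity link by default
)
print(model.fit().summary())
```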

RESULTS: Radiation dose was significantly lower in posterior compared with anterior dosimeters for lead glasses (0.00 versus 1,689.80 mRem; p < 0.001) and for loupes (20.27 versus 1,705.95 mRem; p < 0.001). The difference for plastic face shields did not reach significance (1,539.75 versus 1,701.45 mRem; p = 0.06). Lead glasses offered the most protection, followed by surgical loupes and then plastic shields, when comparing the shielded dosimeter readings (0.00 versus 20.27 versus 1,539.75 mRem; p < 0.001 for all comparisons). There was no significant difference in radiation dose for dosimeters placed anterior to lead glasses, loupes, and plastic face shields (1,689.80 versus 1,705.95 versus 1,701.45 mRem; p = 0.99).

CONCLUSIONS: Lead glasses were most effective (∼100% reduction), followed by surgical loupes (97%), whereas plastic face shields showed no significant reduction in radiation dose. Surgical loupes can substantially reduce ocular radiation exposure.

CLINICAL RELEVANCE: Surgical loupes may offer ocular radiation protection.

PMID:40112043 | DOI:10.2106/JBJS.24.00642

Weight Loss Before Total Joint Arthroplasty Using a Remote Dietitian and a Mobile Application: A Multicenter Randomized Controlled Trial

J Bone Joint Surg Am. 2025 Mar 20. doi: 10.2106/JBJS.24.00838. Online ahead of print.

ABSTRACT

BACKGROUND: Many surgeons recommend weight loss for patients with obesity before total joint arthroplasty (TJA), but few studies have evaluated weight loss interventions. This study compared weight loss using a remote dietitian and a mobile application (app) with weight loss using standard care for patients with severe obesity before TJA.

METHODS: This multicenter randomized controlled trial included 60 subjects with a body mass index (BMI) of 40 to 47 kg/m² who had been scheduled for primary total hip or knee arthroplasty from September 2019 to January 2023. The mean age was 61 years, 67% were women, and the mean BMI was 44 kg/m². The control subjects (n = 29) received standard care; the intervention subjects (n = 31) completed video calls with dietitians and used a mobile app for 12 weeks preoperatively. Weights and surveys were collected at baseline and 12 weeks, with 87% follow-up. Weight loss, patient-reported outcomes, complications, revisions, and reoperations were compared. The mean follow-up was 1.8 years.

RESULTS: The intervention subjects lost more weight (-4.1 versus -2.1 kg, p = 0.22) and had larger decreases in BMI (-1.4 versus -0.9 kg/m², p = 0.36) than the controls, but not significantly so. The intervention subjects had higher odds of achieving a BMI of <40 kg/m² (odds ratio = 1.9, p = 0.44), but again not significantly so. There were no significant differences in the mean change in the Hip disability and Osteoarthritis Outcome Score, the Knee injury and Osteoarthritis Outcome Score, or the Lower Extremity Activity Scale score. At baseline, only 11% had seen a dietitian in the last 3 months. Most subjects (83%) felt that video calls were helpful. There were no differences in complications between the groups; there was a patellar fracture in the control group and a deep venous thromboembolism in the intervention group.

CONCLUSIONS: A preoperative weight loss intervention using a dietitian and a mobile app was feasible and viewed favorably among patients. Remote dietitians and mobile apps may address gaps in access to obesity treatment before TJA. While the intervention subjects lost more weight and were more likely to achieve a BMI of <40 kg/m², the differences were not significant. More intensive interventions may be needed to achieve enough weight loss to produce clinically important improvements in TJA outcomes.

LEVEL OF EVIDENCE: Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.

PMID:40112039 | DOI:10.2106/JBJS.24.00838

The Minimal Clinically Important Difference (MCID) for Total Joint Arthroplasty Outcome Measures Varies Substantially by Calculation Method

J Bone Joint Surg Am. 2025 Mar 20. doi: 10.2106/JBJS.24.00916. Online ahead of print.

ABSTRACT

BACKGROUND: As the United States health-care system transitions to a value-based model, the minimal clinically important difference (MCID) has become an important metric for assessing perceived benefit in clinical settings. However, there is substantial ambiguity surrounding the MCID value because the choice of calculation method can markedly change the clinical interpretation of surgical success.

METHODS: A total of 1,113 patients who underwent either total knee arthroplasty (TKA) or total hip arthroplasty (THA) between June 2021 and June 2023 and completed their patient-reported outcomes (the KOOS JR [Knee injury and Osteoarthritis Outcome Score for Joint Replacement] or HOOS JR [Hip disability and Osteoarthritis Outcome Score for Joint Replacement]) preoperatively and at 1 year postoperatively were reviewed for this study. The MCID values for the HOOS JR and KOOS JR were determined using 16 statistically appropriate methods, and the resulting MCID values were applied to the study group to assess how differences in methods changed the number of patients who met the MCID at 1 year postoperatively.
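
The 16 calculation methods are not enumerated in the abstract. As a rough illustration of the two families referenced in the Results, the sketch below computes one common distribution-based estimate (one-half the standard deviation of the change score) and one common anchor-based estimate (the change-score cutoff that maximizes Youden's J against a dichotomized improvement anchor); the column names and anchor definition are assumptions, not the study's methods.

```python
# Illustrative only: one distribution-based and one anchor-based MCID estimate
# for the KOOS JR, computed from a hypothetical table of preoperative and
# 1-year scores plus a yes/no "improved" anchor.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve

df = pd.read_csv("koos_jr_scores.csv")            # hypothetical file
change = df["koos_jr_1yr"] - df["koos_jr_preop"]  # 1-year change score

# Distribution-based: half the standard deviation of the change score.
mcid_half_sd = 0.5 * change.std(ddof=1)

# Anchor-based: cutoff on the change score that maximizes Youden's J
# (sensitivity + specificity - 1) against the improvement anchor.
fpr, tpr, thresholds = roc_curve(df["improved"], change)
mcid_anchor = thresholds[np.argmax(tpr - fpr)]

print(f"Distribution-based (0.5 SD): {mcid_half_sd:.1f}")
print(f"Anchor-based (ROC/Youden):   {mcid_anchor:.1f}")
```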

RESULTS: The study cohort consisted of 570 patients who underwent TKA and 543 who underwent THA. The overall cohort was 62.2% female, had a mean age of 69.3 ± 8.3 years, and was 92.3% Caucasian, 2.9% African American, and 4.8% other race (i.e., Asian, multiracial, or "other"). The MCID values varied substantially among the methods evaluated. The mean MCID was 11.5 ± 9.2 (range, 0.5 to 36.6) for the KOOS JR and 12.2 ± 8.9 (range, 0.6 to 34.3) for the HOOS JR. Distribution-based methods yielded smaller but more variable MCID values, whereas anchor-based methods yielded larger but more consistent values.

CONCLUSIONS: Different statistical approaches resulted in substantial variation in the MCID threshold value, which affected the number of patients who reached the MCID. This study demonstrates the ambiguity of the MCID and casts some doubt regarding its utility for assessing the surgical benefit of total joint arthroplasty.

LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40112037 | DOI:10.2106/JBJS.24.00916

Preseason Patellar Tendon Thickness Predicts Symptomatic Patellar Tendinopathy in Male NCAA Division I Basketball Players

J Bone Joint Surg Am. 2025 Mar 18. doi: 10.2106/JBJS.24.00680. Online ahead of print.

ABSTRACT

BACKGROUND: The purpose of this study was to evaluate whether increased anteroposterior (AP) thickness of the proximal patellar tendon at preseason evaluation is predictive of symptomatic patellar tendinopathy and associated sequelae.

METHODS: Thirty-one male National Collegiate Athletic Association (NCAA) Division I basketball players voluntarily participated in this study (n = 52 tendons from 27 athletes after application of exclusion criteria, with evaluation at preseason, midseason, and postseason time points). At each time point, Victorian Institute of Sport Assessment-Patellar Tendon (VISA-P) scores, patellar tendon tenderness, patellar tendon AP thickness, and the presence of a proximal patellar tendon hypoechoic region were evaluated. Measurement of patellar tendon AP thickness and the identification of hypoechoic regions were performed using a portable ultrasound device. Outcome measures included a proximal patellar tendon hypoechoic region, a trip to the training room (TTR), time-loss symptomatic patellar tendinopathy (TLPT), and patellar tendon rupture. Covariates evaluated in the multivariable regression model included body mass index and a patient-reported history of patellar tendinopathy (α = 0.05).

RESULTS: The mean preseason tendon thickness was 4.78 ± 1.22 mm. Nine (17.3%) of the tendons were symptomatic to the point of requiring a TTR. Preseason tendon thickness was associated with increased odds of a TTR (adjusted odds ratio [aOR] = 3.68 [95% confidence interval (CI) = 1.73 to 7.81]; p < 0.01). The predicted probability of a TTR was 86.0% with a preseason tendon thickness of 8 mm versus 3.4% with a preseason tendon thickness of 4 mm. Preseason tendon thickness was also predictive of TLPT (aOR = 1.96 [95% CI = 1.03 to 3.71]; p = 0.04). Preseason VISA-P scores were not predictive of a TTR (p = 0.66) or TLPT (p = 0.60).
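
As a back-of-the-envelope check of the reported predicted probabilities (and ignoring the model's covariate adjustment), an adjusted odds ratio of 3.68 per millimeter implies that moving from 4 mm to 8 mm of thickness adds 4 × ln(3.68) to the log-odds of a TTR; starting from the reported 3.4% probability at 4 mm, this lands near the reported 86.0% at 8 mm:

```python
# Rough check of the reported predicted probabilities, ignoring covariate
# adjustment: shift the log-odds by ln(aOR) per millimeter of thickness.
import math

a_or = 3.68    # adjusted odds ratio per mm of preseason tendon thickness
p_4mm = 0.034  # reported predicted probability of a TTR at 4 mm

logit_4mm = math.log(p_4mm / (1 - p_4mm))
logit_8mm = logit_4mm + 4 * math.log(a_or)   # 4 mm increase in thickness
p_8mm = 1 / (1 + math.exp(-logit_8mm))

print(f"Approximate probability at 8 mm: {p_8mm:.1%}")  # ~87%, close to the reported 86.0%
```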

CONCLUSIONS: Increased patellar tendon thickness on preseason ultrasound is predictive of symptomatic patellar tendinopathy and associated sequelae during an NCAA Division I basketball season. Ultrasound identification of at-risk individuals may allow triage toward additional physical therapy and activity modification for these athletes to prevent progression to irreversible patellar tendon disease. These data support the use of ultrasound as a screening tool for elite jumping athletes.

LEVEL OF EVIDENCE: Prognostic Level II. See Instructions for Authors for a complete description of levels of evidence.

PMID:40100945 | DOI:10.2106/JBJS.24.00680

Anterior Glenohumeral Instability: Clinical Anatomy, Clinical Evaluation, Imaging, Nonoperative and Operative Management, and Postoperative Rehabilitation

J Bone Joint Surg Am. 2025 Jan 1;107(1):81-92. doi: 10.2106/JBJS.24.00340. Epub 2024 Nov 14.

ABSTRACT

➢ Anterior glenohumeral instability is a complex orthopaedic problem that requires a detailed history, a thorough physical examination, and a meticulous review of advanced imaging in order to make individualized treatment decisions and optimize patient outcomes.
➢ Nonoperative management of primary instability events can be considered in low-demand patients, including elderly individuals or younger, recreational athletes not participating in high-risk activities, and select in-season athletes. Recurrence can result in increased severity of soft-tissue and osseous damage, further increasing the complexity of subsequent surgical management.
➢ Surgical stabilization following primary anterior instability is recommended in young athletes who have additional risk factors for failure, including participation in high-risk sports, hyperlaxity, and presence of bipolar bone loss, defined as the presence of both glenoid (anteroinferior glenoid) and humeral head (Hill-Sachs deformity) bone loss.
➢ Several surgical treatment options exist, including arthroscopic Bankart repair with or without additional procedures such as remplissage, open Bankart repair, and osseous restoration procedures, including the Latarjet procedure.
➢ Favorable results can be expected following arthroscopic Bankart repair with minimal (<13.5%) bone loss and on-track Hill-Sachs lesions following a primary instability event. However, adjunct procedures such as remplissage should be performed for off-track lesions and should be considered in the setting of subcritical glenoid bone loss, select high-risk patients, and near-track lesions.
➢ Bone-grafting of anterior glenoid defects, including autograft and allograft options, should be considered in cases with >20% glenoid bone loss.

PMID:40100014 | DOI:10.2106/JBJS.24.00340

The Impact of Sustained Outreach Efforts on Gender Diversity in Orthopaedic Surgery

J Bone Joint Surg Am. 2025 Jan 1;107(1):e1. doi: 10.2106/JBJS.24.00210. Epub 2024 Nov 22.

ABSTRACT

BACKGROUND: Orthopaedic surgery is one of the least gender-diverse surgical specialties, with only 7% women in practice and 20.4% in residency. There are numerous "leaks" in the talent pipeline for women orthopaedic surgeons that lead to the field as a whole falling short of a critical mass (30%) of women. For over a decade, a network of professional and nonprofit organizations, including the Ruth Jackson Orthopaedic Society, The Perry Initiative, Nth Dimensions, and others, has focused on targeted outreach and mentoring of women in the talent pipeline; these organizations report a positive effect of these interventions on recruitment and retention of women in the field.

METHODS: In this study, we applied mathematical models to estimate the historic and future impacts of current outreach and hands-on exposure efforts to recruit more women into orthopaedic surgery. The model uses published data on program reach and impact from one of the largest and longest-running programs, The Perry Initiative, and combines them with Association of American Medical Colleges (AAMC) and American Academy of Orthopaedic Surgeons (AAOS) Census data. These data were used to forecast the percentage of women among incoming postgraduate year 1 (PGY1) residents and among practicing orthopaedic surgeons.
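
The abstract does not specify the model's structure. Purely as a toy illustration of why the practicing-workforce percentage lags the residency entry percentage by decades, the sketch below runs a simple stock-and-flow projection in which each year a retiring cohort (drawn from the historical gender mix) is replaced by an entering cohort at an assumed steady-state share of women; every parameter is an assumption, not study data:

```python
# Toy stock-and-flow projection, not the authors' model: the practicing
# workforce is treated as 30 annual cohorts, each replaced after an assumed
# 30-year career, with new cohorts entering at an assumed share of women.
CAREER_YEARS = 30          # assumed average career length
ENTRY_SHARE_WOMEN = 0.28   # assumed steady-state PGY1 share of women

cohorts = [0.07] * CAREER_YEARS   # historical cohorts assumed 7% women

for year in range(2028, 2056):
    cohorts.pop(0)                     # oldest cohort retires
    cohorts.append(ENTRY_SHARE_WOMEN)  # new cohort enters practice
    if year % 5 == 0:
        share = sum(cohorts) / CAREER_YEARS
        print(f"{year}: {share:.1%} women in practice")
```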

RESULTS: The results of the mathematical models suggest that the increase in women in the PGY1 population from 14.7% to 20.9% from 2008 to 2022 is at least partially attributable to current mentoring and outreach efforts by The Perry Initiative and others. Assuming continued intervention at present levels, the PGY1 residency class will reach peak diversity of 28% women in 2028, and the field as a whole will reach a steady-state composition of approximately 25% practicing women orthopaedic surgeons by 2055.

CONCLUSIONS: The results of this study indicate that outreach and exposure efforts, such as those of The Perry Initiative, are having a substantive impact on gender diversity in orthopaedic surgery. With continued intervention, the field as a whole should approach a critical mass of women within a generation. The collective efforts of the orthopaedics community over the past decade to close the gender gap serve as a guidebook for other professions seeking to diversify.

PMID:40100013 | DOI:10.2106/JBJS.24.00210

Thoracolumbar Fracture: A Natural History Study of Survival Following Injury

J Bone Joint Surg Am. 2025 Jan 1;107(1):73-79. doi: 10.2106/JBJS.24.00706. Epub 2024 Nov 19.

ABSTRACT

BACKGROUND: Fractures of the thoracic and lumbar spine are increasingly common. Although it is known that such fractures may elevate the risk of near-term morbidity, the natural history of patients who sustain such injuries remains poorly described. We sought to characterize the natural history of patients treated for thoracolumbar fractures and to understand clinical and sociodemographic factors associated with survival.

METHODS: Patients treated for acute thoracic or lumbar spine fractures within a large academic health-care network between 2015 and 2021 were identified. Clinical, radiographic, and mortality data were obtained from medical records and administrative charts. Survival was assessed using Kaplan-Meier curves. We used multivariable logistic regression to evaluate factors associated with survival, while adjusting for confounders. Results were expressed as odds ratios (ORs) and 95% confidence intervals (CIs).
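
As a minimal sketch of the survival and regression machinery described above, applied to a hypothetical patient table (column names such as followup_months, died, death_1yr, age, male, iss, gcs, and cci are illustrative, not the study's variables):

```python
# Sketch only: Kaplan-Meier survival plus a multivariable logistic model for
# 1-year mortality, fit to a hypothetical thoracolumbar fracture cohort.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import KaplanMeierFitter

df = pd.read_csv("tl_fracture_cohort.csv")   # hypothetical file, not study data

# Kaplan-Meier curve for the whole cohort.
km = KaplanMeierFitter()
km.fit(durations=df["followup_months"], event_observed=df["died"])
km.plot_survival_function()

# Multivariable logistic regression for death within 1 year; exponentiated
# coefficients give adjusted odds ratios like those reported in the Results.
model = smf.logit("death_1yr ~ age + male + iss + gcs + cci", data=df).fit()
print(np.exp(model.params).round(2))      # adjusted odds ratios
print(np.exp(model.conf_int()).round(2))  # 95% confidence intervals
```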

RESULTS: The study included 717 patients (median age, 66 years; 59.8% male; 69% non-Hispanic White). The mortality rate was 7.0% (n = 50), 16.2% (n = 116), and 20.4% (n = 146) at 3, 12, and 24 months following injury, respectively. In adjusted analysis, patients who died within the first year following injury were more likely to be older (OR = 1.03; 95% CI = 1.01 to 1.05) and male (OR = 1.67; 95% CI = 1.05 to 2.69). A higher Injury Severity Score, lower Glasgow Coma Scale score, and higher Charlson Comorbidity Index at presentation were also influential factors. The final model explained 81% (95% CI = 81% to 83%) of the variation in survival.

CONCLUSIONS: We identified a previously underappreciated fact: thoracolumbar fractures are associated with a mortality risk comparable with that of hip fractures. The risk of mortality is greatest in elderly patients and those with multiple comorbidities. The results of our model can be used in patient and family counseling, informed decision-making, and resource allocation to mitigate the potential risk of near-term mortality in high-risk individuals.

LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40100012 | DOI:10.2106/JBJS.24.00706

Clinical Outcomes After 1 and 2-Level Lumbar Total Disc Arthroplasty: 1,187 Patients with 7 to 21-Year Follow-up

J Bone Joint Surg Am. 2025 Jan 1;107(1):53-65. doi: 10.2106/JBJS.23.00735. Epub 2024 Nov 22.

ABSTRACT

BACKGROUND: In this study, we expand the supportive evidence for total disc arthroplasty (TDA) with results up to 21 years in a large cohort of patients who received a semiconstrained ball-and-socket lumbar prosthesis. The objectives of the study were to compare the results for 1 versus 2-level surgeries and to evaluate whether prior surgery at the index level(s) impacts clinical outcomes.

METHODS: From 1999 to 2013, 1,187 patients with chronic lumbar degenerative disc disease (DDD) underwent lumbar TDA, of whom 772 underwent a 1-level procedure and 415 underwent a 2-level procedure. A total of 373 (31.4%) of the 1,187 patients had prior index-level surgery. Patients were evaluated preoperatively; at 3, 6, 12, 18, and 24 months postoperatively; and yearly thereafter. The follow-up duration ranged from 7 to 21 years (mean, 11 years and 8 months). Collected data included radiographic, neurological, and physical assessments, as well as self-evaluations using the Oswestry Disability Index (ODI) and visual analog scale (VAS) for back and leg pain. Perioperative data points, complication rates, and reoperation or revision rates were also assessed. Patients were divided into 4 groups: 1-level TDA with no prior surgery at the index level, 1-level TDA with prior surgery, 2-level TDA with no prior surgery, and 2-level TDA with prior surgery.

RESULTS: All groups showed a dramatic reduction in the ODI at 3 months postoperatively and maintained these scores over time. Although VAS pain did not diminish to its final level as rapidly for patients with prior surgery, there was no significant difference between the groups in terms of pain reduction at 24 months postoperatively. Of the 1,187 patients, 49 (4.13%) required either a new surgery at another level or a revision or reoperation at the index level. Rates were too low in all groups to compare them statistically. Total TDA revision and adjacent-level surgery rates over 7 to 21 years were very low (0.67% and 1.85%, respectively).

CONCLUSIONS: This study demonstrates the robust long-term clinical success of 1 and 2-level lumbar TDA as assessed at 7 to 21 years postoperatively in one of the largest evaluated cohorts of patients with TDA. Patients had dramatic and maintained reductions in disability and pain scores over time and low rates of index-level revision or reoperation and adjacent-level surgery relative to published long-term fusion data. Additionally, patients who underwent 1-level lumbar TDA and those who underwent 2-level TDA demonstrated equivalent improvement, as did patients with prior surgery at the index level and those with no prior surgery.

LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40100011 | PMC:PMC11665976 | DOI:10.2106/JBJS.23.00735

Intraoperative Facet Joint Block Reduces Pain After Oblique Lumbar Interbody Fusion: A Double-Blinded, Randomized, Placebo-Controlled Clinical Trial

J Bone Joint Surg Am. 2025 Jan 1;107(1):16-25. doi: 10.2106/JBJS.23.01480. Epub 2024 Nov 20.

ABSTRACT

BACKGROUND: Oblique lumbar interbody fusion (OLIF) results in less tissue damage than other surgical approaches, but immediate postoperative pain still occurs. Notably, facet joint widening occurs in the vertebral body after OLIF. We hypothesized that the application of a facet joint block to the area of widening would relieve facet joint pain. The purpose of this study was to evaluate the analgesic effects of such injections on postoperative pain.

METHODS: This double-blinded, placebo-controlled study randomized patients into 2 groups. Patients assigned to the active group received an intra-articular injection of a compound mixture of bupivacaine and triamcinolone, whereas patients in the placebo group received an equivalent volume of normal saline solution injection. Back and dominant leg pain were evaluated with use of a visual analog scale (VAS) at 12, 24, 48, and 72 hours postoperatively. Clinical outcomes were evaluated preoperatively and at 6 months postoperatively with use of the Oswestry Disability Index (ODI) and VAS for back and dominant leg pain.

RESULTS: Of the 61 patients who were included, 31 were randomized to the placebo group and 30 were randomized to the active group. Postoperative fentanyl consumption from patient-controlled analgesia was higher in the placebo group than in the active group at up to 36 hours postoperatively (p < 0.001) and decreased gradually in both groups. VAS back pain scores were significantly higher in the placebo group than in the active group at up to 48 hours postoperatively. On average, patients in the active group had a higher satisfaction score (p = 0.038) and were discharged 1.3 days earlier than those in the placebo group.

CONCLUSIONS: The use of an intraoperative facet joint block decreased pain perception after OLIF, reducing opioid consumption and the severity of postoperative pain. This effect was also associated with a reduced length of stay.

LEVEL OF EVIDENCE: Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.

PMID:40100010 | DOI:10.2106/JBJS.23.01480

Defining the Cost of Arthroscopic Rotator Cuff Repair: A Multicenter, Time-Driven Activity-Based Costing and Cost Optimization Investigation

J Bone Joint Surg Am. 2025 Jan 1;107(1):9-15. doi: 10.2106/JBJS.23.01351. Epub 2024 Nov 20.

ABSTRACT

BACKGROUND: Rotator cuff repair (RCR) is a frequently performed outpatient orthopaedic surgery, with substantial financial implications for health-care systems. Time-driven activity-based costing (TDABC) is a method for nuanced cost analysis and is a valuable tool for strategic health-care decision-making. The aim of this study was to apply the TDABC methodology to RCR procedures to identify specific avenues to optimize cost-efficiency within the health-care system in 2 critical areas: (1) the reduction of variability in the episode duration, and (2) the standardization of suture anchor acquisition costs.

METHODS: Using a multicenter, retrospective design, this study incorporates data from all patients who underwent an RCR surgical procedure at 1 of 4 academic tertiary health systems across the United States. Data were extracted from Avant-Garde Health's Care Measurement platform and were analyzed utilizing TDABC methodology. Cost analysis was performed using 2 primary metrics: the opportunity costs arising from a possible reduction in episode duration variability, and the potential monetary savings achievable through the standardization of suture anchor costs.
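
TDABC assigns cost to a care episode by multiplying the time each resource spends on the episode by that resource's capacity cost rate, then adding direct costs such as implants and consumables. A minimal sketch of that arithmetic is shown below; the resources, minutes, and rates are invented for illustration and are not the study's data:

```python
# Minimal TDABC arithmetic for one hypothetical RCR episode:
# episode cost = sum over resources of (minutes used x capacity cost rate).
# All minutes and $/min rates below are invented for illustration.
resources = {
    # resource: (minutes used, capacity cost rate in $/min)
    "preoperative nursing": (45, 1.10),
    "operating room":       (95, 9.50),
    "surgeon":              (95, 6.00),
    "anesthesia":           (110, 3.20),
    "PACU / ward":          (240, 1.40),
}

episode_cost = sum(minutes * rate for minutes, rate in resources.values())
anchor_cost = 3 * 250.00   # hypothetical: 3 suture anchors at $250 each (direct cost)

print(f"Episode-duration cost: ${episode_cost:,.2f}")
print(f"Total with consumables: ${episode_cost + anchor_cost:,.2f}")
```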

RESULTS: In this study, 921 RCR cases performed at 4 institutions had a mean episode duration cost of $4,094 ± $1,850. There was a significant threefold cost variability between the 10th percentile ($2,282) and the 90th percentile ($6,833) (p < 0.01). The mean episode duration was 7.1 hours. The greatest variability in episode duration came from time spent in the post-acute care unit and on the ward after the surgical procedure. By reducing the episode duration variability, it was estimated that up to 640 care-hours could be saved annually at a single hospital. Likewise, standardizing suture anchor acquisition costs could generate direct savings totaling $217,440 across the hospitals.

CONCLUSIONS: This multicenter study offers valuable insights into RCR cost as a function of care pathways and suture anchor cost. It outlines avenues for achieving cost-savings and operational efficiency. These findings can serve as a foundational basis for developing health-economics models.

LEVEL OF EVIDENCE: Economic and Decision Analysis Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40100009 | DOI:10.2106/JBJS.23.01351

Identifying Risk Factors from Preoperative MRI Measurements for Failure of Primary ACL Reconstruction: A Nested Case-Control Study with 5-Year Follow-up

J Bone Joint Surg Am. 2025 Mar 10. doi: 10.2106/JBJS.23.01137. Online ahead of print.

ABSTRACT

BACKGROUND: Identifying patients at high risk for failure of primary anterior cruciate ligament reconstruction (ACLR) on the basis of preoperative magnetic resonance imaging (MRI) measurements has received considerable attention. In this study, we aimed to identify potential risk factors for primary ACLR failure from preoperative MRI measurements and to determine optimal cutoff values for clinical relevance.

METHODS: Retrospective review and follow-up were conducted in this nested case-control study of patients who underwent primary single-bundle ACLR using hamstring tendon autograft at our institution from August 2016 to January 2018. The failed ACLR group included 72 patients with graft failure within 5 years after primary ACLR, while the control group included 144 propensity score-matched patients without failure during the 5-year follow-up period. Preoperative MRI measurements were compared between the 2 groups. Receiver operating characteristic (ROC) curve analyses were conducted to determine the optimal cutoff values for the significant risk factors. Odds ratios (ORs) were calculated, and survival analyses were performed to evaluate the clinical relevance of the determined thresholds.
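
As a minimal sketch of how such a cutoff can be derived and then evaluated, assuming a hypothetical table with a continuous MRI measurement (e.g., IRTS in millimeters) and a binary failure outcome (column names irts_mm and failed are illustrative):

```python
# Sketch only: choose a measurement cutoff by maximizing Youden's J on the ROC
# curve, then compute the odds ratio for failure at or above that cutoff.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score, roc_curve

df = pd.read_csv("aclr_cohort.csv")   # hypothetical file, not study data
y, x = df["failed"], df["irts_mm"]

fpr, tpr, thresholds = roc_curve(y, x)
best = np.argmax(tpr - fpr)           # Youden's J = sensitivity + specificity - 1
cutoff = thresholds[best]
print(f"AUC {roc_auc_score(y, x):.3f}, cutoff {cutoff:.1f} mm, "
      f"sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%}")

# 2x2 table at the cutoff and the corresponding odds ratio.
a = ((x >= cutoff) & (y == 1)).sum()  # at/above cutoff, failed
b = ((x >= cutoff) & (y == 0)).sum()  # at/above cutoff, intact
c = ((x < cutoff) & (y == 1)).sum()   # below cutoff, failed
d = ((x < cutoff) & (y == 0)).sum()   # below cutoff, intact
print(f"Odds ratio: {(a * d) / (b * c):.2f}")
```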

RESULTS: A greater lateral femoral condyle ratio (LFCR) (p = 0.0076), greater posterior tibial slope in the lateral compartment (LPTS) (p = 0.0002), and greater internal rotational tibial subluxation (IRTS) (p < 0.0001) were identified in the failed ACLR group compared with the control group. ROC analyses showed that the optimal cutoff values for IRTS and LPTS were 5.8 mm (area under the curve [AUC], 0.708; specificity, 89.6%; sensitivity, 41.7%) and 8.5° (AUC, 0.655; specificity, 71.5%; sensitivity, 62.5%), respectively. Patients who met the IRTS threshold (OR, 6.14; hazard ratio [HR], 3.87) or the LPTS threshold (OR, 4.19; HR, 3.07) demonstrated a higher risk of primary ACLR failure and were significantly more likely to experience ACLR failure within a shorter time period.

CONCLUSIONS: Preoperative MRI measurements of increased IRTS, LPTS, and LFCR were identified as risk factors for primary ACLR failure. The optimal cutoff value of 5.8 mm for IRTS and 8.5° for LPTS could be valuable in the perioperative management of primary ACLR.

LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40063685 | DOI:10.2106/JBJS.23.01137

Operative Treatment of Flail Chest Injuries Does Not Reduce Pain or In-Hospital Opioid Requirements: Results from a Multicenter Randomized Controlled Trial

J Bone Joint Surg Am. 2025 Mar 7. doi: 10.2106/JBJS.24.01099. Online ahead of print.

ABSTRACT

BACKGROUND: A previous randomized controlled trial (RCT) evaluating operative versus nonoperative treatment of acute flail chest injuries revealed more ventilator-free days in operatively treated patients who had been ventilated at the time of randomization. It has been suggested that surgery for these injuries may also improve a patient's pain and function. Our goal was to perform a secondary analysis of the previous RCT to evaluate pain and postinjury opioid requirements in patients with operatively and nonoperatively treated unstable chest wall injuries.

METHODS: We analyzed data from a previous multicenter RCT that had been conducted from 2011 to 2019. Patients who had sustained acute, unstable chest wall injuries were randomized to operative or nonoperative treatment. In-hospital pain medication logs were evaluated, and daily morphine milligram equivalents (MMEs) were calculated. The patients' symptoms were also assessed, including generalized pain, chest wall pain, chest wall tightness, and shortness of breath. Additionally, patients completed the 36-Item Short Form Health Survey (SF-36), and they were followed for 1 year postinjury.
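
Daily MMEs are typically tallied by converting each administered opioid dose to its morphine equivalent with a published conversion factor and summing over the day. The sketch below uses commonly cited oral conversion factors (e.g., those published by the CDC); the trial's actual conversion table is not given in the abstract, so treat the factors and example doses as illustrative:

```python
# Illustrative daily MME tally using commonly cited oral conversion factors;
# the trial's actual conversion table is not specified in the abstract.
MME_FACTOR = {
    "morphine": 1.0,
    "oxycodone": 1.5,
    "hydrocodone": 1.0,
    "hydromorphone": 4.0,
    "tramadol": 0.1,
}

def daily_mme(doses):
    """doses: list of (drug, milligrams administered) tuples for one day."""
    return sum(mg * MME_FACTOR[drug] for drug, mg in doses)

# Hypothetical one-day medication log for a single patient.
day = [("oxycodone", 10), ("oxycodone", 10), ("hydromorphone", 2), ("tramadol", 50)]
print(f"Daily MME: {daily_mme(day):.0f}")  # 10*1.5 + 10*1.5 + 2*4 + 50*0.1 = 43
```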

RESULTS: In the original trial, 207 patients were analyzed: 99 patients received nonoperative treatment, and 108 received operative treatment. There were no significant differences in pain medication usage between the 2 groups at any of the examined time points (p = 0.477). There were no significant differences between the 2 groups in generalized pain, chest wall pain, chest wall tightness, or shortness of breath at any time postinjury. There were also no significant differences in the SF-36 scores.

CONCLUSIONS: This secondary analysis of a previous RCT suggested that operative treatment of patients with flail chest injuries does not reduce in-hospital daily opioid requirements. There were also no reductions in generalized pain, chest wall pain, chest wall tightness, or shortness of breath with operative treatment. The SF-36 scores were similar for both groups. Further work is needed to identify those patients most likely to benefit from operative treatment of flail chest injuries.

LEVEL OF EVIDENCE: Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.

PMID:40053576 | DOI:10.2106/JBJS.24.01099
