
Deep learning model predicts impending AAA rupture in symptomatic patients

Key Takeaway
Consider this predictive model promising but requiring prospective validation before clinical use.

This retrospective cohort study developed and validated an interpretable multimodal deep learning model to predict impending rupture in symptomatic abdominal aortic aneurysms (AAAs). The study included 263 hemodynamically stable patients with symptomatic AAAs: 230 in the development cohort and 33 in an independent temporal test set. The model combined sequential CTA slices with six key clinical biomarkers through a bidirectional cross-attention mechanism built on a ResNet-50 image encoder, and was compared against two pragmatic clinical baselines: a clinical-rule model and a CTA-sign model.
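The study does not publish code for its bidirectional cross-attention fusion, but the general idea can be sketched. In the toy example below, a sequence of CTA slice embeddings and a set of projected biomarker embeddings each attend to the other; the embedding size, the mean-pooling, and the concatenated head are all hypothetical choices for illustration, not the authors' architecture.

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention: each query row attends over all keys."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
d = 64                                # shared embedding size (hypothetical)
img = rng.standard_normal((40, d))    # 40 CTA slice embeddings from an image encoder
bio = rng.standard_normal((6, d))     # 6 clinical biomarkers, each projected to d dims

# Bidirectional cross-attention: image tokens attend to biomarkers and vice versa,
# then both enriched streams are pooled and concatenated for a classifier head.
img2bio = attention(img, bio, bio)    # imaging features enriched with clinical context
bio2img = attention(bio, img, img)    # clinical features enriched with imaging context
fused = np.concatenate([img2bio.mean(axis=0), bio2img.mean(axis=0)])
```

In a trained network the queries, keys, and values would come from learned linear projections; this sketch only shows how the two modalities exchange information before classification.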

In the matched development test set (n=30), the model achieved an area under the curve (AUC) of 0.898, with sensitivity of 93.3% and negative predictive value (NPV) of 93.3%. In the independent temporal validation cohort (n=33), performance remained strong, with an AUC of 0.880, sensitivity of 92.9%, and NPV of 87.5%. The model outperformed both clinical baselines, which achieved AUCs of 0.751 (clinical-rule) and 0.778 (CTA-sign). Gradient-weighted Class Activation Mapping (Grad-CAM) visualization was anatomically plausible in 78.8% of cases.
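The three reported metrics are standard and easy to compute from any set of labels and model outputs. The helper below uses illustrative toy data, not study data: AUC via the Mann-Whitney formulation (probability a random positive outscores a random negative, ties counting half), plus sensitivity and NPV from a thresholded confusion matrix.

```python
def auc(labels, scores):
    """AUC = P(random positive outscores random negative); ties count 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sensitivity_npv(labels, preds):
    """Sensitivity = TP/(TP+FN); NPV = TN/(TN+FN)."""
    tp = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 1)
    fn = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 0)
    tn = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 0)
    return tp / (tp + fn), tn / (tn + fn)

# Illustrative toy data (not from the study)
labels = [1, 1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1]
preds = [1 if s >= 0.5 else 0 for s in scores]
print(auc(labels, scores))        # 0.9375
print(sensitivity_npv(labels, preds))  # (0.75, 0.75)
```

A high NPV is what matters for the rule-out use case the study emphasizes: it bounds how often a "no impending rupture" call is wrong.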

Safety and tolerability data were not reported. The primary limitation is the need for prospective validation before clinical implementation. The study suggests this model may offer a clinically relevant improvement in emergency triage safety and efficiency over current practice, but its retrospective design and small validation cohort warrant caution. Generalizability beyond the study population and clinical implementation without prospective validation should not be assumed.

Study Details

Study type: Cohort
Evidence: Level 3
Published: Apr 2026
Objective
In hemodynamically stable patients with symptomatic abdominal aortic aneurysms (AAA), timely diagnosis of impending rupture remains a critical challenge. To address this, we developed and validated an interpretable multimodal deep learning model to assess rupture risk and support emergency decision-making.

Methods
This retrospective cohort study included 263 symptomatic AAA patients, with the most recent year's cases (n = 33) as an independent temporal test set. In the 230-patient development cohort, 75 impending rupture cases were matched 1:1 with 75 stable controls using propensity score for age, sex, and maximum aortic diameter. We developed a multimodal deep learning model that combines sequential CTA slices with six key clinical biomarkers through a bidirectional cross-attention (BCA) mechanism built on a ResNet-50 image encoder. For interpretability, we used Gradient-weighted Class Activation Mapping (Grad-CAM) and conducted pre-specified sensitivity analyses assessing robustness against endpoint decision-dependence, treatment-related data leakage, and domain shifts.

Results
In the matched development test set (n = 30), our multimodal model achieved an area under the curve (AUC) of 0.898 with sensitivity and negative predictive value (NPV) both at 93.3%, offering a high safety margin for ruling out rupture. It markedly outperformed two pragmatic clinical baselines (clinical-rule model AUC: 0.751; CTA-sign model AUC: 0.778). This strong performance persisted in the independent temporal validation cohort (n = 33), where it attained an AUC of 0.880, sensitivity of 92.9%, and NPV of 87.5%. The proposed BCA fusion outperformed alternative architectures, and Grad-CAM visualizations were anatomically plausible in 78.8% of cases, supporting model interpretability.

Conclusion
We developed and temporally validated an interpretable multimodal model that integrates CTA and clinical biomarkers to enable rapid AAA rupture risk stratification, offering a clinically relevant improvement in the safety and efficiency of emergency triage over current practice, pending prospective validation.
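The 1:1 propensity-score matching on age, sex, and maximum aortic diameter described in the Methods can be sketched as below. The abstract does not specify the matching algorithm, so greedy nearest-neighbor matching on the fitted score is an assumption here; the logistic fit, learning rate, and data are all illustrative.

```python
import numpy as np

def fit_propensity(X, case, lr=0.1, steps=2000):
    """Logistic regression P(case | X) fit by plain gradient descent (illustrative)."""
    X1 = np.column_stack([np.ones(len(X)), X])   # add intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))
        w -= lr * X1.T @ (p - case) / len(X)
    return 1.0 / (1.0 + np.exp(-X1 @ w))         # propensity scores in (0, 1)

def greedy_match(scores, case):
    """Greedy 1:1 nearest-neighbor match of each case to an unused control."""
    cases = [i for i in range(len(scores)) if case[i] == 1]
    pool = {i for i in range(len(scores)) if case[i] == 0}
    pairs = []
    for i in cases:
        j = min(pool, key=lambda c: abs(scores[c] - scores[i]))
        pairs.append((i, j))
        pool.remove(j)
    return pairs

rng = np.random.default_rng(1)
n = 40
X = np.column_stack([rng.normal(70, 8, n),       # age (years)
                     rng.integers(0, 2, n),      # sex (0/1)
                     rng.normal(6.0, 1.2, n)])   # max aortic diameter (cm)
case = np.array([1.0] * 15 + [0.0] * 25)         # 15 cases, 25 candidate controls
ps = fit_propensity((X - X.mean(0)) / X.std(0), case)
pairs = greedy_match(ps, case)
```

Matching on the propensity score rather than raw covariates lets each rupture case be paired with a stable control of similar overall risk profile, which is what makes the matched test-set comparison fair.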