Multimodal AI Virtual Breast Biopsies

A new study in Radiology detailed a multimodal AI solution that classifies breast lesion subtypes using mammograms, potentially reducing unnecessary biopsies and improving biopsy interpretations.

Researchers from Israel and IBM/Merative first pretrained a deep learning model with 26k digital mammograms to classify images as malignant, benign, or normal, and then used these pretrained weights to develop a lesion subtype classification model trained with mammograms and clinical data. Finally, they trained a pair of lesion classification models using digital mammograms linked to biopsy results from 2,120 women in Israel and 1,642 women in the US.
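The two-stage workflow described above (pretrain on a large image-classification task, then warm-start the smaller subtype task from those weights) can be sketched with a toy logistic-regression model. This is a minimal illustration of the transfer-learning pattern only; the feature dimensions, data, and optimizer here are assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_logreg(X, y, w=None, lr=0.1, steps=200):
    """Logistic regression via gradient descent; `w` may be warm-started."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))          # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)      # gradient step on log loss
    return w

# Stage 1: "pretrain" on a large image-level task (malignant vs. not).
X_pre = rng.normal(size=(1000, 16))
y_pre = (X_pre[:, 0] + 0.5 * X_pre[:, 1] > 0).astype(float)
w_pretrained = train_logreg(X_pre, y_pre)

# Stage 2: fine-tune on the smaller lesion-subtype task, initializing
# from the pretrained weights instead of from scratch.
X_sub = rng.normal(size=(200, 16))
y_sub = (X_sub[:, 0] > 0).astype(float)
w_finetuned = train_logreg(X_sub, y_sub, w=w_pretrained.copy())
```

In the real study the "weights" are those of a deep convolutional network and the fine-tuning task also ingests clinical data, but the warm-start pattern is the same.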

When the Israel AI model was tested against mammograms from 441 Israeli women it…

  • Predicted malignancy with a 0.88 AUC
  • Classified ductal carcinoma in situ, invasive carcinomas, and benign lesions with AUCs of 0.76, 0.85, and 0.82, respectively
  • Correctly interpreted 98.7% of malignant mammographic examinations and 74.6% of invasive carcinomas (matching three radiologists)
  • Would have prevented 13% of unnecessary biopsies and missed 1.3% of malignancies (at 99% sensitivity)
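That last bullet reflects a standard trade-off: fix an operating threshold that keeps sensitivity at 99%, then count how many benign cases score below it (biopsies avoided). Here is a minimal numpy sketch with simulated scores; the score distributions are hypothetical, not the study's data.

```python
import numpy as np

def auc(pos, neg):
    """Rank-based AUC: P(malignant score > benign score), ties = 0.5."""
    gt = (pos[:, None] > neg[None, :]).mean()
    eq = (pos[:, None] == neg[None, :]).mean()
    return gt + 0.5 * eq

def biopsies_avoided_at_sensitivity(pos, neg, sensitivity=0.99):
    """Highest threshold that still flags `sensitivity` of malignancies,
    then the fraction of benign cases scoring below it (biopsies avoided)."""
    thresh = np.quantile(pos, 1 - sensitivity)   # 1% of malignant scores fall below
    return (neg < thresh).mean()

rng = np.random.default_rng(1)
neg = rng.normal(0.0, 1.0, 2000)   # benign lesion scores (hypothetical)
pos = rng.normal(1.7, 1.0, 500)    # malignant lesion scores (hypothetical)

a = auc(pos, neg)
avoided = biopsies_avoided_at_sensitivity(pos, neg)
```

The same mechanics explain why the reported 13% biopsy reduction comes paired with a 1.3% miss rate: pushing the threshold high enough to skip more biopsies necessarily lets a few malignancies through.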

When the US AI model was tested against mammograms from 344 US women it…

  • Predicted malignancy with a lower 0.80 AUC
  • Classified ductal carcinoma in situ, invasive carcinomas, and benign lesions with lower AUCs of 0.74, 0.83, and 0.72, respectively
  • Correctly interpreted 96.8% of malignant mammographic examinations and 63% of invasive carcinomas (matching three radiologists)

The authors attributed the US model’s lower accuracy to its smaller training dataset, and noted that the two models also performed worse when tested against data from the other country (US model with Israel data, Israel model with US data) or when classifying rare lesion types.

However, they were still bullish about this approach with enough training data, and noted the future potential to add other imaging modalities and genetic information to further enhance multimodal breast cancer assessments.

The Takeaway 

We’ve historically relied on biopsy results to classify breast lesion subtypes, and that will remain true for quite a while. However, this study shows that multimodal-trained AI can extract far more information from mammograms, while potentially reducing unnecessary biopsies and improving the accuracy of the biopsies that are performed.

Multimodal NSCLC Treatment Prediction

Memorial Sloan Kettering researchers showed that data from routine diagnostic workups (imaging, pathology, genomics) could be used to predict how patients with non-small cell lung cancer (NSCLC) will respond to immunotherapy, potentially allowing more precise and effective treatment decisions.

Immunotherapy can significantly improve outcomes for patients with advanced NSCLC, and it has already “rapidly altered” the treatment landscape. 

  • However, only ~25% of advanced NSCLC patients respond to immunotherapy, and current biomarkers used to predict response have proved to be “only modestly helpful.”  

The researchers collected baseline diagnostic data from 247 patients with advanced NSCLC, including CTs, histopathology slides, and genomic sequencing. 

  • They then had domain experts curate and annotate this data, and leveraged a computational workflow to extract patient-level features (e.g., CT radiomics), before using their DyAM model to integrate the data and predict therapy response.
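The integration step can be pictured as attention-weighted fusion over per-modality feature vectors, with masking so that missing modalities (e.g., a patient without genomic sequencing) contribute nothing. The sketch below is an illustrative assumption in the spirit of that description, not MSK's DyAM implementation; the shapes and weights are hypothetical.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def fuse_predict(features, mask, attn_w, head_w):
    """Attention-weighted fusion over modalities, masking missing ones.
    features: (n_modalities, d) per-modality feature vectors
    mask:     (n_modalities,) 1 if the modality was acquired, else 0
    """
    scores = features @ attn_w                              # one logit per modality
    scores = np.where(mask.astype(bool), scores, -np.inf)   # drop missing modalities
    weights = softmax(scores)                               # missing -> weight 0
    fused = weights @ features                              # weighted feature sum
    prob = 1 / (1 + np.exp(-(fused @ head_w)))              # response probability
    return prob, weights

# Hypothetical usage: radiology, pathology, genomics; genomics missing.
rng = np.random.default_rng(2)
feats = rng.normal(size=(3, 8))
mask = np.array([1, 1, 0])
attn_w = rng.normal(size=8)
head_w = rng.normal(size=8)
p, w = fuse_predict(feats, mask, attn_w, head_w)
```

The attention weights also give a per-patient readout of which modality drove the prediction, which is one reason this style of fusion is attractive for clinical data.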

Using diagnostic data from the same 247 patients, the multimodal DyAM system predicted immunotherapy response with a 0.80 AUC.

  • That’s far higher than the current FDA-cleared predictive biomarkers – tumor mutational burden and PD-L1 immunohistochemistry score (AUCs: 0.61 & 0.73) – and all imaging approaches examined in the study (AUCs: 0.62 to 0.64).

The Takeaway

Although MSK’s multimodal immunotherapy response research is still in its very early stages and would be difficult to clinically implement, this study “represents a proof of principle” that integrating diagnostic data that is already being captured could improve treatment predictions – and treatment outcomes.

This study also adds to the recent momentum we’re seeing with multimodal diagnostics and treatment guidance, driven by efforts from academia and highly funded AI startups like SOPHiA GENETICS and Owkin.

Get every issue of The Imaging Wire, delivered right to your inbox.
