Learning better deep features for the prediction of occult invasive disease in ductal carcinoma in situ through transfer learning

Bibo Shi, Rui Hou, Maciej A. Mazurowski, Lars J. Grimm, Yinhao Ren, Jeffrey R. Marks, Lorraine M. King, Carlo Maley, E. Shelley Hwang, Joseph Y. Lo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

Purpose: To determine whether domain transfer learning can improve the performance of deep features extracted from digital mammograms using a pre-trained deep convolutional neural network (CNN) in predicting occult invasive disease for patients with ductal carcinoma in situ (DCIS) on core needle biopsy.

Method: In this study, we collected digital mammography magnification views for 140 patients with DCIS at biopsy, 35 of whom were subsequently upstaged to invasive cancer. We utilized deep CNN models pre-trained on two natural image data sets (ImageNet and DTD) and one mammographic data set (INbreast) as feature extractors, hypothesizing that these data sets are increasingly similar to our target task and will lead to better deep-feature representations of DCIS lesions. Through a statistical pooling strategy, three sets of deep features were extracted from the lesion areas using the CNNs at different levels of convolutional layers. A logistic regression classifier was then trained to predict which tumors contain occult invasive disease. Generalization performance was assessed and compared using repeated random sub-sampling validation and receiver operating characteristic (ROC) curve analysis.

Result: The best-performing deep features came from the CNN model pre-trained on INbreast, and the proposed classifier using this set of deep features achieved a median classification performance of ROC-AUC = 0.75, which is significantly better (p ≤ 0.05) than that of deep features extracted using the ImageNet data set (ROC-AUC = 0.68).

Conclusion: Transfer learning is helpful for learning a better representation of deep features, and improves the prediction of occult invasive disease in DCIS.
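The evaluation pipeline the abstract describes (statistical pooling of CNN feature maps over the lesion area, a logistic regression classifier, and repeated random sub-sampling validation scored by ROC-AUC) can be sketched as follows. This is a minimal illustration, not the authors' code: the pre-trained CNN and the mammography data are replaced by synthetic activations, and the pooling statistics (per-channel mean, standard deviation, and maximum) are an assumed instance of a statistical pooling strategy.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedShuffleSplit

rng = np.random.default_rng(0)

def pool_feature_maps(fmaps):
    """Statistical pooling: summarize each channel's activation map
    by its mean, standard deviation, and maximum (assumed statistics)."""
    # fmaps: (channels, H, W) CNN activations over the lesion area
    flat = fmaps.reshape(fmaps.shape[0], -1)
    return np.concatenate([flat.mean(axis=1), flat.std(axis=1), flat.max(axis=1)])

# Synthetic stand-in for CNN activations of 140 lesions, 35 of them upstaged
# (hypothetical 64-channel 7x7 feature maps; positives get a small mean shift)
n_cases, n_pos, C, H, W = 140, 35, 64, 7, 7
y = np.array([1] * n_pos + [0] * (n_cases - n_pos))
X = np.stack([pool_feature_maps(rng.normal(0.3 * y[i], 1.0, (C, H, W)))
              for i in range(n_cases)])

# Repeated random sub-sampling validation of a logistic regression classifier
splitter = StratifiedShuffleSplit(n_splits=20, test_size=0.25, random_state=0)
aucs = []
for train_idx, test_idx in splitter.split(X, y):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    aucs.append(roc_auc_score(y[test_idx], clf.predict_proba(X[test_idx])[:, 1]))

print(f"median ROC-AUC over {len(aucs)} splits: {np.median(aucs):.2f}")
```

With real features, `pool_feature_maps` would be applied to activations taken from a chosen convolutional layer of the pre-trained network; the rest of the loop is unchanged.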

Original language: English (US)
Title of host publication: Medical Imaging 2018
Subtitle of host publication: Computer-Aided Diagnosis
Publisher: SPIE
Volume: 10575
ISBN (Electronic): 9781510616394
DOI: https://doi.org/10.1117/12.2293594
State: Published - Jan 1 2018
Event: Medical Imaging 2018: Computer-Aided Diagnosis - Houston, United States
Duration: Feb 12 2018 - Feb 15 2018

Other

Other: Medical Imaging 2018: Computer-Aided Diagnosis
Country: United States
City: Houston
Period: 2/12/18 - 2/15/18

Keywords

  • Breast cancer
  • deep learning
  • mammogram
  • transfer learning

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Biomaterials
  • Atomic and Molecular Physics, and Optics
  • Radiology, Nuclear Medicine and Imaging

Cite this

Shi, B., Hou, R., Mazurowski, M. A., Grimm, L. J., Ren, Y., Marks, J. R., King, L. M., Maley, C., Hwang, E. S., & Lo, J. Y. (2018). Learning better deep features for the prediction of occult invasive disease in ductal carcinoma in situ through transfer learning. In Medical Imaging 2018: Computer-Aided Diagnosis (Vol. 10575). [105752R] SPIE. https://doi.org/10.1117/12.2293594