Learning better deep features for the prediction of occult invasive disease in ductal carcinoma in situ through transfer learning

Bibo Shi, Rui Hou, Maciej A. Mazurowski, Lars J. Grimm, Yinhao Ren, Jeffrey R. Marks, Lorraine M. King, Carlo Maley, E. Shelley Hwang, Joseph Y. Lo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Purpose: To determine whether domain transfer learning can improve the performance of deep features extracted from digital mammograms using a pre-trained deep convolutional neural network (CNN) in predicting occult invasive disease for patients with ductal carcinoma in situ (DCIS) on core needle biopsy. Method: We collected digital mammography magnification views for 140 patients with DCIS at biopsy, 35 of whom were subsequently upstaged to invasive cancer. We used deep CNN models pre-trained on two natural image data sets (ImageNet and DTD) and one mammographic data set (INbreast) as feature extractors, hypothesizing that these data sets are increasingly similar to our target task and will therefore yield progressively better deep-feature representations of DCIS lesions. Through a statistical pooling strategy, three sets of deep features were extracted from the lesion areas using the CNNs at different levels of convolutional layers. A logistic regression classifier was then trained to predict which tumors contain occult invasive disease. Generalization performance was assessed and compared using repeated random sub-sampling validation and receiver operating characteristic (ROC) curve analysis. Result: The best-performing deep features came from the CNN model pre-trained on INbreast; the proposed classifier using this feature set achieved a median classification performance of ROC-AUC equal to 0.75, significantly better (p ≤ 0.05) than that of the deep features extracted using the ImageNet data set (ROC-AUC = 0.68). Conclusion: Transfer learning helps learn a better representation of deep features and improves the prediction of occult invasive disease in DCIS.
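The pipeline the abstract describes (statistical pooling of convolutional feature maps into fixed-length descriptors, a classifier scored by ROC-AUC) can be sketched in plain Python. The paper does not specify which pooling statistics were used, so the mean/std/max recipe below is an assumption for illustration, not the authors' implementation:

```python
import statistics

def pool_features(feature_maps):
    """Statistical pooling: collapse each (variable-size) convolutional
    feature map into fixed-length summary statistics. The mean/std/max
    recipe is one common choice, assumed here for illustration."""
    pooled = []
    for fmap in feature_maps:  # fmap: flat list of activations for one channel
        pooled.extend([statistics.mean(fmap),
                       statistics.pstdev(fmap),
                       max(fmap)])
    return pooled

def roc_auc(labels, scores):
    """ROC-AUC via its rank identity: the probability that a randomly
    chosen positive case is scored above a randomly chosen negative one,
    counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two feature maps pooled into a 6-number descriptor,
# and AUC for hypothetical classifier scores (1 = upstaged to invasive).
descriptor = pool_features([[0.1, 0.9, 0.4], [0.0, 0.2]])
print(len(descriptor))                                   # 6
print(roc_auc([0, 0, 1, 1], [0.10, 0.40, 0.35, 0.80]))  # 0.75
```

In the paper's setup the pooled descriptors would feed a logistic regression classifier, and this AUC computation would be repeated over random train/test splits to get the median ROC-AUC reported in the results.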

Original language: English (US)
Title of host publication: Medical Imaging 2018
Subtitle of host publication: Computer-Aided Diagnosis
Publisher: SPIE
Volume: 10575
ISBN (Electronic): 9781510616394
DOI: 10.1117/12.2293594
State: Published - Jan 1 2018
Event: Medical Imaging 2018: Computer-Aided Diagnosis - Houston, United States
Duration: Feb 12 2018 to Feb 15 2018


Keywords

  • Breast cancer
  • deep learning
  • mammogram
  • transfer learning

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Biomaterials
  • Atomic and Molecular Physics, and Optics
  • Radiology, Nuclear Medicine and Imaging

Cite this

Shi, B., Hou, R., Mazurowski, M. A., Grimm, L. J., Ren, Y., Marks, J. R., ... Lo, J. Y. (2018). Learning better deep features for the prediction of occult invasive disease in ductal carcinoma in situ through transfer learning. In Medical Imaging 2018: Computer-Aided Diagnosis (Vol. 10575). [105752R] SPIE. https://doi.org/10.1117/12.2293594
