A Systematic Benchmarking Analysis of Transfer Learning for Medical Image Analysis

Mohammad Reza Hosseinzadeh Taher, Fatemeh Haghighi, Ruibin Feng, Michael B. Gotway, Jianming Liang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

29 Scopus citations

Abstract

Transfer learning from supervised ImageNet models has been frequently used in medical image analysis. Yet, no large-scale evaluation has been conducted to benchmark the efficacy of newly-developed pre-training techniques for medical image analysis, leaving several important questions unanswered. As the first step in this direction, we conduct a systematic study on the transferability of models pre-trained on iNat2021, the most recent large-scale fine-grained dataset, and 14 top self-supervised ImageNet models on 7 diverse medical tasks in comparison with the supervised ImageNet model. Furthermore, we present a practical approach to bridge the domain gap between natural and medical images by continually (pre-)training supervised ImageNet models on medical images. Our comprehensive evaluation yields new insights: (1) pre-trained models on fine-grained data yield distinctive local representations that are more suitable for medical segmentation tasks, (2) self-supervised ImageNet models learn holistic features more effectively than supervised ImageNet models, and (3) continual pre-training can bridge the domain gap between natural and medical images. We hope that this large-scale open evaluation of transfer learning can direct the future research of deep learning for medical imaging. As open science, all codes and pre-trained models are available on our GitHub page https://github.com/JLiangLab/BenchmarkTransferLearning.
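The following is a minimal sketch of the transfer-learning setup the paper benchmarks: fine-tuning an ImageNet-pretrained backbone on a medical-image classification target. It assumes PyTorch with torchvision >= 0.13; the class count, hyperparameters, and placeholder data are illustrative assumptions, not the authors' configuration (their actual code is in the linked GitHub repository).

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50, ResNet50_Weights

num_classes = 14  # placeholder number of target classes for the medical task

# Load a supervised ImageNet-pretrained backbone. Other pre-trainings studied
# in the paper (e.g., iNat2021 or self-supervised ImageNet checkpoints) would
# be loaded analogously via load_state_dict on their released weights.
model = resnet50(weights=ResNet50_Weights.IMAGENET1K_V1)

# Replace the ImageNet classification head with a task-specific head.
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Fine-tune all layers end to end on the target task.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
images = torch.randn(8, 3, 224, 224)          # placeholder batch of images
labels = torch.randint(0, num_classes, (8,))  # placeholder labels

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

The continual (pre-)training approach described in the abstract follows the same pattern, except the ImageNet-initialized backbone is first further pre-trained on unlabeled or labeled medical images before this fine-tuning step.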

Original language: English (US)
Title of host publication: Domain Adaptation and Representation Transfer, and Affordable Healthcare and AI for Resource Diverse Global Health - 3rd MICCAI Workshop, DART 2021, and 1st MICCAI Workshop, FAIR 2021, Held in Conjunction with MICCAI 2021, Proceedings
Editors: Shadi Albarqouni, M. Jorge Cardoso, Qi Dou, Konstantinos Kamnitsas, Bishesh Khanal, Islem Rekik, Nicola Rieke, Debdoot Sheet, Sotirios Tsaftaris, Daguang Xu, Ziyue Xu
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 3-13
Number of pages: 11
ISBN (Print): 9783030877217
State: Published - 2021
Event: 3rd MICCAI Workshop on Domain Adaptation and Representation Transfer, DART 2021, and the 1st MICCAI Workshop on Affordable Healthcare and AI for Resource Diverse Global Health, FAIR 2021, held in conjunction with the 24th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2021 - Virtual, Online
Duration: Sep 27, 2021 - Oct 1, 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12968 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 3rd MICCAI Workshop on Domain Adaptation and Representation Transfer, DART 2021, and the 1st MICCAI Workshop on Affordable Healthcare and AI for Resource Diverse Global Health, FAIR 2021, held in conjunction with the 24th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2021
City: Virtual, Online
Period: 9/27/21 - 10/1/21

Keywords

  • ImageNet pre-training
  • Self-supervised learning
  • Transfer learning

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
