Iterative Image Translation for Unsupervised Domain Adaptation

Sachin Chhabra, Hemanth Venkateswara, Baoxin Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Scopus citations

Abstract

In this paper, we propose an image-translation-based unsupervised domain adaptation approach that iteratively trains an image translation network and a classification network using each other. In Phase A, the classification network is used to guide the image translation to preserve the content and generate images. In Phase B, the generated images are used to train the classification network. With each step, the classification network and the generator improve each other to learn the target domain representation. Detailed analysis and experiments demonstrate the strength of our approach.
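The abstract describes an alternating two-phase training loop. The sketch below is a minimal, illustrative PyTorch rendering of that alternation, not the authors' implementation: the toy network definitions, the content-preservation loss used in Phase A, and the pseudo-labelling of translated images in Phase B are all assumptions made for illustration.

```python
# Minimal sketch of the alternating Phase A / Phase B loop (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Translator(nn.Module):
    """Toy image-translation network (assumed: a tiny conv encoder/decoder)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):
    """Toy classification network (assumed: conv features + linear head)."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1)
        )
        self.head = nn.Linear(16, num_classes)
    def forward(self, x):
        return self.head(self.features(x).flatten(1))

G, C = Translator(), Classifier()
opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_C = torch.optim.Adam(C.parameters(), lr=1e-4)

def training_step(x_src, y_src, x_tgt):
    # Phase A: the classifier (frozen here) guides the translator so the
    # translated target image keeps the content the classifier responds to.
    for p in C.parameters():
        p.requires_grad_(False)
    x_fake = G(x_tgt)
    # Assumed content-preservation signal: keep class predictions consistent
    # between the original target image and its translation.
    loss_G = F.kl_div(
        F.log_softmax(C(x_fake), dim=1),
        F.softmax(C(x_tgt), dim=1).detach(),
        reduction="batchmean",
    )
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
    for p in C.parameters():
        p.requires_grad_(True)

    # Phase B: the generated (translated) images train the classifier; here
    # they are mixed with labelled source data and given pseudo-labels
    # (an assumption, for illustration).
    with torch.no_grad():
        x_fake = G(x_tgt)
        pseudo = C(x_fake).argmax(dim=1)
    loss_C = F.cross_entropy(C(x_src), y_src) + F.cross_entropy(C(x_fake), pseudo)
    opt_C.zero_grad(); loss_C.backward(); opt_C.step()
    return loss_G.item(), loss_C.item()

# Smoke test with random tensors standing in for source/target batches.
x_s, y_s = torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,))
x_t = torch.randn(4, 3, 32, 32)
print(training_step(x_s, y_s, x_t))
```

Repeating the two phases lets the translator and classifier improve each other: better translations give the classifier more useful training data, and a better classifier gives the translator a stronger content-preservation signal.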

Original language: English (US)
Title of host publication: MULL 2021 - Proceedings of the 1st Workshop on Multimedia Understanding with Less Labeling, co-located with ACM MM 2021
Publisher: Association for Computing Machinery, Inc
Pages: 37-44
Number of pages: 8
ISBN (Electronic): 9781450386814
DOIs
State: Published - Oct 24 2021
Event: 1st Workshop on Multimedia Understanding with Less Labeling, MULL 2021, co-located with ACM MM 2021 - Virtual, Online, China
Duration: Oct 24 2021 → …

Publication series

Name: MULL 2021 - Proceedings of the 1st Workshop on Multimedia Understanding with Less Labeling, co-located with ACM MM 2021

Conference

Conference: 1st Workshop on Multimedia Understanding with Less Labeling, MULL 2021, co-located with ACM MM 2021
Country/Territory: China
City: Virtual, Online
Period: 10/24/21 → …

Keywords

  • generative domain adaptation
  • image translation
  • source-like target
  • ternary feature alignment
  • unsupervised domain adaptation

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Computer Vision and Pattern Recognition
  • Software
  • Computer Science Applications
