Abstract

User-adaptive image retrieval/recommendation has drawn considerable research interest in recent years, owing to the rapid development of Web applications in which retrieving images is a key enabling task. Existing challenges include the lack of user-adaptive training data, the ambiguity of user queries, and the need for real-time system interactivity. This paper proposes a hybrid learning strategy that fuses knowledge from both pointwise and pairwise training data into one framework for attribute-based, user-adaptive image retrieval. Under this framework, we develop an online learning algorithm that updates the ranking model based on user feedback. Furthermore, we derive a kernelized form of the framework, allowing easy application of kernel techniques. The proposed approach is evaluated on two image datasets, and experimental results show that it achieves clear performance gains over ranking and zero-shot learning from either type of training data alone. In addition, the online learning algorithm delivers much better performance than batch learning given the same elapsed running time, or achieves comparable performance in much less time.
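The abstract does not give the paper's exact objective, but the core idea of fusing pointwise labels (per-image relevance scores) and pairwise labels (preference comparisons) can be sketched as a single combined loss minimized by online stochastic updates. The sketch below assumes a linear scorer over attribute vectors, squared loss for pointwise labels, and a hinge loss for pairwise preferences, with a hypothetical weight `lam` balancing the two; all names and the specific losses are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_sgd_step(w, x_point, y_point, x_pos, x_neg, lam=0.5, lr=0.05):
    """One online update on an illustrative combined objective:
    pointwise squared loss (w.x - y)^2 plus a pairwise hinge
    max(0, 1 - w.(x_pos - x_neg)), weighted by lam.
    """
    # gradient of the pointwise squared loss
    grad = 2.0 * (w @ x_point - y_point) * x_point
    # subgradient of the pairwise hinge, active only when the
    # preferred image does not outscore the other by margin 1
    diff = x_pos - x_neg
    if 1.0 - w @ diff > 0.0:
        grad = grad - lam * diff
    return w - lr * grad

# toy data: 5-dimensional attribute vectors where attribute 0
# drives both the relevance label and the pairwise preference
d = 5
w = np.zeros(d)
for _ in range(200):
    x = rng.normal(size=d)
    y = float(x[0] > 0)                    # pointwise relevance label
    x_pos = rng.normal(size=d) + np.eye(d)[0]  # preferred image
    x_neg = rng.normal(size=d) - np.eye(d)[0]  # less-preferred image
    w = hybrid_sgd_step(w, x, y, x_pos, x_neg)

print(w[0])  # weight on the informative attribute after training
```

Because each update touches only one pointwise example and one preference pair, the model can be refreshed immediately as user feedback arrives, which is the practical appeal of the online setting the abstract describes.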

Original language: English (US)
Title of host publication: ICMR 2015 - Proceedings of the 2015 ACM International Conference on Multimedia Retrieval
Publisher: Association for Computing Machinery, Inc
Pages: 67-74
Number of pages: 8
ISBN (Print): 9781450332743
DOIs
State: Published - Jun 22 2015
Event: 5th ACM International Conference on Multimedia Retrieval, ICMR 2015 - Shanghai, China
Duration: Jun 23 2015 - Jun 26 2015


Keywords

  • Adaptive image retrieval
  • Attribute learning
  • Learning to rank

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Software
  • Computer Vision and Pattern Recognition
