9 Citations (Scopus)

Abstract

Multiple-instance learning (MIL) is a unique learning problem in which training data labels are available only for collections of objects (called bags) instead of individual objects (called instances). A plethora of approaches have been developed to solve this problem in recent years. Popular methods include diverse density, MILIS and DD-SVM. While widely used, these methods, particularly those in computer vision, have attempted fairly sophisticated solutions tailored to particular configurations of the MIL space. In this paper, we analyze the MIL feature space using modified versions of traditional non-parametric techniques such as the Parzen window and k-nearest-neighbour, and develop a learning approach employing distances to the k nearest neighbours of a point in the feature space. We show that these methods work as well as, if not better than, most recently published methods on benchmark datasets. We compare and contrast our analysis with the well-established diverse-density approach and its variants in recent literature, using benchmark datasets including the Musk, Andrews' and Corel datasets, along with a diabetic retinopathy pathology diagnosis dataset. Experimental results demonstrate that, while enjoying an intuitive interpretation and supporting fast learning, these methods have the potential to deliver improved performance even for complex data arising from real-world applications.
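
As a rough illustration of the kind of pipeline the abstract describes (representing each bag by distances from its instances to their k nearest neighbours in a reference set of instances, then training an ordinary classifier on those bag-level features), the sketch below shows one way this could look in Python. The reference set, the averaging over instances, the choice of k, the toy data, and the SVM classifier are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: bag-level features from k-nearest-neighbour distances,
# loosely following the idea in the abstract. Dataset layout, k, the
# reference set, and the SVM are illustrative assumptions.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def bag_knn_features(bags, reference_instances, k=3):
    """Map each bag (an array of instances) to a fixed-length vector:
    the average, over its instances, of the distances to their k
    nearest reference instances."""
    nn = NearestNeighbors(n_neighbors=k).fit(reference_instances)
    features = []
    for bag in bags:
        dists, _ = nn.kneighbors(bag)        # (n_instances, k) distances
        features.append(dists.mean(axis=0))  # average over the bag's instances
    return np.vstack(features)

# Toy data: each bag is an (n_instances, n_dims) array with one bag label.
rng = np.random.default_rng(0)
train_bags = [rng.normal(size=(rng.integers(3, 8), 5)) for _ in range(40)]
train_labels = rng.integers(0, 2, size=40)

# Pool all training instances as the reference set (an assumption).
reference = np.vstack(train_bags)

X_train = bag_knn_features(train_bags, reference, k=3)
clf = SVC(kernel="rbf").fit(X_train, train_labels)

test_bags = [rng.normal(size=(5, 5)) for _ in range(10)]
X_test = bag_knn_features(test_bags, reference, k=3)
print(clf.predict(X_test))
```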

Original language: English (US)
Title of host publication: Proceedings - 2015 IEEE International Conference on Computer Vision, ICCV 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2605-2613
Number of pages: 9
Volume: 11-18-December-2015
ISBN (Electronic): 9781467383912
DOIs: https://doi.org/10.1109/ICCV.2015.299
State: Published - Feb 17 2016
Event: 15th IEEE International Conference on Computer Vision, ICCV 2015 - Santiago, Chile
Duration: Dec 11 2015 - Dec 18 2015

Other

Other: 15th IEEE International Conference on Computer Vision, ICCV 2015
Country: Chile
City: Santiago
Period: 12/11/15 - 12/18/15

Fingerprint

Pathology
Computer vision
Labels

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition

Cite this

Venkatesan, R., Chandakkar, P. S., & Li, B. (2016). Simpler non-parametric methods provide as good or better results to multiple-instance learning. In Proceedings - 2015 IEEE International Conference on Computer Vision, ICCV 2015 (Vol. 11-18-December-2015, pp. 2605-2613). [7410656] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICCV.2015.299

Simpler non-parametric methods provide as good or better results to multiple-instance learning. / Venkatesan, Ragav; Chandakkar, Parag Shridhar; Li, Baoxin.

Proceedings - 2015 IEEE International Conference on Computer Vision, ICCV 2015. Vol. 11-18-December-2015. Institute of Electrical and Electronics Engineers Inc., 2016. p. 2605-2613. 7410656.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Venkatesan, R, Chandakkar, PS & Li, B 2016, Simpler non-parametric methods provide as good or better results to multiple-instance learning. in Proceedings - 2015 IEEE International Conference on Computer Vision, ICCV 2015. vol. 11-18-December-2015, 7410656, Institute of Electrical and Electronics Engineers Inc., pp. 2605-2613, 15th IEEE International Conference on Computer Vision, ICCV 2015, Santiago, Chile, 12/11/15. https://doi.org/10.1109/ICCV.2015.299
Venkatesan R, Chandakkar PS, Li B. Simpler non-parametric methods provide as good or better results to multiple-instance learning. In Proceedings - 2015 IEEE International Conference on Computer Vision, ICCV 2015. Vol. 11-18-December-2015. Institute of Electrical and Electronics Engineers Inc. 2016. p. 2605-2613. 7410656 https://doi.org/10.1109/ICCV.2015.299
Venkatesan, Ragav ; Chandakkar, Parag Shridhar ; Li, Baoxin. / Simpler non-parametric methods provide as good or better results to multiple-instance learning. Proceedings - 2015 IEEE International Conference on Computer Vision, ICCV 2015. Vol. 11-18-December-2015 Institute of Electrical and Electronics Engineers Inc., 2016. pp. 2605-2613
@inproceedings{c7743de015f8491eb35a9b1a766c18da,
title = "Simpler non-parametric methods provide as good or better results to multiple-instance learning",
abstract = "Multiple-instance learning (MIL) is a unique learning problem in which training data labels are available only for collections of objects (called bags) instead of individual objects (called instances). A plethora of approaches have been developed to solve this problem in recent years. Popular methods include diverse density, MILIS and DD-SVM. While widely used, these methods, particularly those in computer vision, have attempted fairly sophisticated solutions tailored to particular configurations of the MIL space. In this paper, we analyze the MIL feature space using modified versions of traditional non-parametric techniques such as the Parzen window and k-nearest-neighbour, and develop a learning approach employing distances to the k nearest neighbours of a point in the feature space. We show that these methods work as well as, if not better than, most recently published methods on benchmark datasets. We compare and contrast our analysis with the well-established diverse-density approach and its variants in recent literature, using benchmark datasets including the Musk, Andrews' and Corel datasets, along with a diabetic retinopathy pathology diagnosis dataset. Experimental results demonstrate that, while enjoying an intuitive interpretation and supporting fast learning, these methods have the potential to deliver improved performance even for complex data arising from real-world applications.",
author = "Ragav Venkatesan and Chandakkar, {Parag Shridhar} and Baoxin Li",
year = "2016",
month = "2",
day = "17",
doi = "10.1109/ICCV.2015.299",
language = "English (US)",
volume = "11-18-December-2015",
pages = "2605--2613",
booktitle = "Proceedings - 2015 IEEE International Conference on Computer Vision, ICCV 2015",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
address = "United States",

}

TY - GEN

T1 - Simpler non-parametric methods provide as good or better results to multiple-instance learning

AU - Venkatesan, Ragav

AU - Chandakkar, Parag Shridhar

AU - Li, Baoxin

PY - 2016/2/17

Y1 - 2016/2/17

N2 - Multiple-instance learning (MIL) is a unique learning problem in which training data labels are available only for collections of objects (called bags) instead of individual objects (called instances). A plethora of approaches have been developed to solve this problem in recent years. Popular methods include diverse density, MILIS and DD-SVM. While widely used, these methods, particularly those in computer vision, have attempted fairly sophisticated solutions tailored to particular configurations of the MIL space. In this paper, we analyze the MIL feature space using modified versions of traditional non-parametric techniques such as the Parzen window and k-nearest-neighbour, and develop a learning approach employing distances to the k nearest neighbours of a point in the feature space. We show that these methods work as well as, if not better than, most recently published methods on benchmark datasets. We compare and contrast our analysis with the well-established diverse-density approach and its variants in recent literature, using benchmark datasets including the Musk, Andrews' and Corel datasets, along with a diabetic retinopathy pathology diagnosis dataset. Experimental results demonstrate that, while enjoying an intuitive interpretation and supporting fast learning, these methods have the potential to deliver improved performance even for complex data arising from real-world applications.

AB - Multiple-instance learning (MIL) is a unique learning problem in which training data labels are available only for collections of objects (called bags) instead of individual objects (called instances). A plethora of approaches have been developed to solve this problem in recent years. Popular methods include diverse density, MILIS and DD-SVM. While widely used, these methods, particularly those in computer vision, have attempted fairly sophisticated solutions tailored to particular configurations of the MIL space. In this paper, we analyze the MIL feature space using modified versions of traditional non-parametric techniques such as the Parzen window and k-nearest-neighbour, and develop a learning approach employing distances to the k nearest neighbours of a point in the feature space. We show that these methods work as well as, if not better than, most recently published methods on benchmark datasets. We compare and contrast our analysis with the well-established diverse-density approach and its variants in recent literature, using benchmark datasets including the Musk, Andrews' and Corel datasets, along with a diabetic retinopathy pathology diagnosis dataset. Experimental results demonstrate that, while enjoying an intuitive interpretation and supporting fast learning, these methods have the potential to deliver improved performance even for complex data arising from real-world applications.

UR - http://www.scopus.com/inward/record.url?scp=84973864011&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84973864011&partnerID=8YFLogxK

U2 - 10.1109/ICCV.2015.299

DO - 10.1109/ICCV.2015.299

M3 - Conference contribution

VL - 11-18-December-2015

SP - 2605

EP - 2613

BT - Proceedings - 2015 IEEE International Conference on Computer Vision, ICCV 2015

PB - Institute of Electrical and Electronics Engineers Inc.

ER -