Robust two-view external calibration by combining lines and scale invariant point features

Xiaolong Zhang, Jin Zhou, Baoxin Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

In this paper we present a new approach to automatic external calibration of two camera views under general motion, based on both line and point features. Detected lines are classified as either vertical or horizontal, and these lines are used extensively to determine the camera pose. First, the rotation is estimated directly from the line features using a novel algorithm. Then, normalized point features are used to compute the translation from the epipolar constraint. Compared with point-feature-based approaches, the proposed method handles images with little texture well. Our method also bypasses the sophisticated post-processing stage typically employed by other line-feature-based approaches. Experiments show that, although our approach is simple to implement, its performance is reliable in practice.
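The second stage described in the abstract, recovering translation from the epipolar constraint once the rotation is known, admits a compact linear solution. The sketch below is a hypothetical illustration of that idea (not the authors' code): each normalized correspondence (x1, x2) must satisfy x2ᵀ[t]ₓR x1 = 0, which by the scalar triple product is t · ((R x1) × x2) = 0, a homogeneous linear system in t solved by SVD, yielding t up to scale.

```python
import numpy as np

def translation_from_epipolar(R, pts1, pts2):
    """Recover the translation direction t (up to scale) given the
    rotation R and normalized homogeneous image points pts1, pts2
    (both N x 3 arrays of correspondences).

    Each correspondence contributes one linear equation
        t . ((R x1) x x2) = 0,
    derived from the epipolar constraint x2^T [t]_x R x1 = 0.
    The null vector of the stacked system is found via SVD.
    Hypothetical sketch, not the paper's exact algorithm.
    """
    A = np.cross(pts1 @ R.T, pts2)      # rows: (R x1) x x2
    _, _, Vt = np.linalg.svd(A)
    t = Vt[-1]                          # null-space direction
    return t / np.linalg.norm(t)
```

With noise-free correspondences and at least two non-degenerate points, the smallest singular value of the stacked system is zero and the recovered direction matches the true translation up to sign.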

Original language: English (US)
Title of host publication: Advances in Visual Computing - 4th International Symposium, ISVC 2008, Proceedings
Pages: 825-835
Number of pages: 11
Edition: PART 1
DOIs
State: Published - 2008
Event: 4th International Symposium on Visual Computing, ISVC 2008 - Las Vegas, NV, United States
Duration: Dec 1 2008 - Dec 3 2008

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 5358 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 4th International Symposium on Visual Computing, ISVC 2008
Country/Territory: United States
City: Las Vegas, NV
Period: 12/1/08 - 12/3/08

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
