An active trinocular vision system of sensing indoor navigation environment for mobile robots

Min Young Kim, Hyungsuck Cho

Research output: Contribution to journal › Article › peer-review

31 Scopus citations

Abstract

Intelligent autonomous mobile robots must be able to sense and recognize the 3D indoor spaces in which they live or work. However, robots frequently operate in cluttered environments containing objects that are difficult to perceive robustly. Although monocular and binocular vision sensors have been widely used for mobile robots, they suffer from image intensity variations, insufficient feature information, and correspondence problems. In this paper, we propose a new 3D sensing system based on the laser-structured-lighting method, chosen for its robustness to the conditions of the navigation environment and the ease with which it extracts the feature information of interest. The proposed active trinocular vision system consists of a flexible multi-stripe laser projector and two cameras arranged in a triangular configuration. By modeling the laser projector as a virtual camera and applying the trinocular epipolar constraints, matching pairs of line features observed in the two real camera images are established, and 3D information about the patterned scene can be extracted from a single-shot image. For robust feature matching, we propose a new correspondence matching technique based on line grouping and probabilistic voting. Finally, a series of experimental tests demonstrates the simplicity, efficiency, and accuracy of the proposed sensor system for 3D environment sensing and recognition.
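The trinocular epipolar constraint mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes three known 3×4 projection matrices (in the paper's system, the third "camera" would be the laser projector modeled as a virtual camera), uses toy geometry and identity intrinsics, and uses point features where the paper matches line features. A candidate match between the two real views is accepted only if it transfers consistently into the third view, i.e. the predicted position lies at the intersection of the two epipolar lines.

```python
# Minimal sketch of trinocular epipolar transfer (illustrative only).
import numpy as np

def skew(v):
    """Cross-product matrix: skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def fundamental_from_projections(P_src, P_dst):
    """Fundamental matrix mapping points in the source view to epipolar
    lines in the destination view: F = [e']_x P_dst pinv(P_src)."""
    C = np.linalg.svd(P_src)[2][-1]    # source camera center (null vector)
    e = P_dst @ C                      # epipole in the destination view
    return skew(e) @ P_dst @ np.linalg.pinv(P_src)

def epipolar_transfer(x1, x2, F13, F23):
    """Predict the feature position in view 3 from a candidate match
    (x1, x2): it must lie on both epipolar lines, i.e. at their
    intersection (cross product of the homogeneous line vectors)."""
    return np.cross(F13 @ x1, F23 @ x2)

# Toy geometry: three non-collinear camera centers, identity intrinsics.
P1 = np.hstack([np.eye(3), np.array([[0.0], [0.0], [0.0]])])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
P3 = np.hstack([np.eye(3), np.array([[0.0], [-1.0], [0.0]])])

X = np.array([0.3, 0.2, 5.0, 1.0])     # a 3D scene point (homogeneous)
x1, x2, x3 = P1 @ X, P2 @ X, P3 @ X    # its images in the three views

F13 = fundamental_from_projections(P1, P3)
F23 = fundamental_from_projections(P2, P3)

# A correct match predicts the observed position in the third view;
# a wrong pairing generally does not, which is what prunes candidates.
x3_pred = epipolar_transfer(x1, x2, F13, F23)
print(np.allclose(x3_pred[:2] / x3_pred[2], x3[:2] / x3[2]))  # True
```

In the paper this check is combined with line grouping and probabilistic voting, so that entire laser stripes, rather than isolated points, vote for consistent correspondences.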

Original language: English
Pages (from-to): 192-209
Number of pages: 18
Journal: Sensors and Actuators A: Physical
Volume: 125
Issue number: 2
DOIs
State: Published - 10 Jan 2006

Keywords

  • Active sensor
  • Laser pattern
  • Mobile robots
  • Three-dimensional range sensor
  • Trinocular vision
