Autonomous and interactive improvement of binocular visual depth estimation through sensorimotor interaction

Timothy A. Mann, Yunjung Park, Sungmoon Jeong, Minho Lee, Yoonsuck Choe

Research output: Contribution to journal › Article › peer-review

Abstract

We investigate how a humanoid robot with a randomly initialized binocular vision system can learn to improve its judgments of egocentric distance using the limited action and interaction that might be available to human infants. First, we show how distance estimation can be improved autonomously. We consider our approach autonomous because the robot learns to estimate distance accurately without a human teacher providing the distances to training targets. We find that actions that, in principle, do not alter the robot's distance to the target are a powerful tool for exposing estimation errors, and these errors can be used to train a distance estimator. Furthermore, the simple action used (i.e., neck rotation) requires neither high-level cognitive processing nor fine motor skill. Next, we investigate how interaction with humans can further improve visual distance estimates. We find that human interaction can improve distance estimates for far targets outside the robot's peripersonal space; this is accomplished by extending the autonomous approach above to integrate additional information provided by a human. Together, these experiments suggest that both action and interaction are important tools for improving perceptual estimates.
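The autonomous mechanism described in the abstract lends itself to a short illustration. The sketch below is a hypothetical simplification, not the paper's implementation: it assumes a linear estimator over a disparity-like feature and a head angle, and it uses agreement across neck rotations, which leave the true distance unchanged, as the training signal. Any disagreement between the two estimates is, by construction, estimation error, so no ground-truth distances are needed. All names and the toy observation model are illustrative assumptions.

    # Hypothetical sketch (not the paper's implementation): using a
    # distance-preserving action (neck rotation) to expose estimation
    # error without ground-truth distances.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy world: the target's true egocentric distance (unknown to robot).
    true_distance = 1.5  # meters

    def observe(head_angle):
        """Simulated binocular feature (a disparity-like cue) for a fixed
        target viewed under a given neck rotation. The cue varies with
        head angle, but the underlying distance does not."""
        disparity = 1.0 / true_distance + 0.05 * np.sin(head_angle)
        return np.array([disparity, head_angle])

    # Randomly initialized linear estimator: d_hat = w . features + b
    w = rng.normal(size=2)
    b = 0.0

    def estimate(features):
        return w @ features + b

    lr = 0.1
    for step in range(500):
        # Perform a neck rotation: two views of the same target.
        a1, a2 = rng.uniform(-0.5, 0.5, size=2)
        f1, f2 = observe(a1), observe(a2)
        d1, d2 = estimate(f1), estimate(f2)

        # The action does not change distance, so any disagreement
        # between d1 and d2 is estimation error. Train each estimate
        # toward their mean (the target is treated as fixed during
        # each gradient step).
        target = 0.5 * (d1 + d2)
        for f, d in ((f1, d1), (f2, d2)):
            grad = d - target
            w -= lr * grad * f
            b -= lr * grad

    # After training, estimates agree across rotations of the neck.
    print(estimate(observe(-0.4)), estimate(observe(0.4)))

Note that this toy consistency objective only drives the estimator to be invariant to the action; it does not by itself calibrate the absolute distance scale, and the paper's training signal and estimator are richer than this sketch.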

Original language: English
Article number: 6293858
Pages (from-to): 74-84
Number of pages: 11
Journal: IEEE Transactions on Autonomous Mental Development
Volume: 5
Issue number: 1
DOIs
State: Published - Mar 2013

Keywords

  • Action
  • Autonomy
  • Depth estimation
  • Learning
  • Perception
  • Vision
