Abstract
We investigate how a humanoid robot with a randomly initialized binocular vision system can learn to improve judgments about egocentric distances using the limited action and interaction that might be available to human infants. First, we show how distance estimation can be improved autonomously. We consider our approach to be autonomous because the robot learns to accurately estimate distance without a human teacher providing the distances to training targets. We find that actions that, in principle, do not alter the robot's distance to the target are a powerful tool for exposing estimation errors. These errors can be used to train a distance estimator. Furthermore, the simple action used (i.e., neck rotation) requires neither high-level cognitive processing nor fine motor skill. Next, we investigate how interaction with humans can further improve visual distance estimates. We find that human interaction can improve distance estimates for far targets outside of the robot's peripersonal space. This is accomplished by extending our autonomous approach to integrate additional information provided by a human. Together these experiments suggest that both action and interaction are important tools for improving perceptual estimates.
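The self-supervision idea in the abstract can be sketched in code: a neck rotation should leave the target's true distance unchanged, so any disagreement between pre- and post-rotation estimates is an error signal that can train the estimator without a teacher supplying distances. The sketch below is a minimal illustration under invented assumptions, not the paper's implementation: the sensor model (a small-angle vergence cue with a pan-dependent miscalibration), the two-parameter estimator, and a single known peripersonal "reach" distance used to anchor the scale are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
B = 0.10  # interocular baseline in metres (illustrative value)

def observe(d, pan):
    """Simulated vergence cue for a target at distance d: the ideal
    small-angle cue B/d plus a pan-dependent sensor miscalibration, so
    the readout changes under a neck rotation even though the true
    distance does not (an invented error model for illustration)."""
    return B / d + 0.02 * np.sin(pan)

# Randomly initialised estimator d_hat = w / (v - a*sin(pan)):
# w is a depth gain, a is a learned correction for the pan-dependent bias.
w = rng.uniform(0.05, 0.2)
a = 0.0
LR = 5e-4
D_REACH = 0.30  # a reachable anchor distance, assumed known from touch

def estimate(v, pan):
    return w / (v - a * np.sin(pan))

def mean_abs_error():
    ds = np.linspace(0.2, 1.0, 17)
    return float(np.mean([abs(estimate(observe(d, 0.0), 0.0) - d) for d in ds]))

err_before = mean_abs_error()

for _ in range(30000):
    # Consistency update: a neck rotation must not change the estimate,
    # so the pre/post-rotation discrepancy is itself the training error.
    d = rng.uniform(0.2, 1.0)
    p0, p1 = rng.uniform(-0.5, 0.5, size=2)
    v0, v1 = observe(d, p0), observe(d, p1)
    s0, s1 = np.sin(p0), np.sin(p1)
    g0, g1 = 1.0 / (v0 - a * s0), 1.0 / (v1 - a * s1)
    diff = w * g0 - w * g1            # estimate(before) - estimate(after)
    gw = 2 * diff * (g0 - g1)         # dL/dw for L = diff**2
    ga = 2 * diff * w * (s0 * g0**2 - s1 * g1**2)
    w -= LR * gw
    a -= LR * ga
    # Anchor update: one peripersonal target of known distance pins the
    # overall scale, which rotation-consistency alone cannot determine.
    p = rng.uniform(-0.5, 0.5)
    v, s = observe(D_REACH, p), np.sin(p)
    g = 1.0 / (v - a * s)
    err = w * g - D_REACH
    w -= LR * 2 * err * g
    a -= LR * 2 * err * w * s * g**2
    # Keep parameters in a range where the denominator stays positive.
    w = float(np.clip(w, 0.02, 0.5))
    a = float(np.clip(a, -0.05, 0.05))

err_after = mean_abs_error()
print(f"mean |error| before: {err_before:.3f} m, after: {err_after:.3f} m")
```

The consistency loss by itself would be satisfied by any constant output; the single touched-anchor distance is what rules out that collapse, which is one reading of why peripersonal experience (and later, human-provided information for far targets) matters in the abstract's account.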
| Original language | English |
| --- | --- |
| Article number | 6293858 |
| Pages (from-to) | 74-84 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Autonomous Mental Development |
| Volume | 5 |
| Issue number | 1 |
| DOIs | |
| State | Published - Mar 2013 |
Keywords
- Action
- Autonomy
- Depth estimation
- Learning
- Perception
- Vision