A SOLID case for active Bayesian perception in robot touch
|Title|A SOLID case for active Bayesian perception in robot touch|
|Publication Type|Conference Proceedings|
|Year of Conference|2013|
|Authors|Lepora N, Martinez-Hernandez U, Prescott TJ|
|Conference Name|Biomimetic and Biohybrid Systems: Second International Conference on Living Machines|
|Series Title|Lecture Notes in Computer Science|
In a series of papers, we have formalized a Bayesian perception approach for robotics based on recent progress in understanding animal perception. The main principle is to accumulate evidence for multiple perceptual alternatives until reaching a preset belief threshold, formally related to sequential analysis methods for optimal decision making. Here, we extend this approach to active perception by moving the sensor with a control strategy that depends on the posterior beliefs during decision making. This method can be used to solve problems involving Simultaneous Object Localization and IDentification (SOLID), or 'where and what'. Considering an example in robot touch, we find that active perception gives an efficient, accurate solution to the SOLID problem for uncertain object locations; in contrast, passive Bayesian perception, which lacks sensorimotor feedback, performs poorly under the same uncertainty. Thus, active perception can enable robust sensing in unstructured environments.
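The evidence-accumulation principle in the abstract can be sketched as a recursive Bayesian update over joint (identity, location) hypotheses that stops once the maximum posterior crosses a belief threshold. The hypothesis grid, Gaussian sensor model, and all parameter values below are illustrative assumptions, not the paper's actual tactile setup, and the active 'where' step is indicated only as a comment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hypothesis grid (an assumption, not the paper's setup):
# 3 object identities ('what') x 5 contact locations ('where').
n_classes, n_locs = 3, 5
n_hyp = n_classes * n_locs

# Assumed sensor model: each joint hypothesis predicts a mean tactile
# reading; a tap returns that mean plus Gaussian noise.
means = np.arange(n_hyp, dtype=float)
sigma = 0.5

def likelihood(z):
    """Gaussian likelihood of reading z under every hypothesis."""
    return np.exp(-0.5 * ((z - means) / sigma) ** 2)

def solid_decision(true_h, threshold=0.999, max_taps=200):
    """Accumulate evidence until one hypothesis crosses the belief threshold."""
    post = np.full(n_hyp, 1.0 / n_hyp)            # uniform prior
    taps = 0
    for taps in range(1, max_taps + 1):
        z = means[true_h] + sigma * rng.normal()  # simulated tap
        post *= likelihood(z)                     # Bayes update ...
        post /= post.sum()                        # ... then renormalize
        if post.max() > threshold:                # sequential-analysis stop rule
            break
        # Active step would hook in here: marginalize the posterior over
        # identity to get a location belief, then move the sensor toward
        # the most informative or most probable contact location.
        loc_belief = post.reshape(n_classes, n_locs).sum(axis=0)
    return int(post.argmax()), taps

decision, taps = solid_decision(true_h=7)
print(decision, taps)
```

With well-separated hypothesis means, the posterior concentrates on the true hypothesis within a few taps; closely spaced means (as in real tactile data) require more evidence, which is where moving the sensor to better-discriminating locations pays off.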