Active vision controlling foveation in a pan-tilt camera on a mobile robot

By Max Versace | October 7, 2013

The CELEST and CompNet Neuromorphics Lab and Neurala were awarded a Phase II NASA Small Business Technology Transfer (STTR) Award to build “Adaptive bio-inspired navigation for planetary exploration”. As part of this effort, the team is building neural software that enables planetary exploration robots such as Curiosity to autonomously explore novel environments, memorize the locations of obstacles and objects, learn about objects, build and update a map of the environment, and return to a safe location. The following video shows a neural model controlling an active visual system in a simulated Mars rover and a corresponding robotic platform.

The neural model, MoNETA, autonomously generates simulated camera movements that mimic the way humans generate eye movements when actively exploring their environment. The work is a collaborative effort between Neurala, the Boston University Neuromorphics Lab, and NASA Langley.
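The saccade-like behavior described above can be illustrated with a toy sketch: pick the most salient image location (a winner-take-all choice) and convert its offset from the image center into pan/tilt corrections that re-center, i.e. foveate, that location. This is only an illustrative sketch, not MoNETA itself; the function name, field-of-view values, and gain are assumptions for the example.

```python
import numpy as np

def foveate(saliency, pan, tilt, fov_deg=(60.0, 45.0), gain=1.0):
    """Toy saccade: pick the most salient pixel and return pan/tilt
    angles (degrees) that would center it in the camera image.
    (Illustrative only; not the MoNETA model.)"""
    h, w = saliency.shape
    # Winner-take-all: location of maximum saliency
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    # Offset from image center, normalized to [-0.5, 0.5]
    dx = (x - w / 2) / w
    dy = (y - h / 2) / h
    # Map normalized offsets to angular corrections within the field of view
    new_pan = pan + gain * dx * fov_deg[0]
    new_tilt = tilt - gain * dy * fov_deg[1]  # image y grows downward
    return new_pan, new_tilt

# Example: a salient blob right of center drives a rightward pan
sal = np.zeros((48, 64))
sal[24, 48] = 1.0
pan, tilt = foveate(sal, pan=0.0, tilt=0.0)
```

Repeating this pick-and-center loop as the saliency map updates produces the sequence of fixations seen in the video, analogous to human eye movements during active exploration.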

Collaborators

The Neuromorphics Lab is highly collaborative with connections across both academia and industry.