Neuromorphic Hardware and Robots

Mobile land and aerial robots generate large quantities of sensor data but typically lack the on-board resources to process and evaluate these data locally. To address this problem we suggest combining two ideas: first, evaluate only the sensor information most relevant to the current task; second, implement biologically inspired algorithms in customized hardware to meet computation-time, power, and weight constraints that general-purpose hardware cannot satisfy.

The video stream of a passive sensor provides rich information about the environment and the movement of the sensor. Optic flow, and models thereof, are key concepts for accessing this information. We develop biologically inspired algorithms for computing optic flow from video data, for extracting information about sensor movement and the environment from the computed flow, and for integrating this information into a reinforcement learning strategy that trains for obstacle avoidance and, thus, safe navigation in small, cluttered environments. In collaboration with the Integrated Circuits and System Design Group at the Boston University Electrical and Computer Engineering Department, we deploy our algorithms on Field Programmable Gate Arrays (FPGAs), preserving the flexibility to modify the algorithms. Refined algorithms are then used to design Application Specific Integrated Circuits (ASICs).
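A classic gradient-based scheme illustrates how optic flow can be computed from two video frames. The sketch below is a minimal Lucas-Kanade estimator in NumPy, solving a small least-squares system per pixel window; it is an illustrative example, not the lab's actual (biologically inspired) algorithm, and the window size and conditioning threshold are assumptions.

```python
import numpy as np

def lucas_kanade_flow(frame1, frame2, window=5):
    """Estimate per-pixel optic flow between two grayscale frames
    (2-D float arrays) with the gradient-based Lucas-Kanade method:
    a least-squares fit of the brightness-constancy constraint
    Ix*u + Iy*v + It = 0 over a local window."""
    # Spatial derivatives of the first frame, temporal derivative between frames.
    Ix = np.gradient(frame1, axis=1)
    Iy = np.gradient(frame1, axis=0)
    It = frame2 - frame1

    half = window // 2
    h, w = frame1.shape
    flow = np.zeros((h, w, 2))  # flow[..., 0] = u (x), flow[..., 1] = v (y)
    for yy in range(half, h - half):
        for xx in range(half, w - half):
            # Stack the derivatives inside the local window.
            ix = Ix[yy - half:yy + half + 1, xx - half:xx + half + 1].ravel()
            iy = Iy[yy - half:yy + half + 1, xx - half:xx + half + 1].ravel()
            it = It[yy - half:yy + half + 1, xx - half:xx + half + 1].ravel()
            A = np.stack([ix, iy], axis=1)
            AtA = A.T @ A
            # Solve the 2x2 normal equations only where they are well conditioned
            # (the aperture problem makes AtA near-singular in uniform regions).
            if np.linalg.cond(AtA) < 1e3:
                flow[yy, xx] = np.linalg.solve(AtA, -A.T @ it)
    return flow
```

For a smooth pattern shifted by one pixel, the recovered flow is close to (1, 0) wherever the local gradients constrain both components.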

The long term goal is to develop and test neuromorphic hardware that adapts its behavior to different environments using visual navigation of land and aerial vehicles as an example.

Figure: Simulations with a robot that can make counterclockwise turns and forward movements while driving in a 2 meter by 4 meter box show that bumps can be reduced over time using a reinforcement learning approach.
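The bump-reduction effect can be reproduced in miniature with tabular Q-learning. The sketch below is a hypothetical toy setup, not the lab's simulation: the box is discretized into a 4 x 2 grid with four headings, the two actions are forward and counterclockwise turn (matching the robot described above), and all rewards and learning constants are illustrative assumptions.

```python
import random

# Toy world: 4 x 2 grid of cells, 4 headings (E, N, W, S).
W, H = 4, 2
HEADINGS = [(1, 0), (0, 1), (-1, 0), (0, -1)]
ACTIONS = ["forward", "turn"]            # turn = counterclockwise
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1        # illustrative learning constants

def step(state, action):
    """Apply an action; return (next_state, reward, bumped)."""
    x, y, h = state
    if action == "turn":
        return (x, y, (h + 1) % 4), 0.0, False
    dx, dy = HEADINGS[h]
    nx, ny = x + dx, y + dy
    if 0 <= nx < W and 0 <= ny < H:
        return (nx, ny, h), 0.1, False   # free forward move, small reward
    return (x, y, h), -1.0, True         # bump into a wall: stay put, penalty

def train(episodes=500, steps=50, seed=0):
    """Tabular Q-learning with an epsilon-greedy policy.
    Returns the Q-table and the bump count per episode."""
    rng = random.Random(seed)
    Q = {}                               # (state, action) -> value
    bumps_per_episode = []
    for _ in range(episodes):
        state = (rng.randrange(W), rng.randrange(H), rng.randrange(4))
        bumps = 0
        for _ in range(steps):
            if rng.random() < EPS:       # explore
                a = rng.choice(ACTIONS)
            else:                        # exploit current estimate
                a = max(ACTIONS, key=lambda act: Q.get((state, act), 0.0))
            nxt, r, bumped = step(state, a)
            bumps += bumped
            # One-step Q-learning update.
            best_next = max(Q.get((nxt, b), 0.0) for b in ACTIONS)
            q = Q.get((state, a), 0.0)
            Q[(state, a)] = q + ALPHA * (r + GAMMA * best_next - q)
            state = nxt
        bumps_per_episode.append(bumps)
    return Q, bumps_per_episode
```

After training, the greedy policy turns away from walls instead of driving into them, so bumps per episode drop from the early episodes toward near zero.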

Figure: Experimental settings.

References

Raudies, F., Eldridge, S., Joshi, A., Versace, M. (2011). Reinforcement learning for visual navigation. NSF SLC PI meeting, Washington, DC, November 2011.

Eldridge, S., Joshi, A., Raudies, F., Versace, M. (2011). A Neuromorphic Hardware that Learns to Navigate Based on Optic Flow. Mark Motter NASA visit, Boston, MA, March 31, 2011.

Kim, S., Kee, V., Joshi, A., Raudies, F. (2011). Optic flow based navigation. Boston University Research Internship in Science and Engineering Poster Session, August 12, 2011, Boston, MA.

Kim, S., Kee, V., Joshi, A., Raudies, F. (2011). Optic flow based navigation using Gabor filters. Boston University Research Internship in Science and Engineering Poster Session, August 12, 2011, Boston, MA.

Kim, S., Kee, V., Joshi, A., Raudies, F. (2011). Optic flow based navigation using correlation techniques. Boston University Research Internship in Science and Engineering Poster Session, August 12, 2011, Boston, MA.

NL team working on this project: Florian Raudies, Schuyler Eldridge, Mahmoud Zangeneh, Ajay Joshi, Max Versace

Collaborators

The Neuromorphics Lab is highly collaborative, with connections across both academia and industry.