Brain-Robot Interaction

Can a non-invasive brain-machine interface (BMI) be developed that allows a user to direct a semi-autonomous robot to perform tasks through thought alone? If so, it would provide the foundation for a wide range of potential applications, from rehabilitative to commercial to military.

The primary objective of this project is to develop an EEG-based brain-machine interface (BMI) for controlling an adaptive mobile agent. Specifically, the agent is an iRobot Create enhanced with a rotatable camera and a robotic arm. Using EEG signals, subjects will be tasked with navigating the robot to a desired location in a room, orienting the camera to fixate on a target object, and picking up the attended object with the robotic arm. This complex task is broken into two major components: 1) EEG-based robotic navigation and object selection, and 2) biologically inspired, autonomous, goal-directed robotic movement and arm control. To accomplish this task, we employ a two-way co-adaptation paradigm in which the subject and the robot adapt to each other. The subject learns to use EEG signals to improve control by practicing movements; this type of learning has proven crucial to BMI performance in other domains. A computer onboard the robot uses adaptive algorithms to continually improve its ability to identify the key EEG signal components that convey the subject's intent, as sketched below.
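
As a rough illustration of the machine side of this co-adaptation, the sketch below shows a simple linear decoder whose weights are nudged toward the cued command on each practice trial. The feature dimensions, command set, and update rule are illustrative assumptions, not the project's actual decoder.

```python
# Hypothetical sketch of the co-adaptation loop: a linear decoder over EEG
# features is updated toward each cued command while the subject practices.
# All names and dimensions are illustrative, not the lab's actual code.
import numpy as np

N_FEATURES = 16   # e.g., band-power features from a handful of EEG channels
N_COMMANDS = 4    # e.g., forward, back, left, right

rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(N_COMMANDS, N_FEATURES))  # decoder weights

def decode(features: np.ndarray) -> int:
    """Map an EEG feature vector to the most likely movement command."""
    return int(np.argmax(W @ features))

def adapt(features: np.ndarray, intended: int, lr: float = 0.05) -> None:
    """Perceptron-style update: strengthen the intended command's weights."""
    predicted = decode(features)
    if predicted != intended:
        W[intended] += lr * features
        W[predicted] -= lr * features

# During a cued practice trial the intended command is known, so each
# decoded sample can also serve as a supervised update.
for _ in range(100):
    x = rng.normal(size=N_FEATURES)          # stand-in for real EEG features
    cued_command = int(rng.integers(N_COMMANDS))
    adapt(x, cued_command)
```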

This collaborative effort combines research in the BU Neuromorphics Lab and the Neural Prosthesis Lab, the latter of which develops algorithms for decoding 2D movements from the EEG of motor imagery. The resulting software allows complex signal processing and model execution to run separately from the physical robot, and enables relatively easy integration of the EEG decoding software with the neural models underlying the robot's behavior.
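
The sketch below illustrates this separation under simplified assumptions: one process decodes EEG and publishes commands, another drives the robot, and the two are joined by a local UDP socket. The endpoint and command format are hypothetical; the project used its own middleware (Unlock and Asimov) rather than this code.

```python
# Minimal sketch of running EEG decoding in one process and robot control in
# another, joined by a UDP socket. The port and command strings are
# illustrative assumptions.
import json
import socket

DECODER_TO_ROBOT = ("127.0.0.1", 9999)  # assumed local endpoint

def send_command(sock: socket.socket, command: str) -> None:
    """Decoder side: publish one decoded command to the robot process."""
    sock.sendto(json.dumps({"cmd": command}).encode(), DECODER_TO_ROBOT)

def robot_loop() -> None:
    """Robot side: block on incoming commands and dispatch to actuators."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(DECODER_TO_ROBOT)
    while True:
        data, _ = sock.recvfrom(1024)
        command = json.loads(data)["cmd"]
        # e.g., translate "forward" into iRobot Create drive commands here
        print("executing:", command)
```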

The project has significant potential for clinical applications. Patients suffering from severe motor impairment could regain some agency by controlling these mobile devices, improving their quality of life. Novel commercial applications for healthy subjects are also possible.
The following NSF Science Nation video features the CELEST work on BCI, including the Neuromorphics Lab's work on adaptive robotics and BCI.

Shown below is a preliminary video of the iRobot Create with a robotic arm controlled by an EEG-based BMI, depicting the first successful attempts by Byron Galbraith and Sean Lorenz to control a robot with BCI.

The video below shows the initial implementation of the neuromorphic algorithm controlling reaching in a virtual environment, based on the DIRECT model. 
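
For intuition about what DIRECT computes, the sketch below substitutes an analytic Jacobian pseudoinverse for the mapping the model learns through motor babbling: at each step, the spatial direction from hand to target is converted into small joint rotations for a planar two-link arm. The arm, link lengths, and gain are illustrative assumptions, so this is an analog of the model, not the model itself.

```python
# Simplified illustration of DIRECT-style reaching: convert the spatial
# direction from hand to target into joint rotations. DIRECT learns this
# mapping through motor babbling; here an analytic Jacobian pseudoinverse
# for a planar two-link arm stands in for the learned mapping.
import numpy as np

L1, L2 = 1.0, 0.8  # link lengths (arbitrary units)

def forward(theta):
    """Hand position of a planar two-link arm."""
    return np.array([
        L1 * np.cos(theta[0]) + L2 * np.cos(theta[0] + theta[1]),
        L1 * np.sin(theta[0]) + L2 * np.sin(theta[0] + theta[1]),
    ])

def jacobian(theta):
    """Sensitivity of hand position to joint angles."""
    s1, c1 = np.sin(theta[0]), np.cos(theta[0])
    s12, c12 = np.sin(theta[0] + theta[1]), np.cos(theta[0] + theta[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def reach(theta, target, gain=0.1, steps=200, tol=1e-3):
    """Iteratively rotate the joints along the direction that moves the
    hand toward the target -- the core direction-to-rotation idea."""
    for _ in range(steps):
        error = target - forward(theta)
        if np.linalg.norm(error) < tol:
            break
        theta = theta + gain * np.linalg.pinv(jacobian(theta)) @ error
    return theta

theta = reach(np.array([0.3, 0.5]), target=np.array([1.2, 0.6]))
print("final hand position:", forward(theta))
```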

References

Galbraith, B.V., Brumberg, J.S., Lorenz, S.D., Versace, M., and Guenther, F.H. (2013). Unlock: A Python-based framework for rapid development of practical brain-computer interface applications. Boston University Graduate Program for Neuroscience Recruitment, Boston, MA, March 4, 2013.

Galbraith, B.V., Smith, D.J., Versace, M., and Guenther, F.H. (2013). Visually-guided autonomous reach control of a robotic arm using the DIRECT neural model. Boston University Graduate Program for Neuroscience Recruitment, Boston, MA, March 4, 2013.

Galbraith, B.V. (2012). A brain-machine interface for assistive robotic control. NSF CELEST Site Visit, Boston, MA, November 2012.

Galbraith, B.V., Brumberg, J.S., Lorenz, S.D., Versace, M., and Guenther, F.H. (2012). Unlock: A Python-based framework for rapid development of practical brain-computer interface applications. Poster presentation, 11th Annual Python in Science Conference, Austin, TX, July 19, 2012.

Galbraith, B., Versace, M., and Chandler, B. (2011). Asimov: Middleware for modeling the brain on the iRobot Create. Submitted to PyCon, Atlanta, GA, March 11-13, 2011.

Neuromorphics Lab team working on this project: Byron Galbraith, Sean Lorenz, Max Versace

Collaborators

The Neuromorphics Lab is highly collaborative, with connections across both academia and industry.