Varsha Shankar will be presenting at the IEEE EMBS Conference on Neural Engineering, San Diego, CA, 6-8 November 2013. The conference paper, titled "A Co-Robotic Assistant capable of Object Selection and Search via a Brain Machine Interface," presents work done in collaboration between the CELEST Neuromorphics Lab and the Speech Lab.
The primary objective of this project is to develop an EEG-based brain-machine interface (BMI) for controlling an adaptive mobile agent. Specifically, the agent is an iRobot Create enhanced with a rotatable camera and a robotic arm. Using EEG signals, subjects will be tasked with navigating the robot to a desired location in a room, orienting the camera to fixate on a target object, and picking up the attended object with the robotic arm. This complex task is divided into two major components: 1) EEG-based robotic navigation and object selection, and 2) biologically inspired, autonomous, goal-directed robotic movement and arm control. To accomplish this task, we employ a two-way co-adaptation paradigm in which the subject and the robot adapt to each other. The subject learns to modulate EEG signals to improve control by practicing movements; this type of learning has proven crucial to BMI performance in other domains. A computer onboard the robot will use adaptive algorithms to continually improve its ability to identify the key EEG signal components signaling the subject's intent.
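The co-adaptation loop described above could, in principle, be sketched as an online classifier that updates its decoding weights after every trial while the subject simultaneously refines their EEG modulation. The sketch below is purely illustrative and is not the algorithm used in the paper: it decodes a hypothetical binary intent signal (e.g. "select object" vs. "keep searching") from a made-up 4-dimensional EEG feature vector, using online logistic regression with simulated data.

```python
import random
import math

class OnlineIntentDecoder:
    """Hypothetical binary intent decoder trained incrementally,
    one stochastic-gradient step per trial (not the paper's method)."""

    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features  # decoding weights, adapted online
        self.b = 0.0                 # bias term
        self.lr = lr                 # learning rate

    def _score(self, x):
        # Logistic score: probability that the trial signals 'select'.
        z = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1.0 / (1.0 + math.exp(-z))

    def predict(self, x):
        return 1 if self._score(x) >= 0.5 else 0

    def update(self, x, label):
        # One SGD step on the logistic loss. In a co-adaptive session the
        # decoder improves per trial while the subject also adapts.
        err = self._score(x) - label
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Simulated session: two separable feature distributions stand in for
# real EEG features, which would come from signal processing of the raw EEG.
random.seed(0)
decoder = OnlineIntentDecoder(n_features=4)
for _ in range(500):
    label = random.randint(0, 1)
    mean = 1.0 if label else -1.0
    decoder.update([random.gauss(mean, 0.5) for _ in range(4)], label)

# Evaluate on fresh simulated trials.
correct = 0
for _ in range(100):
    label = random.randint(0, 1)
    mean = 1.0 if label else -1.0
    correct += decoder.predict([random.gauss(mean, 0.5) for _ in range(4)]) == label
print(correct, "/ 100 simulated trials decoded correctly")
```

The single-update-per-trial structure is what makes the decoder "co-adaptive": it never retrains from scratch, so the subject always interacts with the most recent decoder state.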
Varsha Shankar, Anatoli Gorchetchnikov, Lena Sherbakov, Gennady Livitz, Heather Ames, Byron Galbraith, Frank H. Guenther, Aisha Sohail, and Massimiliano Versace (2013) A Co-Robotic Assistant capable of Object Selection and Search via a Brain Machine Interface. IEEE EMBS Conference on Neural Engineering, San Diego, CA 6-8 November 2013.