Article ID: 4351974
Journal: Neuroscience Research
Published Year: 2010
Pages: 4
File Type: PDF
Abstract

A brain–machine interface (BMI) uses neurophysiological signals from the brain to control external devices, such as robot arms or computer cursors. By combining augmented reality with a BMI, we show that the user's brain signals successfully controlled an agent robot and operated devices in the robot's environment. The user's thoughts became reality through the robot's eyes, extending the augmentation of real environments beyond the anatomy of the human body.
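The abstract describes, at a high level, a pipeline in which brain signals are decoded into commands for an agent robot that then acts on devices in its environment. The paper's implementation details are not given here, so the following is only a minimal, hypothetical sketch of such a closed loop: the feature extraction (band power), the linear decoder, and all names (acquire_window, send_command, the command codes) are assumptions for illustration, not the authors' method.

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Mean spectral power per channel as a simple feature vector (assumed features)."""
    spectrum = np.abs(np.fft.rfft(window, axis=1)) ** 2
    return spectrum.mean(axis=1)

def decode_command(features: np.ndarray, weights: np.ndarray) -> int:
    """Linear classifier mapping features to a discrete robot command (assumed decoder)."""
    scores = weights @ features
    return int(np.argmax(scores))  # e.g. 0=forward, 1=left, 2=right, 3=grasp

def control_loop(acquire_window, send_command, weights, n_steps=100):
    """Closed loop: brain-signal window -> features -> decoded command -> agent robot."""
    for _ in range(n_steps):
        window = acquire_window()              # shape: (channels, samples), caller-supplied
        features = extract_features(window)
        command = decode_command(features, weights)
        send_command(command)                  # actuates the robot / devices in its environment
```

In a real BMI the acquisition, decoder training, and robot interface would each be substantial components; the sketch only shows how the pieces connect.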

Related Topics
Life Sciences › Neuroscience › Neuroscience (General)
Authors
, , ,