| Article ID | Journal | Published Year | Pages |
|---|---|---|---|
| 4948812 | Robotics and Autonomous Systems | 2017 | 29 |
Abstract
This paper describes a brain-machine interface for the online control of a powered lower-limb exoskeleton based on electroencephalogram (EEG) signals recorded over the user's sensorimotor cortical areas. We train a binary decoder that distinguishes two mental states and apply it in a cascaded manner to efficiently control the exoskeleton in three directions: walk forward, turn left, and turn right. The system first classifies the user's intention to walk forward or to change direction; if the user decides to change direction, a subsequent classification decides between turning left and right. The user's mental command is executed conditionally, taking the possibility of obstacle collision into account. All five subjects successfully completed the 3-way navigation task using brain signals while mounted in the exoskeleton, with an average 10.2% decrease in overall task completion time compared to the baseline protocol.
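The cascaded use of a single binary decoder can be illustrated with a short sketch. The code below is a minimal illustration under assumptions, not the authors' implementation: the decoder, the obstacle check, and all names (`select_command`, `decode`, `obstacle_ahead`) are hypothetical placeholders.

```python
"""Minimal sketch of the cascaded binary decoding scheme described in the abstract.

The same binary decoder is applied twice: first to separate "walk forward"
from "change direction", and, only if the user wants to change direction,
a second time to separate "turn left" from "turn right". The resulting
command is executed only when no obstacle collision is anticipated.
Function and parameter names are illustrative, not the authors' API.
"""

def select_command(eeg_window, decode, obstacle_ahead):
    """Map one EEG window to an exoskeleton command.

    decode(eeg_window, question) -> bool : trained binary decoder (placeholder)
    obstacle_ahead(command) -> bool      : environment safety check (placeholder)
    """
    # Stage 1: walk forward vs. change direction.
    if decode(eeg_window, "walk_vs_turn"):
        command = "walk_forward"
    else:
        # Stage 2: reached only when the user chose to change direction.
        command = "turn_left" if decode(eeg_window, "left_vs_right") else "turn_right"

    # Conditional execution: suppress the command if a collision is possible.
    return "stop" if obstacle_ahead(command) else command


# Toy usage with dummy stand-ins for the decoder and the obstacle check.
if __name__ == "__main__":
    dummy_decode = lambda window, question: question == "walk_vs_turn"
    no_obstacles = lambda command: False
    print(select_command(eeg_window=None, decode=dummy_decode,
                         obstacle_ahead=no_obstacles))  # -> "walk_forward"
```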
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Kyuhwa Lee, Dong Liu, Laetitia Perroud, Ricardo Chavarriaga, José del R. Millán