HanBitSa Featured Paper
Georgia Institute of Technology
Musa Mahmood, Shinjae Kwon, Hojoong Kim, Yun-Soung Kim, Panote Siriaraya, Jeongmoon Choi, Boris Otkhmezuri, Kyowon Kang, Ki Jun Yu, Young C. Jang, Chee Siang Ang, and WoonHong Yeo*
M. Mahmood†, S. Kwon†, H. Kim, Y.-S. Kim, Prof. W.-H. Yeo
George W. Woodruff School of Mechanical Engineering, College of Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
Center for Human-Centric Interfaces and Engineering, Institute for Electronics and
Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
J. Choi, Prof. Y.C. Jang
School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
P. Siriaraya, B. Otkhmezuri, Prof. C.S. Ang
School of Computing, University of Kent, Canterbury, Kent CT2 7NT, UK
K. Kang, K.J. Yu
School of Electrical and Electronic Engineering, Yonsei University, Seoul 03722, Republic of Korea
Prof. W.-H. Yeo
Wallace H. Coulter Department of Biomedical Engineering, Parker H. Petit Institute for Bioengineering and Biosciences, Institute for Materials, Neural Engineering Center, Institute for Robotics and Intelligent Machines, Georgia Institute of Technology, Atlanta, GA 30332, USA
†These authors contributed equally to this work.
*Corresponding author.
Abstract
Motor imagery offers an excellent opportunity as a stimulus-free paradigm for brain–machine interfaces. Conventional electroencephalography (EEG) for motor imagery requires a hair cap with multiple wired electrodes and messy gels, causing motion artifacts. Here, a wireless scalp electronic system with virtual reality for real-time, continuous classification of motor imagery brain signals is introduced. This low-profile, portable system integrates imperceptible microneedle electrodes and soft wireless circuits. Virtual reality addresses subject variance in detectable EEG response to motor imagery by providing clear, consistent visuals and instant biofeedback. The wearable soft system offers advantageous contact surface area and reduced electrode impedance density, resulting in significantly enhanced EEG signals and classification accuracy. The combination with convolutional neural network-machine learning provides a real-time, continuous motor imagery-based brain–machine interface. With four human subjects, the scalp electronic system offers a high classification accuracy (93.22 ± 1.33% for four classes), allowing wireless, real-time control of a virtual reality game.
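The abstract describes a convolutional neural network classifying multichannel EEG epochs into four motor-imagery classes in real time. The sketch below illustrates the general shape of such a pipeline only: a temporal convolution over the electrode channels, ReLU, global average pooling, and a softmax readout. All dimensions (channel count, sampling window, filter sizes) and the random weights are illustrative assumptions, not the architecture or parameters reported in the paper.

```python
import math
import random

random.seed(0)

# Hypothetical dimensions: 6 microneedle electrodes, 1 s window at 250 Hz,
# 8 temporal filters of length 11, 4 motor-imagery classes.
N_CH, N_SAMP, N_FILT, K, N_CLASS = 6, 250, 8, 11, 4

def conv_relu(epoch, kernels):
    """Temporal convolution summed across channels, followed by ReLU."""
    out = []
    for kern in kernels:                      # kern: N_CH kernels of length K
        row = []
        for t in range(N_SAMP - K + 1):
            s = sum(kern[c][j] * epoch[c][t + j]
                    for c in range(N_CH) for j in range(K))
            row.append(max(s, 0.0))           # ReLU nonlinearity
        out.append(row)
    return out

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def classify(epoch, kernels, W, b):
    h = conv_relu(epoch, kernels)
    feat = [sum(r) / len(r) for r in h]       # global average pooling per filter
    logits = [sum(W[c][f] * feat[f] for f in range(N_FILT)) + b[c]
              for c in range(N_CLASS)]
    return softmax(logits)                    # 4-class probabilities

# Untrained random weights, standing in for a fitted network.
rnd = lambda: random.gauss(0.0, 0.1)
kernels = [[[rnd() for _ in range(K)] for _ in range(N_CH)] for _ in range(N_FILT)]
W = [[rnd() for _ in range(N_FILT)] for _ in range(N_CLASS)]
b = [0.0] * N_CLASS

# One simulated EEG epoch in place of real scalp recordings.
epoch = [[random.gauss(0.0, 1.0) for _ in range(N_SAMP)] for _ in range(N_CH)]
probs = classify(epoch, kernels, W, b)
print(probs)
```

In a real-time system, the `classify` step would run on each incoming EEG window and the argmax class would drive the virtual-reality game control; training the weights (e.g., by gradient descent on labeled motor-imagery epochs) is omitted here for brevity.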