Journal Neurocomputers №1, 2017
Article in issue:
Deep learning for BCI application
Keywords:
brain-computer interface (BCI)
motor imagery
electroencephalography (EEG)
deep learning
convolutional neural network (CNN)
Authors:
F.V. Stankevich - Postgraduate Student, Tomsk Polytechnic University
E-mail: stankevichfv@tpu.ru
V.G. Spitsyn - Dr.Sc. (Eng.), Professor, Tomsk Polytechnic University
E-mail: spvg@tpu.ru
Abstract:
Deep learning methods have become quite popular in recent years [2] and have enabled significant improvements in recognition accuracy in various fields [3], [4]. In this work we evaluate a deep learning approach to the classification of physiological signals; more specifically, our goal is to classify electroencephalography (EEG) signals in a brain-computer interface (BCI) system. Among the several types of BCI systems, we focus on those based on motor imagery. To evaluate classification accuracy we used Data Set 2a from BCI Competition IV (Berlin, 2008), which contains 4 classes. Nine subjects participated in the experiment, and 576 trials were recorded from each subject; half of the trials were used to train the classifier and the other half to evaluate its performance. The resulting deep convolutional neural network has 7 main layers and takes the Fourier spectrum of the signal as input. The performance achieved on the data set was 0.8467 (kappa value), which exceeds that of the methods known to the authors. The computational complexity of the classification process is low enough to run the classifier in real time.
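The abstract outlines the full pipeline: per-channel Fourier spectra of the EEG trials are fed to a deep convolutional network with 7 main layers, and performance on the held-out half of the trials is reported as Cohen's kappa. The Python sketch below illustrates such a pipeline under stated assumptions; the layer sizes, filter counts, FFT window, and training details are placeholders chosen for illustration, not the authors' configuration. Only the 22-channel, 4-class layout of Data Set 2a [1] and the kappa metric [14] come from the text and references.

# Illustrative sketch only (PyTorch). Layer configuration, FFT window and
# training setup are assumptions; only the 22-channel, 4-class Data Set 2a
# layout [1] and the kappa metric [14] are taken from the abstract and references.
import numpy as np
import torch
import torch.nn as nn
from sklearn.metrics import cohen_kappa_score

N_CHANNELS = 22   # EEG channels in Data Set 2a
N_CLASSES = 4     # left hand, right hand, feet, tongue
N_FREQ_BINS = 64  # assumed number of retained FFT bins per channel

def fourier_features(trial: np.ndarray) -> np.ndarray:
    """Per-channel magnitude spectrum; `trial` has shape (channels, samples)."""
    spectrum = np.abs(np.fft.rfft(trial, axis=1))[:, :N_FREQ_BINS]
    return spectrum.astype(np.float32)

# A hypothetical convolutional stack over the (channel x frequency) map;
# the paper reports 7 main layers, but their sizes are not given in the abstract.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=(1, 5)),            # spectral convolution
    nn.ReLU(),                                       # rectified linear units, cf. [4]
    nn.MaxPool2d(kernel_size=(1, 2)),                # pooling along frequency
    nn.Conv2d(16, 32, kernel_size=(N_CHANNELS, 3)),  # spatial convolution across channels
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 28, 64),                          # fully connected layer
    nn.ReLU(),
    nn.Linear(64, N_CLASSES),                        # class scores (softmax via CrossEntropyLoss)
)

def kappa_on_test_set(model: nn.Module, X_test: np.ndarray, y_test: np.ndarray) -> float:
    """Cohen's kappa on held-out trials; X_test has shape (n, 1, channels, bins), float32."""
    model.eval()
    with torch.no_grad():
        preds = model(torch.from_numpy(X_test)).argmax(dim=1).numpy()
    return cohen_kappa_score(y_test, preds)

# Shape check with a random 2 s window (Data Set 2a is sampled at 250 Hz;
# the window length is an assumption).
dummy = fourier_features(np.random.randn(N_CHANNELS, 500))
print(model(torch.from_numpy(dummy)[None, None]).shape)  # torch.Size([1, 4])

Training (e.g. cross-entropy loss with a gradient-based optimizer over the training half of the trials) is omitted; the sketch only shows how spectral features, a small convolutional stack, and the kappa score fit together.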
Pages: 48-55
References
- Brunner C., Leeb R., Müller-Putz G.R., Schlögl A., Pfurtscheller G. BCI Competition 2008-Graz data set A // Institute for Knowledge Discovery (Laboratory of Brain-Computer Interfaces). Graz University of Technology. 2008. P. 136-142.
- Schmidhuber J. Deep Learning in neural networks: An overview // Neural Networks. 2015. V. 61. P. 85-117.
- Krizhevsky A., Sutskever I., Hinton G.E. ImageNet Classification with Deep Convolutional Neural Networks // Adv. Neural Inf. Process. Syst. 2012. P. 1097-1105.
- Zeiler M.D. et al. On rectified linear units for speech processing // ICASSP, IEEE Int. Conf. Acoust. Speech Signal Process. Proc. 2013. P. 3517-3521.
- Martinez H.P., Bengio Y., Yannakakis G. Learning Deep Physiological Models of Affect // IEEE Comput. Intell. Mag. 2013. V. 8. № 2. P. 20-33.
- Wolpaw J.R. et al. Brain-Computer Interface Technology: A Review of the First International Meeting // IEEE Trans. Rehabil. Eng. 2000. V. 8. № 2. P. 164-173.
- Donchin E., Spencer K.M., Wijesinghe R. The mental prosthesis: Assessing the speed of a P300-based brain-computer interface // IEEE Trans. Rehabil. Eng. 2000. V. 8. № 2. P. 174-179.
- LaFleur K. et al. Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain-computer interface // J. Neural Eng. 2013. V. 10. № 4. P. 46003.
- Tattoli G. et al. A novel BCI-SSVEP based approach for control of walking in Virtual Environment using a Convolutional Neural Network // International Joint Conference on Neural Networks (IJCNN). Beijing, China. 2014. P. 4121-4128.
- Cecotti H., Gräser A. Convolutional neural networks for P300 detection with application to brain-computer interfaces // IEEE Trans. Pattern Anal. Mach. Intell. 2011. V. 33. № 3. P. 433-445.
- Sobhani A. P300 classification using deep belief nets. Colorado State University. 2014.
- Hochberg T. et al. Grasp-and-Lift EEG Detection Winners - Interview: 3rd place, Team HEDJ | No Free Hunch [Electronic resource] // Kaggle. 2015.
- LeCun Y. et al. Backpropagation Applied to Handwritten Zip Code Recognition // Neural Computation. 1989. V. 1. № 4. P. 541-551.
- Cohen J. A coefficient of agreement for nominal scales // Educ. Psychol. Meas. 1960. V. 20. № 1. P. 37-46.
- Schlögl A. et al. Evaluation Criteria for BCI Research // Toward Brain-Computer Interfacing. 2007. P. 327-342.
- Frolov A., Húsek D., Bobrov P. Comparison of four classification methods for brain-computer interface // Neural Netw. World. 2011. V. 21. № 2. P. 101-115.