Volume 28, Issue 2


Study on Robot Grasping System of SSVEP-BCI Based on Augmented Reality Stimulus

Shangen Zhang (1), Yuanfang Chen (2), Lijian Zhang (2), Xiaorong Gao (3), and Xiaogang Chen (4, corresponding author)
(1) School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083, China
(2) Beijing Institute of Mechanical Equipment, Beijing 100854, China
(3) Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing 100084, China
(4) Institute of Biomedical Engineering, Chinese Academy of Medical Sciences and Peking Union Medical College, Tianjin 300192, China

Abstract

Although notable progress has been made in the study of Steady-State Visual Evoked Potential (SSVEP)-based Brain-Computer Interfaces (BCIs), several factors still limit their practical application. One such factor is the poor portability of the visual stimulator. In this study, Augmented Reality (AR) technology was introduced to present the visual stimuli of an SSVEP-BCI, and a robot grasping experiment was designed to verify the applicability of the AR-BCI system. An offline experiment was designed to determine the optimal stimulation time, and an online experiment was used to complete the robot grasping task. The offline experiment revealed that better information transfer rate performance could be achieved with a stimulation time of 2 s. Results of the online experiment show that all 12 subjects could control the robot to complete the grasping task, which indicates the applicability of the AR-SSVEP-humanoid robot (NAO) system. This study verified the reliability of the AR-BCI system and demonstrated the applicability of the AR-SSVEP-NAO system in robot grasping tasks.

Keywords: Brain-Computer Interface (BCI), Steady-State Visual Evoked Potential (SSVEP), Augmented Reality (AR), robot, grasping system
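The abstract uses the information transfer rate (ITR) as the criterion for selecting the 2 s stimulation time. ITR for an SSVEP-BCI is conventionally computed with the Wolpaw formula; the sketch below is a minimal illustration assuming hypothetical values for target count, accuracy, and gaze-shift interval, since the paper's actual experimental parameters are not given in this abstract.

```python
import math

def itr_bits_per_min(n_targets: int, accuracy: float, selection_time_s: float) -> float:
    """Wolpaw information transfer rate in bits/min.

    n_targets: number of stimulus targets (classes)
    accuracy: classification accuracy P, with 1/n_targets < P <= 1
    selection_time_s: time per selection (stimulation plus any gaze-shift interval)
    """
    n, p, t = n_targets, accuracy, selection_time_s
    if p >= 1.0:
        bits = math.log2(n)  # the entropy term vanishes at perfect accuracy
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * 60.0 / t

# Assumed example: 4 targets, 90% accuracy,
# 2 s stimulation plus a 0.5 s gaze-shift interval.
print(round(itr_bits_per_min(4, 0.90, 2.5), 2))  # → 32.94
```

A longer stimulation time raises accuracy but lengthens each selection, so ITR trades the two off; this is the quantity the offline experiment optimized when settling on 2 s.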


Publication history

Received: 27 October 2021
Revised: 02 November 2021
Accepted: 05 November 2021
Published: 29 September 2022
Issue date: April 2023

Copyright

© The author(s) 2023.

Acknowledgements

This research was supported in part by the National Natural Science Foundation of China (No. 62171473), the Beijing Science and Technology Program (No. Z201100004420015), and the Fundamental Research Funds for the Central Universities of China (No. FRF-TP-20-017A1).

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
