
Overview of recognition methods for SSVEP-based BCIs in World Robot Contest 2022: MATLAB undergraduate group

Chengzhi Yi1,§, Yuxuan Wu1,§, Fan Ye1, Xinchen Zhang2, Jingjing Chen3,4
1 Department of Biomedical Engineering, School of Medicine, Tsinghua University, Beijing 100084, China
2 College of Science, Beijing Forestry University, Beijing 100083, China
3 Department of Psychology, School of Social Sciences, Tsinghua University, Beijing 100084, China
4 Tsinghua Laboratory of Brain and Intelligence, Tsinghua University, Beijing 100084, China

§ These authors contributed equally to this work.

Abstract

The steady-state visual evoked potential (SSVEP)-based speller has become a widely adopted paradigm in current brain–computer interface (BCI) systems owing to its high communication speed and consistent performance across individuals. Unlike their calibration-based counterparts, calibration-free SSVEP algorithms require no subject-specific training data and rest on clear, intuitive mathematical principles, making them accessible to novice developers. During the World Robot Contest (WRC) 2022, participants in the undergraduate category applied a variety of approaches to target detection in the calibration-free setting, implementing their algorithms in MATLAB. The winning approach achieved an average information transfer rate (ITR) of 198.94 bits/min in the final test, a notably high figure for a calibration-free scenario. This paper introduces the underlying principles of the selected methods and compares their effectiveness through analysis of results from both the final test and offline experiments. We further propose that the youth competition of WRC can serve as an ideal starting point for beginners interested in studying and developing their own BCI systems.
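
For readers unfamiliar with the metric, the information transfer rate quoted above is conventionally computed with the Wolpaw formula from the number of selectable targets, the classification accuracy, and the time per selection. The MATLAB sketch below illustrates only that calculation; the target count, accuracy, and selection time are illustrative assumptions, not the contest results.

% Wolpaw information transfer rate (ITR) for an SSVEP speller.
% N, P, and T are illustrative assumptions, not the winning team's figures.
N = 40;     % number of selectable targets (a common 40-target speller layout)
P = 0.90;   % classification accuracy (assumed)
T = 1.0;    % time per selection in seconds, including gaze shifting (assumed)

bitsPerSelection = log2(N) + P*log2(P) + (1 - P)*log2((1 - P)/(N - 1));
itr = (60/T) * bitsPerSelection;    % bits per minute
fprintf('ITR = %.2f bits/min\n', itr);

When the accuracy equals 1, the accuracy-dependent terms are defined as zero, so that case should be handled separately to avoid 0*log2(0) evaluating to NaN in MATLAB.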

Keywords: MATLAB, steady-state visual evoked potential, electroencephalogram, calibration-free, BCI spellers, brain–computer interfaces
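
A common calibration-free approach in the SSVEP literature is canonical correlation analysis (CCA) between the multi-channel EEG and sinusoidal reference templates at each candidate stimulation frequency. The MATLAB sketch below shows that generic scheme only; the sampling rate, channel count, frequency set, harmonic count, and placeholder data are assumptions for illustration and do not reproduce any contestant's implementation (canoncorr requires the Statistics and Machine Learning Toolbox).

% Generic calibration-free SSVEP frequency recognition via CCA.
% All settings below are illustrative assumptions; X stands in for one EEG trial.
fs    = 250;                 % sampling rate in Hz (assumed)
freqs = [8 9 10 11 12];      % candidate stimulation frequencies in Hz (assumed)
nHarm = 5;                   % number of harmonics in the reference signals
X     = randn(fs, 9);        % samples x channels; replace with a real epoch

t   = (1:size(X,1))' / fs;   % time axis in seconds
rho = zeros(1, numel(freqs));

for k = 1:numel(freqs)
    % Sine/cosine references at the fundamental frequency and its harmonics
    Y = [];
    for h = 1:nHarm
        Y = [Y, sin(2*pi*h*freqs(k)*t), cos(2*pi*h*freqs(k)*t)];
    end
    [~, ~, r] = canoncorr(X, Y);  % canonical correlations, largest first
    rho(k) = r(1);               % score this frequency by the largest correlation
end

[~, idx] = max(rho);
fprintf('Detected target: %g Hz\n', freqs(idx));

In practice X would be a band-pass filtered epoch from occipital channels, and extensions such as filter bank CCA combine correlations computed in several sub-bands before picking the maximum.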


Publication history

Received: 14 April 2023
Revised: 21 May 2023
Accepted: 06 June 2023
Published: 05 September 2023
Issue date: September 2023

Copyright

© The authors 2023.

Rights and permissions

This article is published with open access at journals.sagepub.com/home/BSA

Creative Commons Non Commercial CC BY-NC: This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (http://www.creativecommons.org/licenses/by-nc/4.0/) which permits non-commercial use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).
