Using a brain-computer interface (BCI) rather than the limbs to control multiple robots (i.e., brain-controlled multi-robots) can assist people with disabilities in daily life better than a brain-controlled single robot can. For example, a person with disabilities can move using a brain-controlled wheelchair (the leader robot) while follower robots simultaneously transport objects. In this paper, we explore for the first time how to control the direction, speed, and formation of a brain-controlled multi-robot system (consisting of leader and follower robots), and we propose a novel multi-robot predictive control framework (MRPCF) that tracks users' control intents while ensuring the safety of all robots. The MRPCF consists of a leader controller, a follower controller, and a formation planner. We build the first complete brain-controlled multi-robot physical system and test it through human-in-the-loop physical experiments. The experimental results indicate that the proposed system can track users' direction, speed, and formation control intents while guaranteeing the robots' safety. This work can promote the study of brain-controlled robots and multi-robot systems and offers new perspectives on human-machine collaboration and integration.
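To make the leader-follower relationship concrete, the sketch below shows a minimal kinematic step in which a follower robot drives toward a desired offset expressed in the leader's body frame. This is an illustration only, not the paper's MRPCF (which is a predictive controller); the function name `follower_cmd` and the proportional gains `k_v` and `k_w` are hypothetical.

```python
import math

def follower_cmd(leader_pose, follower_pose, offset, k_v=0.8, k_w=1.5):
    """One proportional control step for a unicycle-type follower.

    leader_pose / follower_pose: (x, y, theta) in the world frame.
    offset: desired (dx, dy) of the follower in the leader's body frame,
            e.g. (-1.0, 0.0) means "one meter behind the leader".
    Returns (v, w): linear and angular velocity commands.
    """
    lx, ly, lth = leader_pose
    fx, fy, fth = follower_pose
    dx, dy = offset
    # Rotate the body-frame offset into the world frame to get the
    # follower's goal position for this formation.
    gx = lx + dx * math.cos(lth) - dy * math.sin(lth)
    gy = ly + dx * math.sin(lth) + dy * math.cos(lth)
    ex, ey = gx - fx, gy - fy
    dist = math.hypot(ex, ey)
    heading = math.atan2(ey, ex)
    # Wrap the heading error into (-pi, pi].
    ang = math.atan2(math.sin(heading - fth), math.cos(heading - fth))
    return k_v * dist, k_w * ang
```

A real system would replace these proportional gains with a receding-horizon optimization that also enforces inter-robot collision constraints, which is the role the paper assigns to the predictive framework.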
This work was supported by the National Natural Science Foundation of China (No. 51975052).
The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).