Publications
Open Access Review Article
Embodied tactile perception and learning
Brain Science Advances 2020, 6 (2): 132-158
Published: 31 August 2020

Various living creatures exhibit embodied intelligence, which arises from the collaborative interaction of brain, body, and environment. The actual behavior of embodied intelligence is generated by continuous, dynamic interaction between a subject and its environment through information perception and physical manipulation. Physical interaction between a robot and its environment is the basis for realizing embodied perception and learning, and tactile information plays a critical role in this process: it can ensure safety, stability, and compliance, and it provides unique information that is difficult to capture with other perception modalities. However, owing to the limitations of existing sensors and of current perception and learning methods, robotic tactile research lags significantly behind other sensing modalities such as vision and hearing, seriously restricting the development of robotic embodied intelligence. This paper presents the current challenges in robotic tactile embodied intelligence and reviews the theory and methods of robotic embodied tactile intelligence. Tactile perception and learning methods for embodied intelligence can be designed on the basis of new large-scale tactile array sensing devices, with the aim of achieving breakthroughs in neuromorphic computing technology for tactile intelligence.

Open Access
Fabric Recognition Using Zero-Shot Learning
Tsinghua Science and Technology 2019, 24 (6): 645-653
Published: 05 December 2019

In this work, we use a deep learning method to tackle the Zero-Shot Learning (ZSL) problem in tactile material recognition by incorporating advanced semantic information into the training model. Our main technical contribution is an end-to-end deep learning framework for solving the tactile ZSL problem. In this framework, a Convolutional Neural Network (CNN) extracts the spatial features and a Long Short-Term Memory (LSTM) network extracts the temporal features of dynamic tactile sequences, and we develop a loss function suited to the ZSL setting. Experimental evaluations on publicly available datasets demonstrate the effectiveness of the proposed method.
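To illustrate the attribute-compatibility idea that typically underlies such a ZSL setup, the sketch below replaces the paper's CNN-LSTM extractor with a fixed feature vector and uses a linear projection into a semantic attribute space, classifying an unseen fabric by cosine similarity to its attribute description. All names, attribute values, and the projection matrix are hypothetical, invented for illustration; the actual framework and loss function are those described in the paper.

```python
import numpy as np

def zsl_predict(feature, attribute_matrix, W):
    """Project a tactile feature vector into the semantic attribute space
    with a linear map W (stand-in for the learned projection), then pick
    the unseen class whose attribute vector is most cosine-similar."""
    sem = W @ feature  # projected semantic embedding
    sims = attribute_matrix @ sem / (
        np.linalg.norm(attribute_matrix, axis=1) * np.linalg.norm(sem) + 1e-12
    )
    return int(np.argmax(sims))

# Toy example: 3 unseen fabric classes described by 4 semantic attributes
# (e.g., roughness, stiffness, thickness, fuzziness) -- values invented.
attributes = np.array([
    [0.9, 0.1, 0.2, 0.8],   # class 0: e.g., wool-like
    [0.1, 0.9, 0.7, 0.1],   # class 1: e.g., denim-like
    [0.2, 0.2, 0.1, 0.1],   # class 2: e.g., silk-like
])
W = np.eye(4, 6)  # stand-in for the trained feature-to-attribute projection
feature = np.array([0.85, 0.15, 0.25, 0.75, 0.0, 0.0])  # stand-in CNN-LSTM feature
print(zsl_predict(feature, attributes, W))  # -> 0 (closest to the wool-like attributes)
```

In the real framework the projection would be trained end-to-end with the CNN and LSTM on seen classes, so that at test time unseen classes are recognized purely from their semantic attribute vectors.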
