References
[1]
M Shimojo, A Namiki, M Ishikawa, et al. A tactile sensor sheet using pressure conductive rubber with electrical-wires stitched method. IEEE Sens J. 2004, 4(5): 589-596.
[2]
S Teshigawara, S Shimizu, K Tadakuma, et al. High sensitivity slip sensor using pressure conductive rubber. In 2009 IEEE Sensors, Christchurch, New Zealand, 2009, pp 988-991.
[3]
PS Girão, PMP Ramos, O Postolache, et al. Tactile sensors for robotic applications. Measurement. 2013, 46(3): 1257-1271.
[4]
HB Muhammad, CM Oddo, L Beccai, et al. Development of a bioinspired MEMS based capacitive tactile sensor for a robotic finger. Sensor Actuat A: Phys. 2011, 165(2): 221-229.
[5]
A Schmitz, P Maiolino, M Maggiali, et al. Methods and technologies for the implementation of large-scale robot tactile sensors. IEEE Trans Robot. 2011, 27(3): 389-400.
[6]
D Goger, N Gorges, H Worn. Tactile sensing for an anthropomorphic robotic hand: Hardware and signal processing. In 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 2009, pp 895-901.
[7]
L Seminara, M Capurro, P Cirillo, et al. Electromechanical characterization of piezoelectric PVDF polymer films for tactile sensors in robotics applications. Sensor Actuat A: Phys. 2011, 169(1): 49-58.
[8]
CH Chuang, MS Wang, YC Yu, et al. Flexible tactile sensor for the grasping control of robot fingers. In 2013 International Conference on Advanced Robotics and Intelligent Systems. Tainan, China, 2013, pp 135-140.
[9]
RS Dahiya, G Metta, M Valle, et al. Tactile sensing—from humans to humanoids. IEEE Trans Robot. 2010, 26(1): 1-20.
[10]
T Zhang, H Liu, L Jiang, et al. Development of a flexible 3-D tactile sensor system for anthropomorphic artificial hand. IEEE Sens J. 2013, 13(2): 510-518.
[11]
MK Johnson, F Cole, A Raj, et al. Microgeometry capture using an elastomeric sensor. ACM T Graphic. 2011, 30(4): 1.
[12]
H Xie, A Jiang, HA Wurdemann, et al. Magnetic resonance-compatible tactile force sensor using fiber optics and vision sensor. IEEE Sens J. 2014, 14(3): 829-838.
[13]
B Fang, FC Sun, C Yang, et al. A dual-modal vision-based tactile sensor for robotic hand grasping. In 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 2018, pp 4740-4745.
[14]
JM Romano, K Hsiao, G Niemeyer, et al. Human-inspired robotic grasp control with tactile sensing. IEEE Trans Robot. 2011, 27(6): 1067-1079.
[15]
LT Jiang, JR Smith. Seashell effect pretouch sensing for robotic grasping. In 2012 IEEE International Conference on Robotics and Automation, St Paul, USA, 2012, pp 2851-2858.
[16]
H Hasegawa, Y Mizoguchi, K Tadakuma, et al. Development of intelligent robot hand using proximity, contact and slip sensing. In 2010 IEEE International Conference on Robotics and Automation, Anchorage, USA, 2010, pp 777-784.
[17]
T Yamamoto, N Wettels, JA Fishel, et al. BioTac: Biomimetic multi-modal tactile sensor. J Robotics Soc Jpn. 2012, 30(5): 496-498.
[18]
Z Kappassov, JA Corrales, V Perdereau. Tactile sensing in dexterous robot hands — Review. Robot Auton Syst. 2015, 74: 195-220.
[19]
S Luo, J Bimbo, R Dahiya, et al. Robotic tactile perception of object properties: a review. Mechatronics. 2017, 48: 54-67.
[20]
B Winstone, G Griffiths, T Pipe, et al. TACTIP - tactile fingertip device, texture analysis through optical tracking of skin features. In Biomimetic and Biohybrid Systems. NF Lepora, A Mura, HG Krapp, et al, Eds. Berlin, Heidelberg: Springer, 2013.
[21]
DF Xu, GE Loeb, JA Fishel. Tactile identification of objects using Bayesian exploration. In 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 2013, pp 3056-3061.
[22]
A Schmitz, M Maggiali, L Natale, et al. A tactile sensor for the fingertips of the humanoid robot iCub. In 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, China, 2010, pp 2212-2217.
[23]
R Koiva, M Zenker, C Schurmann, et al. A highly sensitive 3D-shaped tactile sensor. In 2013 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Wollongong, Australia, 2013, pp 1084-1089.
[24]
CA Jara, J Pomares, FA Candelas, et al. Control framework for dexterous manipulation using dynamic visual servoing and tactile sensors' feedback. Sensors (Basel). 2014, 14(1): 1787-1804.
[25]
HJ Song, T Bhattacharjee, SS Srinivasa. Sensing shear forces during food manipulation: resolving the trade-off between range and sensitivity. In 2019 International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019, pp 8367-8373.
[26]
PA Schmidt, E Maël, RP Würtz. A sensor for dynamic tactile information with applications in human–robot interaction and object exploration. Robot Auton Syst. 2006, 54(12): 1005-1014.
[27]
N Jamali, C Sammut. Majority voting: material classification by tactile sensing using surface texture. IEEE Trans Robot. 2011, 27(3): 508-521.
[28]
H Yousef, M Boukallel, K Althoefer. Tactile sensing for dexterous in-hand manipulation in robotics—A review. Sensor Actuat A: Phys. 2011, 167(2): 171-187.
[29]
S Teshigawara, T Tsutsumi, S Shimizu, et al. Highly sensitive sensor for detection of initial slip and its application in a multi-fingered robot hand. In 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 2011, pp 1097-1102.
[30]
B Heyneman, MR Cutkosky. Biologically inspired tactile classification of object-hand and object-world interactions. In 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), Guangzhou, China, 2012, pp 167-173.
[31]
A Drimus, G Kootstra, A Bilberg, et al. Design of a flexible tactile sensor for classification of rigid and deformable objects. Robot Auton Syst. 2014, 62(1): 3-15.
[32]
LU Odhner, LP Jentoft, MR Claffee, et al. A compliant, underactuated hand for robust manipulation. Int J Robot Res. 2014, 33(5): 736-752.
[33]
F Suárez-Ruiz, I Galiana, Y Tenzer, et al. Grasp mapping between a 3-finger haptic device and a robotic hand. In Haptics: Neuroscience, Devices, Modeling, and Applications. M Auvray, C Duriez, Eds. Berlin, Heidelberg: Springer, 2014.
[34]
Y Tenzer, LP Jentoft, RD Howe. The feel of MEMS barometers: inexpensive and easily customized tactile array sensors. IEEE Robot Automat Mag. 2014, 21(3): 89-95.
[35]
L Jamone, L Natale, G Metta, et al. Highly sensitive soft tactile sensors for an anthropomorphic robotic hand. IEEE Sens J. 2015, 15(8): 4226-4233.
[36]
T Paulino, P Ribeiro, M Neto, et al. Low-cost 3-axis soft tactile sensors for the human-friendly robot Vizzy. In 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 2017, pp 966-971.
[37]
S Funabashi, S Morikuni, A Geier, et al. Object recognition through active sensing using a multi-fingered robot hand with 3D tactile sensors. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 2018, pp 2589-2595.
[38]
A Wilson, S Wang, B Romero, et al. Design of a fully actuated robotic hand with multiple Gelsight tactile sensors. arXiv preprint. 2020, arXiv:2002.02474.
[39]
HP Liu, D Guo, FC Sun. Object recognition using tactile measurements: kernel sparse coding methods. IEEE Trans Instrum Meas. 2016, 65(3): 656-665.
[40]
TP Tomo, WK Wong, A Schmitz, et al. A modular, distributed, soft, 3-axis sensor system for robot hands. In 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 2016, pp 454-460.
[41]
T Wang, ZX Geng, B Kang, et al. Eagle Shoal: A new designed modular tactile sensing dexterous hand for domestic service robots. In 2019 International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019, pp 9087-9093.
[42]
F Pastor, JM Gandarias, AJ García-Cerezo, et al. Using 3D convolutional neural networks for tactile object recognition with robotic palpation. Sensors (Basel). 2019, 19(24): E5356.
[43]
T Bhattacharjee, AA Shenoi, D Park, et al. Combining tactile sensing and vision for rapid haptic mapping. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 2015, pp 1200-1207.
[44]
A Albini, S Denei, G Cannata. Enabling natural human-robot physical interaction using a robotic skin feedback and a prioritized tasks robot control architecture. In 2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids), Birmingham, UK, 2017, pp 99-106.
[45]
C Vergara, G Borghesan, E Aertbelien, et al. Incorporating artificial skin signals in the constraint-based reactive control of human-robot collaborative manipulation tasks. In 2018 3rd International Conference on Advanced Robotics and Mechatronics (ICARM), Singapore, 2018, pp 280-287.
[46]
P Mittendorfer, G Cheng. Humanoid multimodal tactile-sensing modules. IEEE Trans Robot. 2011, 27(3): 401-410.
[47]
Q Leboutet, E Dean-Leon, F Bergner, et al. Tactile-based whole-body compliance with force propagation for mobile manipulators. IEEE Trans Robot. 2019, 35(2): 330-342.
[48]
JL Liang, JH Wu, HL Huang, et al. Soft sensitive skin for safety control of a nursing robot using proximity and tactile sensors. IEEE Sens J. 2020, 20(7): 3822-3830.
[49]
P Mittendorfer, E Yoshida, G Cheng. Realizing whole-body tactile interactions with a self-organizing, multi-modal artificial skin on a humanoid robot. Adv Robotics. 2015, 29(1): 51-67.
[50]
HJ Ku, JJ Choi, S Jang, et al. Online social touch pattern recognition with multi-modal-sensing modular tactile interface. In 2019 16th International Conference on Ubiquitous Robots (UR), Jeju, Korea, 2019, pp 271-277.
[51]
BD Argall, AG Billard. A survey of tactile human–robot interactions. Robot Auton Syst. 2010, 58(10): 1159-1176.
[52]
G Metta, L Natale, F Nori, et al. The iCub humanoid robot: an open-systems platform for research in cognitive development. Neural Netw. 2010, 23(8/9): 1125-1134.
[53]
S Harada, K Kanao, Y Yamamoto, et al. Fully printed flexible fingerprint-like three-axis tactile and slip force and temperature sensors for artificial skin. ACS Nano. 2014, 8(12): 12851-12857.
[54]
M Kaboli. New methods for active tactile object perception and learning with artificial robotic skin. PhD Dissertation, Technische Universität München, Germany, 2017.
[55]
GH Büscher, R Kõiva, C Schürmann, et al. Flexible and stretchable fabric-based tactile sensor. Robot Auton Syst. 2015, 63: 244-252.
[56]
JQ Wei, HP Liu, BW Wang, et al. Lifelong learning for tactile emotion recognition. 2019, 20(1): 25-41.
[57]
CG Núñez, WT Navaraj, EO Polat, et al. Energy-autonomous, flexible, and transparent tactile skin. Adv Funct Mater. 2017, 27(18): 1606287.
[58]
JJ Shill, EG Collins Jr, E Coyle, et al. Tactile surface classification for limbed robots using a pressure sensitive robot skin. Bioinspir Biomim. 2015, 10(1): 016012.
[59]
J Bednarek, M Bednarek, L Wellhausen, et al. What am I touching? Learning to classify terrain via haptic sensing. In 2019 International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019, pp 7187-7193.
[60]
XA Wu, TM Huh, A Sabin, et al. Tactile sensing and terrain-based gait control for small legged robots. IEEE Trans Robot. 2020, 36(1): 15-27.
[61]
JR Guadarrama Olvera, E Dean-Leon, F Bergner, et al. Plantar tactile feedback for biped balance and locomotion on unknown terrain. Int J Human Robot. 2020, 17(1): 1950036.
[62]
HW Park, PM Wensing, S Kim. High-speed bounding with the MIT Cheetah 2: Control design and experiments. Int J Robot Res. 2017, 36(2): 167-192.
[63]
CC Bai, JF Guo, HX Zheng. Three-dimensional vibration-based terrain classification for mobile robots. IEEE Access. 2019, 7: 63485-63492.
[64]
ZM Lin, ZY Wu, BB Zhang, et al. A triboelectric nanogenerator-based smart insole for multifunctional gait monitoring. Adv Mater Technol. 2019, 4(2): 1800360.
[65]
F Visentin, P Fiorini, K Suzuki. A deformable smart skin for continuous sensing based on electrical impedance tomography. Sensors (Basel). 2016, 16(11): E1928.
[66]
H Lee, D Kwon, H Cho, et al. Soft nanocomposite based multi-point, multi-directional strain mapping sensor using anisotropic electrical impedance tomography. Sci Rep. 2017, 7: 39837.
[67]
S Russo, R Assaf, N Carbonaro, et al. Touch position detection in electrical tomography tactile sensors through quadratic classifier. IEEE Sens J. 2019, 19(2): 474-483.
[68]
A Nagakubo, H Alirezaei, Y Kuniyoshi. A deformable and deformation sensitive tactile distribution sensor. In 2007 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 2007, pp 1301-1308.
[69]
Y Kato, T Mukai, T Hayakawa, et al. Tactile sensor without wire and sensing element in the tactile region based on EIT method. In 2007 IEEE Sensors, Atlanta, USA, 2007, pp 792-795.
[70]
D Silvera-Tawil, D Rye, M Velonaki. Improved image reconstruction for an EIT-based sensitive skin with multiple internal electrodes. IEEE Trans Robot. 2011, 27(3): 425-435.
[71]
D Silvera-Tawil, D Rye, M Velonaki. Interpretation of the modality of touch on an artificial arm covered with an EIT-based sensitive skin. Int J Robot Res. 2012, 31(13): 1627-1641.
[72]
G Pugach, A Pitti, P Gaussier. Neural learning of the topographic tactile sensory information of an artificial skin through a self-organizing map. Adv Robotics. 2015, 29(21): 1393-1409.
[73]
H Park, H Lee, K Park, et al. Deep neural network approach in electrical impedance tomography-based real-time soft tactile sensor. In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 2019, pp 7447-7452.
[74]
D Silvera-Tawil, D Rye, M Soleimani, et al. Electrical impedance tomography for artificial sensitive robotic skin: a review. IEEE Sens J. 2015, 15(4): 2001-2016.
[75]
S Mühlbacher-Karrer, J Padilha Leitzke, LM Faller, et al. Non-iterative object detection methods in electrical tomography for robotic applications. COMPEL. 2017, 36(5): 1411-1420.
[76]
LL Cao, FC Sun, R Kotagiri, et al. Real-time recurrent tactile recognition: momentum batch-sequential echo state networks. IEEE Trans Syst Man Cybern, Syst. 2020, 50(4): 1350-1361.
[77]
HP Liu, FC Sun. Robotic Tactile Perception and Understanding. Singapore: Springer Singapore, 2018.
[78]
M Madry, LF Bo, D Kragic, et al. ST-HMP: Unsupervised spatio-temporal feature learning for tactile data. In 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 2014, pp 2262-2269.
[79]
SS Baishya, B Bauml. Robust material classification with a tactile skin using deep learning. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, South Korea, 2016, pp 8-15.
[80]
MQ Ji, L Fang, HT Zheng, et al. Preprocessing-free surface material classification using convolutional neural networks pretrained by sparse Autoencoder. In 2015 IEEE 25th International Workshop on Machine Learning for Signal Processing (MLSP), Boston, USA, 2015, pp 1-6.
[81]
HT Zheng, L Fang, MQ Ji, et al. Deep learning for surface material classification using haptic and visual information. IEEE Trans Multimedia. 2016, 18(12): 2407-2416.
[82]
D Hughes, A Krauthammer, N Correll. Recognizing social touch gestures using recurrent and convolutional neural networks. In 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 2017, pp 2315-2321.
[83]
KS Sohn, J Chung, MY Cho, et al. An extremely simple macroscale electronic skin realized by deep machine learning. Sci Rep. 2017, 7(1): 11061.
[84]
WZ Yuan, CZ Zhu, A Owens, et al. Shape-independent hardness estimation using deep learning and a GelSight tactile sensor. In 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 2017, pp 951-958.
[85]
M Polic, I Krajacic, N Lepora, et al. Convolutional autoencoder for feature extraction in tactile sensing. IEEE Robot Autom Lett. 2019, 4(4): 3671-3678.
[86]
Z Erickson, S Chernova, CC Kemp. Semi-supervised haptic material recognition for robots using generative adversarial networks. arXiv preprint. 2017, arXiv:1707.02796.
[87]
F Wang, HP Liu, FC Sun, et al. Fabric recognition using zero-shot learning. Tsinghua Sci Technol. 2019, 24(6): 645-653.
[88]
JW Hao, Y Zhu, EB Dong. An optical tactile sensor with structural color using deep learning method. In 2019 IEEE International Conference on Mechatronics and Automation (ICMA), Tianjin, China, 2019, pp 707-712.
[89]
KY Goldberg, R Bajcsy. Active touch and robot perception. Cogn Brain Theo. 1984, 7(2): 199-214.
[90]
L Seminara, P Gastaldo, SJ Watt, et al. Active haptic perception in robots: a review. Front Neurorobot. 2019, 13: 53.
[91]
C Strub, F Worgotter, H Ritter, et al. Correcting pose estimates during tactile exploration of object shape: a neuro-robotic study. In 4th International Conference on Development and Learning and on Epigenetic Robotics, Genoa, Italy, 2014, pp 26-33.
[92]
I Abraham, A Prabhakar, MJZ Hartmann, et al. Ergodic exploration using binary sensing for nonparametric shape estimation. IEEE Robot Autom Lett. 2017, 2(2): 827-834.
[93]
N Jamali, C Ciliberto, L Rosasco, et al. Active perception: Building objects' models using tactile exploration. In 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 2016, pp 179-185.
[94]
N Sommer, A Billard. Multi-contact haptic exploration and grasping with tactile sensors. Robot Auton Syst. 2016, 85: 48-61.
[95]
D Driess, P Englert, M Toussaint. Active learning with query paths for tactile object shape exploration. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, Canada, 2017, pp 65-72.
[96]
T Matsubara, K Shibata. Active tactile exploration with uncertainty and travel cost for fast shape estimation of unknown objects. Robot Auton Syst. 2017, 91: 314-326.
[97]
S Ottenhaus, L Kaul, N Vahrenkamp, et al. Active tactile exploration based on cost-aware information gain maximization. Int J Human Robot. 2018, 15(1): 1850015.
[98]
HP Saal, JA Ting, S Vijayakumar. Active estimation of object dynamics parameters with tactile sensors. In 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, China, 2010, pp 916-921.
[100]
N Lepora, U Martinez-Hernandez, T Prescott. Active Bayesian perception for simultaneous object localization and identification. In Robotics: Science and Systems, 2013.
[101]
NF Lepora. Biomimetic active touch with fingertips and whiskers. IEEE Trans Haptics. 2016, 9(2): 170-183.
[102]
U Martinez-Hernandez, TJ Dodd, TJ Prescott. Feeling the shape: active exploration behaviors for object recognition with a robotic hand. IEEE Trans Syst Man Cybern, Syst. 2018, 48(12): 2339-2348.
[103]
U Martinez-Hernandez, TJ Dodd, MH Evans, et al. Active sensorimotor control for tactile exploration. Robot Auton Syst. 2017, 87: 15-27.
[104]
D Tanaka, T Matsubara, K Sugimoto. An optimal control approach for exploratory actions in active tactile object recognition. In 2014 IEEE-RAS International Conference on Humanoid Robots, Madrid, Spain, 2014, pp 787-793.
[105]
T Sun, J Back, HB Liu. Combining contact forces and geometry to recognize objects during surface haptic exploration. IEEE Robot Autom Lett. 2018, 3(3): 2509-2514.
[106]
SL Xu, N Lin, R Fan, et al. Exploring hardness and geometry information through active perception. In 2019 WRC Symposium on Advanced Robotics and Automation (WRC SARA), Beijing, China, 2019, pp 2509-2514.
[107]
M Kaboli, KP Yao, D Feng, et al. Tactile-based active object discrimination and target object search in an unknown workspace. Auton Robot. 2019, 43(1): 123-152.
[108]
J Ilonen, J Bohg, V Kyrki. Fusing visual and tactile sensing for 3-D object reconstruction while grasping. In 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 2013, pp 3547-3554.
[109]
O Kroemer, CH Lampert, J Peters. Learning dynamic tactile sensing with robust vision-based training. IEEE Trans Robot. 2011, 27(3): 545-557.
[110]
HP Liu, YL Yu, FC Sun, et al. Visual–tactile fusion for object recognition. IEEE Trans Automat Sci Eng. 2017, 14(2): 996-1008.
[111]
HP Liu, YP Wu, FC Sun, et al. Weakly paired multimodal fusion for object recognition. IEEE Trans Automat Sci Eng. 2018, 15(2): 784-795.
[112]
Y Gao, LA Hendricks, KJ Kuchenbecker, et al. Deep learning for tactile understanding from visual and haptic data. In 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 2016, pp 536-543.
[113]
M Kerzel, M Ali, HG Ng, et al. Haptic material classification with a multi-channel neural network. In 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, USA, 2017, pp 439-446.
[114]
WZ Yuan, SX Wang, SY Dong, et al. Connecting look and feel: associating the visual and tactile properties of physical materials. In 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, USA, 2017, pp 5580-5588.
[115]
WD Zheng, HP Liu, BW Wang, et al. Cross-modal surface material retrieval using discriminant adversarial learning. IEEE Trans Ind Inf. 2019, 15(9): 4978-4987.
[116]
WD Zheng, HP Liu, BW Wang, et al. Cross-modal material perception for novel objects: a deep adversarial learning method. IEEE Trans Automat Sci Eng. 2020, 17(2): 697-707.
[117]
M Strese, C Schuwerk, A Iepure, et al. Multimodal feature-based surface material classification. IEEE Trans Haptics. 2017, 10(2): 226-239.
[118]
HP Liu, FC Sun, B Fang, et al. Multimodal measurements fusion for surface material categorization. IEEE Trans Instrum Meas. 2018, 67(2): 246-256.
[119]
HP Liu, F Wang, FC Sun, et al. Surface material retrieval using weakly paired cross-modal learning. IEEE Trans Automat Sci Eng. 2019, 16(2): 781-791.
[120]
C Yang, NF Lepora. Object exploration using vision and active touch. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, Canada, 2017, pp 6363-6370.
[121]
WZ Yuan, YC Mo, SX Wang, et al. Active clothing material perception using tactile sensing and deep learning. In 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 2018, pp 4842-4849.
[122]
JF Ferreira, C Tsiourti, J Dias. Learning emergent behaviours for a hierarchical Bayesian framework for active robotic perception. Cogn Process. 2012, 13(Suppl 1): S155-S159.
[123]
T Taniguchi, R Yoshino, T Takano. Multimodal hierarchical Dirichlet process-based active perception by a robot. Front Neurorobot. 2018, 12: 22.
[124]
HP Liu, FC Sun, XY Zhang. Robotic material perception using active multimodal fusion. IEEE Trans Ind Electron. 2019, 66(12): 9878-9886.
[125]
H Liu, F Wang, F Sun, et al. Active visual-tactile cross-modal matching. IEEE Trans Cogn Dev Syst. 2019, 11(2): 176-187.
[126]
HP Liu, FC Sun, B Fang, et al. Cross-modal zero-shot-learning for tactile object recognition. IEEE Trans Syst Man Cybern, Syst. 2018: 1-9.
[127]
S Luo, WZ Yuan, E Adelson, et al. ViTac: Feature sharing between vision and tactile sensing for cloth texture recognition. In 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 2018, pp 2722-2727.
[128]
K Takahashi, J Tan. Deep visuo-tactile learning: estimation of tactile properties from images. In 2019 International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019, pp 8951-8957.
[129]
JM Gandarias, AJ Garcia-Cerezo, JM Gomez-De-gabriel. CNN-based methods for object recognition with high-resolution tactile sensors. IEEE Sens J. 2019, 19(16): 6872-6882.
[130]
P Falco, S Lu, C Natale, et al. A transfer learning approach to cross-modal object recognition: from visual observation to robotic haptic exploration. IEEE Trans Robot. 2019, 35(4): 987-998.
[131]
L Pinto, D Gandhi, YF Han, et al. The curious robot: learning visual representations via physical interactions. In Computer Vision – ECCV 2016. Cham: Springer International Publishing, 2016.
[132]
ZJ Xu, JJ Wu, A Zeng, et al. DensePhysNet: Learning dense physical object representations via multi-step dynamic interactions. In Robotics: Science and Systems, 2019.
[133]
J Kirkpatrick, R Pascanu, N Rabinowitz, et al. Overcoming catastrophic forgetting in neural networks. Proc Natl Acad Sci USA. 2017, 114(13): 3521-3526.
[134]
GX Zeng, Y Chen, B Cui, et al. Continual learning of context-dependent processing in neural networks. Nat Mach Intell. 2019, 1(8): 364-372.
[135]
FC Sun, HP Liu, C Yang, et al. Multi-modal continual learning using online dictionary updating. IEEE Trans Cogn Dev Syst. 2020: 1.
[136]
WD Zheng, HP Liu, FC Sun. Lifelong visual-tactile cross-modal learning for robotic material perception. IEEE Trans Neural Netw Learn Syst. 2020, in press.
[137]
SB Furber, F Galluppi, S Temple, et al. The SpiNNaker project. Proc IEEE. 2014, 102(5): 652-665.
[138]
YX Yan, D Kappel, F Neumarker, et al. Efficient reward-based structural plasticity on a SpiNNaker 2 prototype. IEEE Trans Biomed Circuits Syst. 2019, 13(3): 579-591.
[139]
T Serrano-Gotarredona, B Linares-Barranco, F Galluppi, et al. ConvNets experiments on SpiNNaker. In 2015 IEEE International Symposium on Circuits and Systems (ISCAS), Lisbon, Portugal, 2015, pp 2405-2408.
[140]
DR Mendat, S Chin, S Furber, et al. Markov Chain Monte Carlo inference on graphical models using event-based processing on the SpiNNaker neuromorphic architecture. In 2015 49th Annual Conference on Information Sciences and Systems (CISS), Baltimore, USA, 2015, pp 1-6.
[141]
F Walter, M Sandner, F Rohrbein, et al. Towards a neuromorphic implementation of hierarchical temporal memory on SpiNNaker. In 2017 IEEE International Symposium on Circuits and Systems (ISCAS), Baltimore, USA, 2017, pp 1-4.
[142]
R Tapiador-Morales, JP Dominguez-Morales, D Gutierrez-Galan, et al. Live demonstration: Neuromorphic row-by-row multi-convolution FPGA processor-SpiNNaker architecture for dynamic-vision feature extraction. In 2019 IEEE International Symposium on Circuits and Systems (ISCAS), Sapporo, Japan, 2019, pp 1-1.
[143]
JP Dominguez-Morales, A Jimenez-Fernandez, A Rios-Navarro, et al. Multilayer spiking neural network for audio samples classification using SpiNNaker. In Artificial Neural Networks and Machine Learning - ICANN 2016. Cham: Springer International Publishing, 2016.
[144]
D Gutierrez-Galan, JP Dominguez-Morales, F Perez-Pena, et al. Live demonstration: neuromorphic robotics, from audio to locomotion through spiking CPG on SpiNNaker. In 2019 IEEE International Symposium on Circuits and Systems (ISCAS), Sapporo, Japan, 2019, pp 1-1.
[145]
T Kawasetsu, R Ishida, T Sanada, et al. Live demonstration: A hardware system for emulating the early vision utilizing a silicon retina and SpiNNaker chips. In 2014 IEEE Biomedical Circuits and Systems Conference (BioCAS) Proceedings, Lausanne, Switzerland, 2014, pp 552-555.
[146]
E Stromatias, D Neil, F Galluppi, et al. Live demonstration: Handwritten digit recognition using spiking deep belief networks on SpiNNaker. In 2015 IEEE International Symposium on Circuits and Systems (ISCAS), Lisbon, Portugal, 2015, pp 1901-1901.
[147]
G Orchard, X Lagorce, C Posch, et al. Real-time event-driven spiking neural network object recognition on the SpiNNaker platform. In 2015 IEEE International Symposium on Circuits and Systems (ISCAS), Lisbon, Portugal, 2015, pp 2413-2416.
[148]
G Haessig, F Galluppi, X Lagorce, et al. Neuromorphic networks on the SpiNNaker platform. In 2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), Hsinchu, China, 2019, pp 86-91.
[149]
F Galluppi, J Conradt, T Stewart, et al. Live Demo: Spiking ratSLAM: Rat hippocampus cells in spiking neural hardware. In 2012 IEEE Biomedical Circuits and Systems Conference, Zhubei, China, 2012, pp 91.
[150]
F Galluppi, C Denk, MC Meiner, et al. Event-based neural computing on an autonomous mobile platform. In 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 2014, pp 2862-2867.
[151]
G Chen, ZS Bing, F Rohrbein, et al. Toward brain-inspired learning with the neuromorphic snake-like robot and the neurorobotic platform. IEEE Trans Cogn Dev Syst. 2019, 11(1): 1-12.
[152]
AD Rast, SV Adams, S Davidson, et al. Behavioral learning in a cognitive neuromorphic robot: an integrative approach. IEEE Trans Neural Netw Learn Syst. 2018, 29(12): 6132-6144.
[153]
MJ Pearson, AG Pipe, B Mitchinson, et al. Implementing spiking neural networks for real-time signal-processing and control applications: a model-validated FPGA approach. IEEE Trans Neural Netw. 2007, 18(5): 1472-1487.
[154]
LL Bologna, J Pinoteau, JB Passot, et al. A closed-loop neurobotic system for fine touch sensing. J Neural Eng. 2013, 10(4): 046019.
[155]
G Spigler, CM Oddo, MC Carrozza. Soft-neuromorphic artificial touch for applications in neuro-robotics. In 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 2012, pp 1913-1917.
[156]
WW Lee, J Cabibihan, NV Thakor. Bio-mimetic strategies for tactile sensing. In 2013 IEEE SENSORS, Baltimore, USA, 2013, pp 1-4.
[157]
UB Rongala, A Mazzoni, CM Oddo. Neuromorphic artificial touch for categorization of naturalistic textures. IEEE Trans Neural Netw Learn Syst. 2017, 28(4): 819-829.
[158]
KE Friedl, AR Voelker, A Peer, et al. Human-inspired neurorobotic system for classifying surface textures by touch. IEEE Robot Autom Lett. 2016, 1(1): 516-523.
[159]
ZK Yi, YL Zhang. Recognizing tactile surface roughness with a biomimetic fingertip: a soft neuromorphic approach. Neurocomputing. 2017, 244: 102-111.
[160]
M Rasouli, Y Chen, A Basu, et al. An extreme learning machine-based neuromorphic tactile sensing system for texture recognition. IEEE Trans Biomed Circuits Syst. 2018, 12(2): 313-325.
[161]
MM Iskarous, HH Nguyen, LE Osborn, et al. Unsupervised learning and adaptive classification of neuromorphic tactile encoding of textures. In 2018 IEEE Biomedical Circuits and Systems Conference (BioCAS), Cleveland, USA, 2018, pp 1-4.
[162]
H Nguyen, L Osborn, M Iskarous, et al. Dynamic texture decoding using a neuromorphic multilayer tactile sensor. In 2018 IEEE Biomedical Circuits and Systems Conference (BioCAS), Cleveland, USA, 2018, pp 1-4.