Conference Paper: Recognition of Dynamic Grasping Motion through Feature Extraction for Teleoperation
Field | Value |
---|---|
Title | Recognition of Dynamic Grasping Motion through Feature Extraction for Teleoperation |
Authors | Dai, Yanping; Chen, Wenyuan; Xi, Ning; Wang, Wenxue |
Issue Date | 27-Sep-2023 |
Publisher | IEEE |
Abstract | Detecting human movement intentions and transmitting human operational capabilities to robots is of great significance for teleoperation. However, the difference in kinematic structure between human hands and robotic manipulators makes teleoperation mapping difficult. In this paper, we present an approach to hand grasp recognition and mapping based on feature extraction to drive a heterogeneous remote manipulator. The hand grasp data set HUST, recorded with a data glove, was used for this study. The data were preprocessed and their features were extracted by PCA. A neural network-based classifier was obtained through offline training on human grasping activities. We compared different network models and selected the network with the best performance based on training time and the task requirements of teleoperation. The experimental results showed that the recognition accuracy of dynamic grasping motion with feature extraction is higher than that without feature extraction and can reach 97.47%. Finally, we validated the feasibility of this method for teleoperating a Barrett hand through online experiments. Our proposed method provides an effective approach to mapping natural human grasping operations to heterogeneous robotic teleoperation systems. |
Persistent Identifier | http://hdl.handle.net/10722/343544 |
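For readers who want a concrete picture of the pipeline the abstract describes, the sketch below illustrates PCA-based feature extraction on glove joint-angle data followed by a small neural-network classifier. It is a minimal illustration under stated assumptions, not the authors' implementation: the array shapes, the number of grasp classes, and the scikit-learn MLP are placeholders standing in for the HUST glove recordings and the network models compared in the paper.

```python
# Hypothetical sketch (not the paper's code): standardize data-glove joint
# angles, reduce dimensionality with PCA, then train a small neural-network
# classifier to recognize grasp types. The synthetic arrays below stand in
# for the HUST data set; shapes and class count are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 22))      # placeholder for glove joint-angle samples
y = rng.integers(0, 10, size=1000)   # placeholder grasp-type labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

clf = make_pipeline(
    StandardScaler(),
    PCA(n_components=0.95),          # keep components explaining ~95% of variance
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

The recognized grasp class would then be mapped online to a corresponding preshape of the remote manipulator (a Barrett hand in the paper's experiments); that mapping is task-specific and is not shown here.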
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Dai, Yanping | - |
dc.contributor.author | Chen, Wenyuan | - |
dc.contributor.author | Xi, Ning | - |
dc.contributor.author | Wang, Wenxue | - |
dc.date.accessioned | 2024-05-14T05:21:20Z | - |
dc.date.available | 2024-05-14T05:21:20Z | - |
dc.date.issued | 2023-09-27 | - |
dc.identifier.uri | http://hdl.handle.net/10722/343544 | - |
dc.description.abstract | Detecting human movement intentions and transmitting human operational capabilities to robots is of great significance for teleoperation. However, the difference in kinematic structure between human hands and robotic manipulators makes teleoperation mapping difficult. In this paper, we present an approach to hand grasp recognition and mapping based on feature extraction to drive a heterogeneous remote manipulator. The hand grasp data set HUST, recorded with a data glove, was used for this study. The data were preprocessed and their features were extracted by PCA. A neural network-based classifier was obtained through offline training on human grasping activities. We compared different network models and selected the network with the best performance based on training time and the task requirements of teleoperation. The experimental results showed that the recognition accuracy of dynamic grasping motion with feature extraction is higher than that without feature extraction and can reach 97.47%. Finally, we validated the feasibility of this method for teleoperating a Barrett hand through online experiments. Our proposed method provides an effective approach to mapping natural human grasping operations to heterogeneous robotic teleoperation systems. | -
dc.language | eng | - |
dc.publisher | IEEE | - |
dc.relation.ispartof | The 2023 WRC Symposium on Advanced Robotics and Automation (WRC SARA) (16/08/2023-22/08/2023, Beijing) | -
dc.title | Recognition of Dynamic Grasping Motion through Feature Extraction for Teleoperation | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.1109/WRCSARA60131.2023.10261858 | - |
dc.identifier.scopus | eid_2-s2.0-85174182323 | - |
dc.identifier.spage | 206 | - |
dc.identifier.epage | 212 | - |