Conference Paper: Recognition of Dynamic Grasping Motion through Feature Extraction for Teleoperation

Title: Recognition of Dynamic Grasping Motion through Feature Extraction for Teleoperation
Authors: Dai, Yanping; Chen, Wenyuan; Xi, Ning; Wang, Wenxue
Issue Date: 27-Sep-2023
Publisher: IEEE
Abstract

Detecting human movement intentions and transmitting human operational capabilities to robots is of great significance for teleoperation. However, the difference in kinematic structure between human hands and robotic manipulators makes teleoperation mapping difficult. In this paper, we present an approach to hand grasp recognition and mapping based on feature extraction to drive a heterogeneous remote manipulator. The HUST hand grasp data set, recorded with a data glove, was used for this study. The data were preprocessed and their features were extracted by principal component analysis (PCA). A neural network-based classifier was obtained through offline training on human grasping activities. We compared different network models and selected the one with the best performance given the training time and the task requirements of teleoperation. The experimental results showed that the recognition accuracy of dynamic grasping motion with feature extraction is higher than that without feature extraction, reaching up to 97.47%. Finally, we validated the feasibility of this method by teleoperating a Barrett hand in online experiments. Our proposed method provides an effective approach to mapping natural human grasping operations onto heterogeneous robotic teleoperation systems.
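For orientation, the recognition pipeline outlined in the abstract (preprocessing, PCA feature extraction, offline neural-network training) can be sketched roughly as below in Python with scikit-learn. This is a minimal illustration only, not the authors' implementation: the HUST data set is replaced by synthetic stand-in arrays, and the channel count, class count, component count, and network size are illustrative assumptions rather than values from the paper.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_channels, n_classes = 2000, 15, 8  # hypothetical glove channels / grasp classes

# Stand-in for preprocessed data-glove frames (one row per time step) and grasp labels.
X = rng.normal(size=(n_samples, n_channels))
y = rng.integers(0, n_classes, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Feature extraction: project joint-angle frames onto the leading principal components.
pca = PCA(n_components=5).fit(X_train)
Z_train, Z_test = pca.transform(X_train), pca.transform(X_test)

# Offline training of a small feed-forward classifier on the PCA features.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(Z_train, y_train)
print("held-out accuracy:", clf.score(Z_test, y_test))

On synthetic random data the accuracy is near chance; the point of the sketch is only the shape of the pipeline, in which the fitted PCA and classifier would be reused online to map each recognized grasp to a command for the remote manipulator.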


Persistent Identifier: http://hdl.handle.net/10722/343544

DC Field                    Value
dc.contributor.author       Dai, Yanping
dc.contributor.author       Chen, Wenyuan
dc.contributor.author       Xi, Ning
dc.contributor.author       Wang, Wenxue
dc.date.accessioned         2024-05-14T05:21:20Z
dc.date.available           2024-05-14T05:21:20Z
dc.date.issued              2023-09-27
dc.identifier.uri           http://hdl.handle.net/10722/343544
dc.description.abstract     Detecting human movement intentions and transmitting human operational capabilities to robots is of great significance for teleoperation. However, the difference in kinematic structure between human hands and robotic manipulators makes teleoperation mapping difficult. In this paper, we present an approach to hand grasp recognition and mapping based on feature extraction to drive a heterogeneous remote manipulator. The HUST hand grasp data set, recorded with a data glove, was used for this study. The data were preprocessed and their features were extracted by principal component analysis (PCA). A neural network-based classifier was obtained through offline training on human grasping activities. We compared different network models and selected the one with the best performance given the training time and the task requirements of teleoperation. The experimental results showed that the recognition accuracy of dynamic grasping motion with feature extraction is higher than that without feature extraction, reaching up to 97.47%. Finally, we validated the feasibility of this method by teleoperating a Barrett hand in online experiments. Our proposed method provides an effective approach to mapping natural human grasping operations onto heterogeneous robotic teleoperation systems.
dc.language                 eng
dc.publisher                IEEE
dc.relation.ispartof        2023 WRC Symposium on Advanced Robotics and Automation (WRC SARA) (16/08/2023-22/08/2023, Beijing)
dc.title                    Recognition of Dynamic Grasping Motion through Feature Extraction for Teleoperation
dc.type                     Conference_Paper
dc.identifier.doi           10.1109/WRCSARA60131.2023.10261858
dc.identifier.scopus        eid_2-s2.0-85174182323
dc.identifier.spage         206
dc.identifier.epage         212
