File Download
There are no files associated with this item.
Links for fulltext (May Require Subscription)
- Publisher Website: 10.1002/cav.260
- Scopus: eid_2-s2.0-52949096876
- WOS: WOS:000259628200006
Conference Paper: Emulating human perception of motion similarity
Title | Emulating human perception of motion similarity |
---|---|
Authors | Tang, Jeff K.T.; Leung, Howard; Komura, Taku; Shum, Hubert P.H. |
Keywords | 3D human motion similarity; Human perception; Pattern recognition |
Issue Date | 2008 |
Citation | Computer Animation and Virtual Worlds, 2008, v. 19, n. 3-4, p. 211-221 |
Abstract | Evaluating the similarity of motions is useful for motion retrieval, motion blending, and performance analysis of dancers and athletes. Euclidean distance between corresponding joints has been widely adopted in measuring the similarity of postures and hence motions. However, such a measure does not necessarily conform to the human perception of motion similarity. In this paper, we propose a new similarity measure based on machine learning techniques. We make use of the results of questionnaires from subjects answering whether arbitrary pairs of motions appear similar or not. Using the relative distances between the joints as the basic features, we train the system to compute the similarity of an arbitrary pair of motions. Experimental results show that our method outperforms methods based on Euclidean distance between corresponding joints. Our method is applicable to content-based retrieval of human motion for large-scale database systems. It is also applicable to e-Learning systems which automatically evaluate the performance of dancers and athletes by comparing the subjects' motions with those of experts. Copyright © 2008 John Wiley & Sons, Ltd. |
Persistent Identifier | http://hdl.handle.net/10722/288972 |
ISSN | 1546-4261 (2023 Impact Factor: 0.9; 2023 SCImago Journal Rankings: 0.403) |
ISI Accession Number ID | WOS:000259628200006 |
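The abstract above outlines the approach: posture features are built from relative distances between joints, and a model trained on human "similar / not similar" judgements scores new pairs of motions. The sketch below is a hypothetical Python illustration of that idea using NumPy and scikit-learn; the frame-wise feature aggregation, the equal-length assumption in place of temporal alignment, and the logistic-regression learner are assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch: relative joint distances as posture features, a learned
# similarity score fitted to human "similar / not similar" labels.
# Feature design, frame pairing, and the classifier are illustrative assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression

def posture_features(pose):
    """Pairwise Euclidean distances between all joints of one posture.

    pose: (J, 3) array of joint positions for a single frame.
    Returns a vector of length J*(J-1)/2.
    """
    diffs = pose[:, None, :] - pose[None, :, :]      # (J, J, 3)
    dists = np.linalg.norm(diffs, axis=-1)           # (J, J)
    iu = np.triu_indices(len(pose), k=1)             # upper triangle, no diagonal
    return dists[iu]

def motion_pair_features(motion_a, motion_b):
    """Aggregate per-frame feature differences for two motions of equal length.

    motion_a, motion_b: (T, J, 3) arrays. Real motions would normally be
    temporally aligned first; here equal length is assumed for simplicity.
    """
    per_frame = np.array([
        np.abs(posture_features(fa) - posture_features(fb))
        for fa, fb in zip(motion_a, motion_b)
    ])
    return per_frame.mean(axis=0)                    # average over frames

# --- toy training data standing in for the questionnaire labels ------------
rng = np.random.default_rng(0)
T, J = 30, 20                                        # frames, joints

def make_pair(similar):
    base = rng.normal(size=(T, J, 3))
    noise = 0.05 if similar else 1.0                 # small vs large perturbation
    return base, base + rng.normal(scale=noise, size=(T, J, 3))

pairs = [make_pair(similar=bool(i % 2)) for i in range(40)]
labels = np.array([i % 2 for i in range(40)])        # 1 = judged similar

X = np.array([motion_pair_features(a, b) for a, b in pairs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# Learned similarity of a new pair: probability that humans would call it similar.
a, b = make_pair(similar=True)
print("similarity score:", clf.predict_proba([motion_pair_features(a, b)])[0, 1])
```

In practice, motions of different lengths would first need temporal alignment (for example, dynamic time warping) before frame-wise features can be compared.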
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Tang, Jeff K.T. | - |
dc.contributor.author | Leung, Howard | - |
dc.contributor.author | Komura, Taku | - |
dc.contributor.author | Shum, Hubert P.H. | - |
dc.date.accessioned | 2020-10-12T08:06:21Z | - |
dc.date.available | 2020-10-12T08:06:21Z | - |
dc.date.issued | 2008 | - |
dc.identifier.citation | Computer Animation and Virtual Worlds, 2008, v. 19, n. 3-4, p. 211-221 | - |
dc.identifier.issn | 1546-4261 | - |
dc.identifier.uri | http://hdl.handle.net/10722/288972 | - |
dc.description.abstract | Evaluating the similarity of motions is useful for motion retrieval, motion blending, and performance analysis of dancers and athletes. Euclidean distance between corresponding joints has been widely adopted in measuring the similarity of postures and hence motions. However, such a measure does not necessarily conform to the human perception of motion similarity. In this paper, we propose a new similarity measure based on machine learning techniques. We make use of the results of questionnaires from subjects answering whether arbitrary pairs of motions appear similar or not. Using the relative distances between the joints as the basic features, we train the system to compute the similarity of an arbitrary pair of motions. Experimental results show that our method outperforms methods based on Euclidean distance between corresponding joints. Our method is applicable to content-based retrieval of human motion for large-scale database systems. It is also applicable to e-Learning systems which automatically evaluate the performance of dancers and athletes by comparing the subjects' motions with those of experts. Copyright © 2008 John Wiley & Sons, Ltd. | -
dc.language | eng | - |
dc.relation.ispartof | Computer Animation and Virtual Worlds | - |
dc.subject | 3D human motion similarity | - |
dc.subject | Human perception | - |
dc.subject | Pattern recognition | - |
dc.title | Emulating human perception of motion similarity | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1002/cav.260 | - |
dc.identifier.scopus | eid_2-s2.0-52949096876 | - |
dc.identifier.volume | 19 | - |
dc.identifier.issue | 3-4 | - |
dc.identifier.spage | 211 | - |
dc.identifier.epage | 221 | - |
dc.identifier.eissn | 1546-427X | - |
dc.identifier.isi | WOS:000259628200006 | - |
dc.identifier.issnl | 1546-4261 | - |