Article: Fast neural style transfer for motion data
| Title | Fast neural style transfer for motion data |
|---|---|
| Authors | Holden, Daniel; Habibie, Ikhsanul; Kusajima, Ikuo; Komura, Taku |
| Keywords | deep learning; motion capture; style transfer; machine learning; computer graphics |
| Issue Date | 2017 |
| Citation | IEEE Computer Graphics and Applications, 2017, v. 37, n. 4, p. 42-49 |
| Abstract | © 1981-2012 IEEE. Automating motion style transfer can help save animators time by allowing them to produce a single set of motions, which can then be automatically adapted for use with different characters. The proposed fast, efficient technique for performing neural style transfer of human motion data uses a feed-forward neural network trained on a large motion database. The proposed framework can transform the style of motion thousands of times faster than previous approaches that use optimization. (See the illustrative sketch after this table.) |
| Persistent Identifier | http://hdl.handle.net/10722/288877 |
| ISSN | 0272-1716 (2023 Impact Factor: 1.7; 2023 SCImago Journal Rankings: 0.385) |
| ISI Accession Number ID | WOS:000411626600007 |
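
The abstract describes replacing per-clip optimization with a single forward pass through a feed-forward network. The sketch below is a minimal, hypothetical PyTorch illustration of that idea only; the layer sizes, temporal 1-D convolutions, Gram-matrix style statistic, and the names `MotionStyleTransferNet` and `gram_matrix` are assumptions made for illustration, not the authors' published architecture or training procedure.

```python
# Minimal, hypothetical sketch (not the authors' architecture): a feed-forward
# network that maps a window of motion-capture frames to a stylized window in a
# single pass, in the spirit of "a feed-forward neural network trained on a
# large motion database". All sizes and the Gram-matrix style statistic are
# illustrative assumptions.
import torch
import torch.nn as nn


class MotionStyleTransferNet(nn.Module):
    def __init__(self, dof: int = 66, hidden: int = 128):
        super().__init__()
        # Temporal 1-D convolutions over a fixed-length motion window;
        # tensors are shaped (batch, joint DOFs, frames).
        self.net = nn.Sequential(
            nn.Conv1d(dof, hidden, kernel_size=15, padding=7),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=15, padding=7),
            nn.ReLU(),
            nn.Conv1d(hidden, dof, kernel_size=15, padding=7),
        )

    def forward(self, motion: torch.Tensor) -> torch.Tensor:
        # One forward pass replaces the per-clip optimization loop, which is
        # where the claimed speed-up over optimization-based transfer comes from.
        return self.net(motion)


def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    # Channel-wise Gram matrix over time, a common style statistic in neural
    # style transfer (assumed here purely for illustration).
    _, _, frames = features.shape
    return features @ features.transpose(1, 2) / frames


if __name__ == "__main__":
    net = MotionStyleTransferNet()
    content = torch.randn(1, 66, 240)    # hypothetical 240-frame clip, 66 DOFs
    style_ref = torch.randn(1, 66, 240)  # hypothetical style exemplar

    stylized = net(content)              # single feed-forward pass
    print(stylized.shape)                # torch.Size([1, 66, 240])

    # A training objective would compare style statistics of the output and the
    # exemplar (plus a content term, omitted here for brevity).
    style_loss = ((gram_matrix(stylized) - gram_matrix(style_ref)) ** 2).mean()
    print(float(style_loss))
```

The design point matches the speed claim in the abstract: once such a network has been trained on a large motion database, stylizing a clip costs one forward pass rather than an iterative optimization per clip.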
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Holden, Daniel | - |
| dc.contributor.author | Habibie, Ikhsanul | - |
| dc.contributor.author | Kusajima, Ikuo | - |
| dc.contributor.author | Komura, Taku | - |
| dc.date.accessioned | 2020-10-12T08:06:06Z | - |
| dc.date.available | 2020-10-12T08:06:06Z | - |
| dc.date.issued | 2017 | - |
| dc.identifier.citation | IEEE Computer Graphics and Applications, 2017, v. 37, n. 4, p. 42-49 | - |
| dc.identifier.issn | 0272-1716 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/288877 | - |
| dc.description.abstract | © 1981-2012 IEEE. Automating motion style transfer can help save animators time by allowing them to produce a single set of motions, which can then be automatically adapted for use with different characters. The proposed fast, efficient technique for performing neural style transfer of human motion data uses a feed-forward neural network trained on a large motion database. The proposed framework can transform the style of motion thousands of times faster than previous approaches that use optimization. | - |
| dc.language | eng | - |
| dc.relation.ispartof | IEEE Computer Graphics and Applications | - |
| dc.subject | deep learning | - |
| dc.subject | motion capture | - |
| dc.subject | style transfer | - |
| dc.subject | machine learning | - |
| dc.subject | computer graphics | - |
| dc.title | Fast neural style transfer for motion data | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1109/MCG.2017.3271464 | - |
| dc.identifier.pmid | 28829292 | - |
| dc.identifier.scopus | eid_2-s2.0-85028884994 | - |
| dc.identifier.volume | 37 | - |
| dc.identifier.issue | 4 | - |
| dc.identifier.spage | 42 | - |
| dc.identifier.epage | 49 | - |
| dc.identifier.isi | WOS:000411626600007 | - |
| dc.identifier.issnl | 0272-1716 | - |
