File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.5220/0001802805350540
- Scopus: eid_2-s2.0-70349293774
Citations:
- Scopus: 0
Appears in Collections:
Conference Paper: Scale robust adaptive feature density approximation for visual object representation and tracking
Title | Scale robust adaptive feature density approximation for visual object representation and tracking |
---|---|
Authors | Liu, C; Yung, NHC; Fang, RG |
Keywords | Bayesian Adaptation; Density Approximation; EM; Feature Scale Selection; MAP; Tracking |
Issue Date | 2009 |
Publisher | INSTICC Press. |
Citation | The 4th International Conference on Computer Vision Theory and Applications (VISAPP/GRAPP 2009), Lisboa, Portugal, 5-8 February 2009. In Proceedings of the 4th International Conference on Computer Vision Theory and Applications, 2009, v. 2, p. 535-540 |
Abstract | Feature density approximation (FDA) based visual object appearance representation is emerging as an effective method for object tracking, but it is challenged by an object's complex motion (e.g. scaling, rotation) and the consequent variation in the object's appearance. Traditional adaptive FDA methods extract features at fixed scales, ignoring the object's scale variation, and update the FDA by sequential Maximum Likelihood estimation, which lacks robustness for sparse data. To address these challenges, this paper proposes a robust multi-scale adaptive FDA object representation method for tracking, together with a robust FDA updating method. The FDA achieves robustness by extracting features at the selected scale and estimating the feature density using a new likelihood function defined by both the feature set and each feature's effectiveness probability. In FDA updating, robustness is achieved by updating the FDA in a Bayesian manner with a MAP-EM algorithm, using density prior knowledge extracted from the historical density. Complex object motion (e.g. scaling and rotation) is handled by correlating the object's appearance with its spatial alignment. Experimental results show that this method is efficient under complex motion and robust in adapting to object appearance variation caused by changes in scale, illumination, pose and viewing angle. |
Description | VISAPP is part of VISIGRAPP - The International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications. Paper no. 228. The Abstracts' web site is located at http://www.visapp.visigrapp.org/Abstracts/2009/VISAPP_2009_Abstracts.htm |
Persistent Identifier | http://hdl.handle.net/10722/158594 |
ISBN | 978-989-8111-69-2 |
References | |
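The abstract above describes updating the feature density in a Bayesian way with a MAP-EM algorithm, using prior knowledge extracted from the historical density so that sparse new observations cannot overwrite the learned appearance. As a rough illustration of that general idea only (the paper's actual model and equations are not reproduced here; the Gaussian-mixture parameterisation, the `prior_strength` pseudo-count and all function and variable names below are assumptions), a minimal sketch in Python:

```python
import numpy as np


def map_em_gmm_update(samples, prev_means, prev_weights, prev_vars,
                      prior_strength=5.0, n_iters=20):
    """Hypothetical MAP-EM update of a 1-D Gaussian-mixture feature density,
    using the previous (historical) density as the prior.

    samples        : (N,) new feature observations from the current frame
    prev_means     : (K,) component means of the historical density
    prev_weights   : (K,) component weights of the historical density (sum to 1)
    prev_vars      : (K,) component variances of the historical density
    prior_strength : pseudo-count controlling how strongly the historical
                     density constrains the update (illustrative parameter)
    """
    x = np.asarray(samples, dtype=float)
    means = prev_means.astype(float).copy()
    weights = prev_weights.astype(float).copy()
    variances = prev_vars.astype(float).copy()
    N = x.size

    for _ in range(n_iters):
        # E-step: responsibility of each component for each sample,
        # computed in log space for numerical stability.
        diff = x[:, None] - means[None, :]
        log_pdf = -0.5 * (diff ** 2 / variances + np.log(2 * np.pi * variances))
        log_resp = np.log(weights) + log_pdf
        log_resp -= log_resp.max(axis=1, keepdims=True)
        resp = np.exp(log_resp)
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step with MAP regularisation: the historical density acts as
        # pseudo-observations (Dirichlet-like prior on weights, Gaussian-like
        # prior on means and variances), so sparse new data cannot drag the
        # density far from the previously estimated appearance.
        nk = resp.sum(axis=0)
        kappa = prior_strength * prev_weights
        weights = (nk + kappa) / (N + prior_strength)
        means = (resp.T @ x + kappa * prev_means) / (nk + kappa)
        sq = resp * (x[:, None] - means[None, :]) ** 2
        variances = (sq.sum(axis=0) + kappa * prev_vars) / (nk + kappa)
        variances = np.maximum(variances, 1e-6)  # keep variances positive

    return means, weights, variances


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Historical (prior) density: two components.
    prev_means = np.array([0.0, 5.0])
    prev_weights = np.array([0.5, 0.5])
    prev_vars = np.array([1.0, 1.0])
    # Sparse new observations with a slight appearance shift.
    samples = np.concatenate([rng.normal(0.5, 1.0, 15),
                              rng.normal(5.5, 1.0, 15)])
    m, w, v = map_em_gmm_update(samples, prev_means, prev_weights, prev_vars)
    print("updated means:", m, "weights:", w)
```

In this sketch the prior enters the M-step as pseudo-observations weighted by `prior_strength`, which is one common way to realise the robustness-to-sparse-data argument the abstract makes; the paper's own prior construction from the historical density may differ.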
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Liu, C | en_US |
dc.contributor.author | Yung, NHC | en_US |
dc.contributor.author | Fang, RG | en_US |
dc.date.accessioned | 2012-08-08T09:00:23Z | - |
dc.date.available | 2012-08-08T09:00:23Z | - |
dc.date.issued | 2009 | en_US |
dc.identifier.citation | The 4th International Conference on Computer Vision Theory and Applications (VISAPP/GRAPP 2009), Lisboa, Portugal, 5-8 February 2009. In Proceedings of the 4th International Conference on Computer Vision Theory and Applications, 2009, v. 2, p. 535-540 | en_US |
dc.identifier.isbn | 978-989-8111-69-2 | - |
dc.identifier.uri | http://hdl.handle.net/10722/158594 | - |
dc.description | VISAPP is part of VISIGRAPP - The International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications | - |
dc.description | Paper no. 228 | - |
dc.description | The Abstracts' web site is located at http://www.visapp.visigrapp.org/Abstracts/2009/VISAPP_2009_Abstracts.htm | - |
dc.description.abstract | Feature density approximation (FDA) based visual object appearance representation is emerging as an effective method for object tracking, but it is challenged by an object's complex motion (e.g. scaling, rotation) and the consequent variation in the object's appearance. Traditional adaptive FDA methods extract features at fixed scales, ignoring the object's scale variation, and update the FDA by sequential Maximum Likelihood estimation, which lacks robustness for sparse data. To address these challenges, this paper proposes a robust multi-scale adaptive FDA object representation method for tracking, together with a robust FDA updating method. The FDA achieves robustness by extracting features at the selected scale and estimating the feature density using a new likelihood function defined by both the feature set and each feature's effectiveness probability. In FDA updating, robustness is achieved by updating the FDA in a Bayesian manner with a MAP-EM algorithm, using density prior knowledge extracted from the historical density. Complex object motion (e.g. scaling and rotation) is handled by correlating the object's appearance with its spatial alignment. Experimental results show that this method is efficient under complex motion and robust in adapting to object appearance variation caused by changes in scale, illumination, pose and viewing angle. | en_US
dc.language | eng | en_US |
dc.publisher | INSTICC Press. | - |
dc.relation.ispartof | Proceedings of the 4th International Conference on Computer Vision Theory and Applications | en_US |
dc.subject | Bayesian Adaptation | en_US |
dc.subject | Density Approximation | en_US |
dc.subject | EM | en_US |
dc.subject | Feature Scale Selection | en_US |
dc.subject | MAP | en_US |
dc.subject | Tracking | en_US |
dc.title | Scale robust adaptive feature density approximation for visual object representation and tracking | en_US |
dc.type | Conference_Paper | en_US |
dc.identifier.email | Yung, NHC:nyung@eee.hku.hk | en_US |
dc.identifier.authority | Yung, NHC=rp00226 | en_US |
dc.description.nature | link_to_subscribed_fulltext | en_US |
dc.identifier.doi | 10.5220/0001802805350540 | - |
dc.identifier.scopus | eid_2-s2.0-70349293774 | en_US |
dc.identifier.hkuros | 164713 | - |
dc.relation.references | http://www.scopus.com/mlt/select.url?eid=2-s2.0-70349293774&selection=ref&src=s&origin=recordpage | en_US |
dc.identifier.volume | 2 | en_US |
dc.identifier.spage | 535 | en_US |
dc.identifier.epage | 540 | en_US |
dc.identifier.scopusauthorid | Liu, CY=26431035900 | en_US |
dc.identifier.scopusauthorid | Yung, NHC=7003473369 | en_US |
dc.identifier.scopusauthorid | Fang, RG=36336806400 | en_US |
dc.customcontrol.immutable | sml 140725 | - |