Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1016/j.patcog.2009.10.008
- Scopus: eid_2-s2.0-74449091092
- Web of Science: WOS:000274954100042
Article: Illumination direction estimation for augmented reality using a surface input real valued output regression network
Field | Value
---|---
Title | Illumination direction estimation for augmented reality using a surface input real valued output regression network
Authors | Chow, CK; Yuen, SY
Keywords | Illuminant direction estimation; Neural network with functions as input; Surface input pattern
Issue Date | 2010
Citation | Pattern Recognition, 2010, v. 43 n. 4, p. 1700-1716
Abstract | Due to the low cost of capturing depth information, it is worthwhile to reduce the illumination ambiguity by employing scene depth information. In this article, a neural computation approach is reported that estimates the illuminant direction from the scene reflectance map. Since the reflectance map recovered from the depth map and image is a variable-sized point cloud, we propose to parameterize it as a two-dimensional polynomial function. Afterwards, a novel network model is presented that maps a continuous function (the reflectance map) to a vectorial output (the illuminant direction). Experimental results show that the proposed model works well on both synthetic and real scenes. © 2009 Elsevier Ltd. All rights reserved.
Persistent Identifier | http://hdl.handle.net/10722/196677
ISSN | 0031-3203 (2023 Impact Factor: 7.5; 2023 SCImago Journal Rankings: 2.732)
ISI Accession Number ID | WOS:000274954100042
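The abstract's key step — turning a variable-sized reflectance-map point cloud into a fixed-size input by parameterizing it as a two-dimensional polynomial — can be sketched with a simple linear least-squares fit. This is an illustrative sketch, not the authors' implementation: the function name, the polynomial degree, and the synthetic data are all assumptions for demonstration.

```python
import numpy as np

def fit_poly2d(points, degree=2):
    """Fit z = f(x, y), a 2D polynomial of the given degree, to an
    (N, 3) point cloud of (x, y, intensity) samples. Returns a
    fixed-length coefficient vector regardless of N, so clouds of
    different sizes map to inputs of the same dimension."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    # Design matrix of monomials x^i * y^j with i + j <= degree.
    cols = [x**i * y**j for i in range(degree + 1)
                        for j in range(degree + 1 - i)]
    A = np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

# Synthetic point cloud sampled from a known quadratic surface.
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(500, 2))
z = 0.3 + 0.5 * xy[:, 0] - 0.2 * xy[:, 1] ** 2
cloud = np.column_stack([xy, z])

c = fit_poly2d(cloud, degree=2)  # 6 coefficients for degree 2
```

A network such as the one described in the article could then regress the illuminant direction from this fixed-size coefficient vector instead of the raw, variable-sized cloud.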
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chow, CK | - |
dc.contributor.author | Yuen, SY | - |
dc.date.accessioned | 2014-04-24T02:10:33Z | - |
dc.date.available | 2014-04-24T02:10:33Z | - |
dc.date.issued | 2010 | - |
dc.identifier.citation | Pattern Recognition, 2010, v. 43 n. 4, p. 1700-1716 | - |
dc.identifier.issn | 0031-3203 | - |
dc.identifier.uri | http://hdl.handle.net/10722/196677 | - |
dc.description.abstract | Due to the low cost of capturing depth information, it is worthwhile to reduce the illumination ambiguity by employing scene depth information. In this article, a neural computation approach is reported that estimates the illuminant direction from the scene reflectance map. Since the reflectance map recovered from the depth map and image is a variable-sized point cloud, we propose to parameterize it as a two-dimensional polynomial function. Afterwards, a novel network model is presented that maps a continuous function (the reflectance map) to a vectorial output (the illuminant direction). Experimental results show that the proposed model works well on both synthetic and real scenes. © 2009 Elsevier Ltd. All rights reserved. | -
dc.language | eng | - |
dc.relation.ispartof | Pattern Recognition | - |
dc.subject | Illuminant direction estimation | - |
dc.subject | Neural network with functions as input | - |
dc.subject | Surface input pattern | - |
dc.title | Illumination direction estimation for augmented reality using a surface input real valued output regression network | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1016/j.patcog.2009.10.008 | - |
dc.identifier.scopus | eid_2-s2.0-74449091092 | - |
dc.identifier.volume | 43 | - |
dc.identifier.issue | 4 | - |
dc.identifier.spage | 1700 | - |
dc.identifier.epage | 1716 | - |
dc.identifier.isi | WOS:000274954100042 | - |
dc.identifier.issnl | 0031-3203 | - |