Conference Paper: Geometric structure based and regularized depth estimation from 360 indoor imagery

Title: Geometric structure based and regularized depth estimation from 360° indoor imagery
Authors: Jin, Lei; Xu, Yanyu; Zheng, Jia; Zhang, Junfei; Tang, Rui; Xu, Shugong; Yu, Jingyi; Gao, Shenghua
Issue Date: 2020
Citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2020, p. 886-895
Abstract: Motivated by the correlation between depth and the geometric structure of a 360° indoor image, we propose a novel learning-based depth estimation framework that leverages the geometric structure of a scene to conduct depth estimation. Specifically, we represent the geometric structure of an indoor scene as a collection of corners, boundaries, and planes. On the one hand, once a depth map is estimated, this geometric structure can be inferred from the estimated depth map; thus, the geometric structure functions as a regularizer for depth estimation. On the other hand, the estimation also benefits from the geometric structure of the scene estimated from the image, where the structure functions as a prior. However, furniture in indoor scenes makes it challenging to infer geometric structure from depth or image data. We therefore infer an attention map to facilitate both depth estimation from features of the geometric structure and geometric inference from the estimated depth map. To validate the effectiveness of each component of our framework under controlled conditions, we render a synthetic dataset, the Shanghaitech-Kujiale Indoor 360° dataset, with 3550 360° indoor images. Extensive experiments on popular datasets validate the effectiveness of our solution. We also demonstrate that our method can be applied to counterfactual depth.
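The abstract describes a depth objective in which a geometric-structure term regularizes the estimated depth map while an attention map down-weights cluttered (furniture) regions. A minimal sketch of that idea is given below; the function name, the weighting scheme, and the use of second-order finite differences as a crude planarity residual are all illustrative assumptions, not the authors' actual formulation.

```python
import numpy as np

def combined_loss(pred_depth, gt_depth, attention, lam=0.1):
    """Toy structure-regularized depth loss.

    pred_depth, gt_depth, attention: H x W arrays; attention in [0, 1],
    with low values on furniture pixels where structure cues are unreliable.
    """
    # Data term: mean absolute depth error against the ground truth.
    data = np.abs(pred_depth - gt_depth).mean()
    # Structure term: second-order finite differences as a planarity proxy --
    # planar regions (walls, floor, ceiling) have near-zero curvature.
    d2x = pred_depth[:, 2:] - 2 * pred_depth[:, 1:-1] + pred_depth[:, :-2]
    d2y = pred_depth[2:, :] - 2 * pred_depth[1:-1, :] + pred_depth[:-2, :]
    # The attention map masks the regularizer out of cluttered regions.
    structure = (attention[:, 1:-1] * np.abs(d2x)).mean() + \
                (attention[1:-1, :] * np.abs(d2y)).mean()
    return data + lam * structure
```

Under this sketch, a perfectly planar prediction (e.g. a linear depth ramp) incurs no structure penalty, while a noisy prediction is penalized by both terms.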
Persistent Identifier: http://hdl.handle.net/10722/345017
ISSN: 1063-6919
2023 SCImago Journal Rankings: 10.331

 

DC Field: Value
dc.contributor.author: Jin, Lei
dc.contributor.author: Xu, Yanyu
dc.contributor.author: Zheng, Jia
dc.contributor.author: Zhang, Junfei
dc.contributor.author: Tang, Rui
dc.contributor.author: Xu, Shugong
dc.contributor.author: Yu, Jingyi
dc.contributor.author: Gao, Shenghua
dc.date.accessioned: 2024-08-15T09:24:41Z
dc.date.available: 2024-08-15T09:24:41Z
dc.date.issued: 2020
dc.identifier.citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2020, p. 886-895
dc.identifier.issn: 1063-6919
dc.identifier.uri: http://hdl.handle.net/10722/345017
dc.description.abstract: Motivated by the correlation between depth and the geometric structure of a 360° indoor image, we propose a novel learning-based depth estimation framework that leverages the geometric structure of a scene to conduct depth estimation. Specifically, we represent the geometric structure of an indoor scene as a collection of corners, boundaries, and planes. On the one hand, once a depth map is estimated, this geometric structure can be inferred from the estimated depth map; thus, the geometric structure functions as a regularizer for depth estimation. On the other hand, the estimation also benefits from the geometric structure of the scene estimated from the image, where the structure functions as a prior. However, furniture in indoor scenes makes it challenging to infer geometric structure from depth or image data. We therefore infer an attention map to facilitate both depth estimation from features of the geometric structure and geometric inference from the estimated depth map. To validate the effectiveness of each component of our framework under controlled conditions, we render a synthetic dataset, the Shanghaitech-Kujiale Indoor 360° dataset, with 3550 360° indoor images. Extensive experiments on popular datasets validate the effectiveness of our solution. We also demonstrate that our method can be applied to counterfactual depth.
dc.language: eng
dc.relation.ispartof: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
dc.title: Geometric structure based and regularized depth estimation from 360° indoor imagery
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/CVPR42600.2020.00097
dc.identifier.scopus: eid_2-s2.0-85094654736
dc.identifier.spage: 886
dc.identifier.epage: 895
