
Conference Paper: Zolly: Zoom Focal Length Correctly for Perspective-Distorted Human Mesh Reconstruction

Title: Zolly: Zoom Focal Length Correctly for Perspective-Distorted Human Mesh Reconstruction
Authors: Wang, Wenjia; Ge, Yongtao; Mei, Haiyi; Cai, Zhongang; Sun, Qingping; Wang, Yanjun; Shen, Chunhua; Yang, Lei; Komura, Taku
Issue Date: 2-Oct-2023
Abstract

As it is hard to calibrate single-view RGB images in the wild, existing 3D human mesh reconstruction (3DHMR) methods either use a constant large focal length or estimate one from the background environment context, neither of which can tackle the torso, limb, hand, or face distortion caused by perspective camera projection when the camera is close to the human body. These naive focal length assumptions harm the task by producing incorrectly formulated projection matrices. To solve this, we propose Zolly, the first 3DHMR method focusing on perspective-distorted images. Our approach begins with analysing the cause of perspective distortion, which we find is mainly the relative location of the human body to the camera center. We propose a new camera model and a novel 2D representation, termed the distortion image, which describes the dense 2D distortion scale of the human body. We then estimate the camera-to-body distance from distortion scale features rather than environment context features. Afterwards, we integrate the distortion feature with image features to reconstruct the body mesh. To formulate the correct projection matrix and locate the human body position, we simultaneously use perspective and weak-perspective projection losses. Since existing datasets are not suited to this task, we propose the first synthetic dataset, PDHuman, and extend two real-world datasets tailored for this task, all containing perspective-distorted human images. Extensive experiments show that Zolly outperforms existing state-of-the-art methods on both perspective-distorted datasets and the standard benchmark (3DPW). Code and dataset will be released at https://wenjiawang0312.github.io/projects/zolly/
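The abstract contrasts full perspective projection (which divides by each point's depth) with the weak-perspective approximation commonly used in 3DHMR. As a minimal sketch of that distinction only, not the paper's implementation, the following illustrates why per-point depth division distorts close-range body parts; the function names and toy values here are hypothetical:

```python
import numpy as np

def perspective_project(points, f, c):
    # Full perspective: x = f * X / Z + c. Points with small Z are
    # magnified more, distorting body parts near the camera.
    return f * points[:, :2] / points[:, 2:3] + c

def weak_perspective_project(points, s, t):
    # Weak perspective: a single shared scale s replaces the
    # per-point depth division, so relative proportions are kept.
    return s * points[:, :2] + t

# Toy example: a hand 0.3 m from the camera vs. a torso point at 0.6 m.
points = np.array([[0.1, 0.0, 0.3],   # hand
                   [0.1, 0.0, 0.6]])  # torso
f, c = 500.0, np.array([112.0, 112.0])
print(perspective_project(points, f, c))
# Under full perspective the hand lands twice as far from the principal
# point as the torso (offsets of about 166.7 vs. 83.3 px); weak perspective
# would map both x-coordinates identically, hiding the distortion.
```

Under this toy setup, the hand and torso share the same 3D x-offset but project very differently once depth division is applied, which is the distortion regime the paper targets.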


Persistent Identifier: http://hdl.handle.net/10722/337949

 

DC Field: Value

dc.contributor.author: Wang, Wenjia
dc.contributor.author: Ge, Yongtao
dc.contributor.author: Mei, Haiyi
dc.contributor.author: Cai, Zhongang
dc.contributor.author: Sun, Qingping
dc.contributor.author: Wang, Yanjun
dc.contributor.author: Shen, Chunhua
dc.contributor.author: Yang, Lei
dc.contributor.author: Komura, Taku
dc.date.accessioned: 2024-03-11T10:25:08Z
dc.date.available: 2024-03-11T10:25:08Z
dc.date.issued: 2023-10-02
dc.identifier.uri: http://hdl.handle.net/10722/337949
dc.description.abstract: As it is hard to calibrate single-view RGB images in the wild, existing 3D human mesh reconstruction (3DHMR) methods either use a constant large focal length or estimate one from the background environment context, neither of which can tackle the torso, limb, hand, or face distortion caused by perspective camera projection when the camera is close to the human body. These naive focal length assumptions harm the task by producing incorrectly formulated projection matrices. To solve this, we propose Zolly, the first 3DHMR method focusing on perspective-distorted images. Our approach begins with analysing the cause of perspective distortion, which we find is mainly the relative location of the human body to the camera center. We propose a new camera model and a novel 2D representation, termed the distortion image, which describes the dense 2D distortion scale of the human body. We then estimate the camera-to-body distance from distortion scale features rather than environment context features. Afterwards, we integrate the distortion feature with image features to reconstruct the body mesh. To formulate the correct projection matrix and locate the human body position, we simultaneously use perspective and weak-perspective projection losses. Since existing datasets are not suited to this task, we propose the first synthetic dataset, PDHuman, and extend two real-world datasets tailored for this task, all containing perspective-distorted human images. Extensive experiments show that Zolly outperforms existing state-of-the-art methods on both perspective-distorted datasets and the standard benchmark (3DPW). Code and dataset will be released at https://wenjiawang0312.github.io/projects/zolly/
dc.language: eng
dc.relation.ispartof: IEEE International Conference on Computer Vision 2023 (02/10/2023-06/10/2023, Paris)
dc.title: Zolly: Zoom Focal Length Correctly for Perspective-Distorted Human Mesh Reconstruction
dc.type: Conference_Paper
