Conference Paper: Local Gaussian Density Mixtures for Unstructured Lumigraph Rendering

Title: Local Gaussian Density Mixtures for Unstructured Lumigraph Rendering
Authors: Wu, Xiuchao; Xu, Jiamin; Wang, Chi; Peng, Yifan; Huang, Qixing; Tompkin, James; Xu, Weiwei
Issue Date: 3-Dec-2024
Publisher: ACM
Abstract

To improve novel view synthesis of curved-surface reflections and refractions, we revisit local geometry-guided ray interpolation techniques with modern differentiable rendering and optimization. In contrast to depth or mesh geometries, our approach uses a local or per-view density represented as Gaussian mixtures along each ray. To synthesize novel views, we warp and fuse local volumes, then alpha-composite using input photograph ray colors from a small set of neighboring images. For fusion, we use a neural blending weight from a shallow MLP. We optimize the local Gaussian density mixtures using both a reconstruction loss and a consistency loss. The consistency loss, based on per-ray KL-divergence, encourages more accurate geometry reconstruction. In scenes with complex reflections captured in our LGDM dataset, the experimental results show that our method outperforms state-of-the-art novel view synthesis methods by 12.2%–37.1% in PSNR, due to its ability to maintain sharper view-dependent appearances. Project webpage: https://xchaowu.github.io/papers/lgdm/index.html
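To make the rendering pipeline described above concrete, below is a minimal numeric sketch, not the authors' implementation: it assumes hypothetical helpers `mixture_density`, `alpha_composite`, and `kl_consistency`, a 1D Gaussian mixture of densities per ray (the paper's local, per-view representation), and standard volume-rendering compositing over colors that would be fetched from neighboring input photographs. All parameter layouts and sample counts are illustrative placeholders.

```python
import numpy as np

def mixture_density(t, means, stds, weights):
    """Density along one ray at sample depths t, as a 1D Gaussian mixture.
    means/stds/weights are per-component parameters (hypothetical layout;
    the paper optimizes such a mixture per ray of each input view)."""
    g = np.exp(-0.5 * ((t[:, None] - means[None, :]) / stds[None, :]) ** 2)
    g /= stds[None, :] * np.sqrt(2.0 * np.pi)
    return (weights[None, :] * g).sum(axis=1)

def alpha_composite(sigma, colors, deltas):
    """Standard alpha compositing of sampled densities with per-sample RGB
    (here the colors would be warped from a few neighboring photographs,
    blended with the MLP-predicted weights described in the abstract)."""
    alpha = 1.0 - np.exp(-sigma * deltas)                          # per-sample opacity
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alpha)))[:-1]  # transmittance
    w = alpha * trans                                              # compositing weights
    return (w[:, None] * colors).sum(axis=0)

def kl_consistency(p, q, eps=1e-8):
    """Sample-based stand-in for the per-ray KL-divergence consistency
    loss between two ray densities evaluated on shared depth samples."""
    p = p / (p.sum() + eps)
    q = q / (q.sum() + eps)
    return float((p * np.log((p + eps) / (q + eps))).sum())

# Toy usage: one ray, 64 depth samples, a 3-component mixture.
t = np.linspace(0.5, 4.0, 64)
deltas = np.full_like(t, t[1] - t[0])
means = np.array([1.2, 2.0, 3.1])
stds = np.array([0.1, 0.2, 0.15])
weights = np.array([5.0, 2.0, 1.0])
sigma = mixture_density(t, means, stds, weights)
colors = np.random.default_rng(0).uniform(size=(64, 3))  # stand-in for warped photo colors
rgb = alpha_composite(sigma, colors, deltas)
```

In this reading, the compact per-ray mixture replaces a dense depth map or mesh: warping a neighboring view's mixture into the novel view and penalizing `kl_consistency` between overlapping rays is what encourages the local geometries to agree.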


Persistent Identifier: http://hdl.handle.net/10722/362286

 

DC Field: Value
dc.contributor.author: Wu, Xiuchao
dc.contributor.author: Xu, Jiamin
dc.contributor.author: Wang, Chi
dc.contributor.author: Peng, Yifan
dc.contributor.author: Huang, Qixing
dc.contributor.author: Tompkin, James
dc.contributor.author: Xu, Weiwei
dc.date.accessioned: 2025-09-21T00:35:09Z
dc.date.available: 2025-09-21T00:35:09Z
dc.date.issued: 2024-12-03
dc.identifier.uri: http://hdl.handle.net/10722/362286
dc.language: eng
dc.publisher: ACM
dc.relation.ispartof: ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia (03/12/2024-06/12/2024, Tokyo)
dc.title: Local Gaussian Density Mixtures for Unstructured Lumigraph Rendering
dc.type: Conference_Paper
dc.identifier.doi: 10.1145/3680528.3687659
dc.identifier.spage: 1
dc.identifier.epage: 11
