Conference Paper: Point Cloud Compression with Implicit Neural Representations: A Unified Framework

Title: Point Cloud Compression with Implicit Neural Representations: A Unified Framework
Authors: Ruan, Hongning; Shao, Yulin; Yang, Qianqian; Zhao, Liang; Niyato, Dusit
Keywords: implicit neural representation; neural network compression; point cloud compression
Issue Date: 2024
Citation: 2024 IEEE/CIC International Conference on Communications in China (ICCC 2024), 2024, p. 1709-1714
Abstract: Point clouds have become increasingly vital across various applications thanks to their ability to realistically depict 3D objects and scenes. Nevertheless, effectively compressing unstructured, high-precision point cloud data remains a significant challenge. In this paper, we present a pioneering point cloud compression framework capable of handling both geometry and attribute components. Unlike traditional approaches and existing learning-based methods, our framework utilizes two coordinate-based neural networks to implicitly represent a voxelized point cloud. The first network generates the occupancy status of a voxel, while the second network determines the attributes of an occupied voxel. To tackle the immense number of voxels within the volumetric space, we partition the space into smaller cubes and focus solely on voxels within non-empty cubes. By feeding the coordinates of these voxels into the respective networks, we reconstruct the geometry and attribute components of the original point cloud. The neural network parameters are further quantized and compressed. Experimental results underscore the superior performance of our proposed method compared to the octree-based approach employed in the latest G-PCC standards. Moreover, our method exhibits high universality when contrasted with existing learning-based techniques.
Persistent Identifier: http://hdl.handle.net/10722/363769
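The abstract outlines the core design: one coordinate-based network predicts voxel occupancy (geometry), a second predicts the attributes of occupied voxels, and queries are restricted to voxels inside non-empty cubes. The sketch below illustrates that query structure in PyTorch. Every architectural detail here (layer widths, ReLU activations, the occupancy threshold, the toy grid size) is an assumption for illustration only; this record does not specify the paper's actual architecture.

    # Minimal sketch of the two-network implicit representation described in the
    # abstract. Layer sizes, activations, and the occupancy threshold are
    # assumptions, not values taken from the paper.
    import torch
    import torch.nn as nn

    class CoordinateMLP(nn.Module):
        """Coordinate-based MLP mapping a 3D voxel coordinate to an output vector."""
        def __init__(self, out_dim: int, hidden: int = 64, layers: int = 4):
            super().__init__()
            dims = [3] + [hidden] * layers
            blocks = []
            for d_in, d_out in zip(dims[:-1], dims[1:]):
                blocks += [nn.Linear(d_in, d_out), nn.ReLU()]
            blocks.append(nn.Linear(hidden, out_dim))
            self.net = nn.Sequential(*blocks)

        def forward(self, xyz: torch.Tensor) -> torch.Tensor:
            return self.net(xyz)

    # Network 1 outputs a voxel occupancy logit; network 2 outputs attributes
    # (assumed here to be RGB, hence out_dim=3).
    occupancy_net = CoordinateMLP(out_dim=1)
    attribute_net = CoordinateMLP(out_dim=3)

    @torch.no_grad()
    def reconstruct(coords: torch.Tensor, threshold: float = 0.0):
        """Query both networks at voxel coordinates drawn from non-empty cubes,
        mirroring the cube partitioning step that skips empty regions."""
        logits = occupancy_net(coords).squeeze(-1)
        occupied = logits > threshold              # geometry: which voxels exist
        attrs = attribute_net(coords[occupied])    # attributes of occupied voxels
        return coords[occupied], attrs

    # Example query over a toy 4x4x4 block of voxel coordinates in [0, 1]^3.
    grid = torch.stack(torch.meshgrid(*([torch.arange(4.0)] * 3), indexing="ij"), dim=-1)
    coords = grid.reshape(-1, 3) / 3.0
    points, colors = reconstruct(coords)

In the full pipeline described in the abstract, the parameters of both networks would additionally be quantized and compressed to form the bitstream; that stage is omitted from this sketch.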


DC Field | Value | Language
dc.contributor.author | Ruan, Hongning | -
dc.contributor.author | Shao, Yulin | -
dc.contributor.author | Yang, Qianqian | -
dc.contributor.author | Zhao, Liang | -
dc.contributor.author | Niyato, Dusit | -
dc.date.accessioned | 2025-10-10T07:49:13Z | -
dc.date.available | 2025-10-10T07:49:13Z | -
dc.date.issued | 2024 | -
dc.identifier.citation | 2024 IEEE/CIC International Conference on Communications in China (ICCC 2024), 2024, p. 1709-1714 | -
dc.identifier.uri | http://hdl.handle.net/10722/363769 | -
dc.description.abstract | Point clouds have become increasingly vital across various applications thanks to their ability to realistically depict 3D objects and scenes. Nevertheless, effectively compressing unstructured, high-precision point cloud data remains a significant challenge. In this paper, we present a pioneering point cloud compression framework capable of handling both geometry and attribute components. Unlike traditional approaches and existing learning-based methods, our framework utilizes two coordinate-based neural networks to implicitly represent a voxelized point cloud. The first network generates the occupancy status of a voxel, while the second network determines the attributes of an occupied voxel. To tackle the immense number of voxels within the volumetric space, we partition the space into smaller cubes and focus solely on voxels within non-empty cubes. By feeding the coordinates of these voxels into the respective networks, we reconstruct the geometry and attribute components of the original point cloud. The neural network parameters are further quantized and compressed. Experimental results underscore the superior performance of our proposed method compared to the octree-based approach employed in the latest G-PCC standards. Moreover, our method exhibits high universality when contrasted with existing learning-based techniques. | -
dc.language | eng | -
dc.relation.ispartof | 2024 IEEE/CIC International Conference on Communications in China (ICCC 2024) | -
dc.subject | implicit neural representation | -
dc.subject | neural network compression | -
dc.subject | Point cloud compression | -
dc.title | Point Cloud Compression with Implicit Neural Representations: A Unified Framework | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1109/ICCC62479.2024.10681880 | -
dc.identifier.scopus | eid_2-s2.0-85205332605 | -
dc.identifier.spage | 1709 | -
dc.identifier.epage | 1714 | -
