Abstract
The autoregressive entropy model achieves high compression efficiency by capturing intricate dependencies, but its serial context dependencies make decoding slow. To address this, we propose ParaPCAC, a lossy Parallel Point Cloud Attribute Compression scheme designed to improve the efficiency of the autoregressive entropy model. Our approach comprises two main components: a parallel decoding strategy and a multi-stage context-based entropy model. In the parallel decoding strategy, we partition the voxels of the quantized latent features into non-overlapping groups for independent context entropy modeling, enabling parallel processing. The multi-stage context-based entropy model decodes neighboring features concurrently, utilizing the features already decoded at each stage. A global hyperprior is incorporated after the first stage to improve the estimation of attribute probabilities. Through these two techniques, ParaPCAC achieves significant decoding speed-ups, with an acceleration of up to 160× and a 24.15% BD-Rate reduction compared to serial autoregressive entropy models. Furthermore, experimental results demonstrate that ParaPCAC outperforms existing learning-based methods in both rate-distortion performance and decoding latency.
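The group-then-decode-in-stages idea described above can be sketched as follows. This is a minimal illustration only, not the paper's actual entropy model: the grouping rule (coordinate sum modulo the number of stages) and the function names are assumptions, and the "decoding" is mocked so the stage-wise dependency structure is visible.

```python
import numpy as np

def partition_groups(coords, num_stages=3):
    """Assign each voxel coordinate to one of `num_stages` non-overlapping
    groups (hypothetical rule: sum of coordinates modulo num_stages), so the
    groups can be entropy-decoded stage by stage in parallel."""
    return (coords.sum(axis=1) % num_stages).astype(int)

def multistage_decode(coords, num_stages=3):
    """Toy stage-wise decoding loop: at each stage, every voxel in the current
    group is marked decoded at once, conditioned only on voxels decoded in
    earlier stages (here just counted, not fed to a real entropy model)."""
    groups = partition_groups(coords, num_stages)
    decoded = np.zeros(len(coords), dtype=bool)
    context_sizes = []
    for stage in range(num_stages):
        mask = groups == stage
        context_sizes.append(int(decoded.sum()))  # context available this stage
        decoded[mask] = True  # whole group processed in parallel
    return decoded, context_sizes
```

Because every voxel belongs to exactly one group, each stage can run its voxels concurrently, while later stages see a strictly larger decoded context, which is where the rate savings over a stage-free factorized model would come from.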
| Original language | English |
|---|---|
| Article number | e106 |
| Journal | APSIPA Transactions on Signal and Information Processing |
| Volume | 14 |
| Issue number | 2 |
| State | Published - 23 Apr 2025 |
| Externally published | Yes |
Keywords
- Point cloud compression
- learned data compression
- point cloud attribute compression
Title: Efficient Multi-stage Context Based Entropy Model for Learned Lossy Point Cloud Attribute Compression