arXiv:2506.05327

Revisiting Depth Representations for Feed-Forward 3D Gaussian Splatting

Published on Jun 5 · Submitted by lhmd on Jun 6
Authors:

AI-generated summary

PM-Loss, a regularization technique using pointmaps from a pre-trained transformer, enhances feed-forward 3D Gaussian Splatting by improving depth map accuracy and rendering quality.

Abstract

Depth maps are widely used in feed-forward 3D Gaussian Splatting (3DGS) pipelines by unprojecting them into 3D point clouds for novel view synthesis. This approach offers advantages such as efficient training, the use of known camera poses, and accurate geometry estimation. However, depth discontinuities at object boundaries often lead to fragmented or sparse point clouds, degrading rendering quality -- a well-known limitation of depth-based representations. To tackle this issue, we introduce PM-Loss, a novel regularization loss based on a pointmap predicted by a pre-trained transformer. Although the pointmap itself may be less accurate than the depth map, it effectively enforces geometric smoothness, especially around object boundaries. With the improved depth map, our method significantly improves feed-forward 3DGS across various architectures and scenes, delivering consistently better rendering results. Our project page: https://aim-uofa.github.io/PMLoss

Community

Paper author · Paper submitter

We introduce PM-Loss, a novel regularization loss based on a learned point map for feed-forward 3DGS.
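
The exact formulation of PM-Loss is not given on this page, so the sketch below only illustrates the idea described in the abstract: unproject the predicted depth map into 3D points and penalize their distance to a pointmap predicted by a pre-trained transformer (e.g. a DUSt3R-style model). It assumes the pointmap has already been aligned to the scene's world frame and uses a simple per-pixel L1 distance; the actual loss may use a different distance and its own alignment step. The function names unproject_depth and pm_loss are hypothetical.

```python
import torch

def unproject_depth(depth, K, c2w):
    """Unproject a depth map (H, W) into world-space points (H, W, 3).

    depth: (H, W) depth predicted by the feed-forward 3DGS backbone
    K:     (3, 3) camera intrinsics
    c2w:   (4, 4) camera-to-world pose
    """
    H, W = depth.shape
    v, u = torch.meshgrid(
        torch.arange(H, dtype=depth.dtype, device=depth.device),
        torch.arange(W, dtype=depth.dtype, device=depth.device),
        indexing="ij",
    )
    # Back-project pixels through the intrinsics and scale by depth.
    x = (u - K[0, 2]) / K[0, 0]
    y = (v - K[1, 2]) / K[1, 1]
    pts_cam = torch.stack([x * depth, y * depth, depth], dim=-1)  # (H, W, 3)
    # Transform from camera to world coordinates.
    return pts_cam @ c2w[:3, :3].T + c2w[:3, 3]

def pm_loss(depth, pointmap, K, c2w):
    """Illustrative point-map regularizer (not the paper's exact loss):
    per-pixel L1 distance between points unprojected from the predicted
    depth and a pointmap predicted by a pre-trained transformer, assumed
    already aligned to world space."""
    pts = unproject_depth(depth, K, c2w)
    return (pts - pointmap).abs().mean()
```

In a feed-forward 3DGS pipeline such a term would presumably be added to the photometric rendering loss, e.g. loss = render_loss + lambda_pm * pm_loss(depth, pointmap, K, c2w) with a small weight lambda_pm; this too is an assumption rather than the paper's training recipe.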


Models citing this paper 0

No model linking this paper


Datasets citing this paper 0

No dataset linking this paper


Spaces citing this paper 0

No Space linking this paper


Collections including this paper 1