---
pipeline_tag: image-to-3d
license: apache-2.0
---

# GeoSVR: Taming Sparse Voxels for Geometrically Accurate Surface Reconstruction

This repository provides the reconstructed meshes and resources for the paper [GeoSVR: Taming Sparse Voxels for Geometrically Accurate Surface Reconstruction](https://huggingface.co/papers/2509.18090), which presents an explicit voxel-based framework for accurate, detailed, and complete surface reconstruction.

*   [๐Ÿ“š Paper](https://huggingface.co/papers/2509.18090)
*   [๐ŸŒ Project Page](https://fictionarry.github.io/GeoSVR-project/)
*   [๐Ÿ’ป Code](https://github.com/Fictionarry/GeoSVR)

## Reconstruction on Tanks and Temples and DTU Datasets

Here we provide the meshes reconstructed by GeoSVR in the paper's experiments.

You can browse all the released meshes at:

-   `meshes_complete/`: The complete meshes for both datasets.

-   `DTU_meshes_eval/`: The meshes for the DTU dataset, processed with the strict filtering strategy used for evaluation.

-   `TnT_meshes_eval/`: The meshes for the TnT dataset, processed with the strict filtering strategy used for evaluation.

Metrics should be reproduced using the results with the `_eval` suffix.

## Download

```python
from huggingface_hub import snapshot_download

# Download all released meshes into ./GeoSVR/results
snapshot_download(repo_id="Fictionarry/GeoSVR", cache_dir="./GeoSVR/results", local_dir="./GeoSVR/results")
```
Alternatively, clone this repository with Git LFS.
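
If you only need one subset (for example, the DTU evaluation meshes), `snapshot_download` also accepts `allow_patterns` to restrict what is fetched. A minimal sketch, assuming the directory layout listed above; the `.ply` extension and file name in the commented check are assumptions:

```python
from huggingface_hub import snapshot_download

# Download only the DTU evaluation meshes (directory name from the listing above).
local_path = snapshot_download(
    repo_id="Fictionarry/GeoSVR",
    local_dir="./GeoSVR/results",
    allow_patterns=["DTU_meshes_eval/*"],
)

# Optional sanity check with trimesh (hypothetical file name, assumed .ply format):
# import trimesh
# mesh = trimesh.load("./GeoSVR/results/DTU_meshes_eval/scan24.ply")
# print(len(mesh.vertices), len(mesh.faces))
```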

## Citation
```bibtex
@article{li2025geosvr,
  title={GeoSVR: Taming Sparse Voxels for Geometrically Accurate Surface Reconstruction},
  author={Li, Jiahe and Zhang, Jiawei and Zhang, Youmin and Bai, Xiao and Zheng, Jin and Yu, Xiaohan and Gu, Lin},
  journal={Advances in Neural Information Processing Systems},
  year={2025}
}
```