Commit 0020bc7 (verified) · Parent: 1f919df
Committed by yifutao and nielsr (HF Staff)

Improve dataset card: Add task categories, sample usage, HF paper link, and citation (#2)


Co-authored-by: Niels Rogge <[email protected]>

Files changed (1): README.md (+70 −3)
---
license: cc-by-nc-sa-4.0
task_categories:
- robotics
- image-to-3d
tags:
- slam
- lidar
- 3d-reconstruction
- nerf
- 3d-gaussian-splatting
- localization
- sfm
- mvs
- multimodal
- oxford
---

We present the Oxford Spires Dataset, captured in and around well-known landmarks in Oxford using a custom-built multi-sensor perception unit, together with a millimetre-accurate map from a terrestrial LiDAR scanner (TLS). The perception unit includes three global-shutter colour cameras, an automotive 3D LiDAR scanner, and an inertial sensor, all precisely calibrated.

- [Project page](https://dynamic.robots.ox.ac.uk/datasets/oxford-spires/)
- [Paper](https://huggingface.co/papers/2411.10546)
- [Arxiv](https://arxiv.org/abs/2411.10546)
- [Video](https://youtu.be/AKZ-YrOob_4?si=rY94Gn96V2zfQBNH)
- [Code](https://github.com/ori-drs/oxford_spires_dataset)

### Sample Usage

#### Download the Dataset

You can download the dataset from Hugging Face using the provided script. Specify which folders to download by changing `example_pattern`; the core sequences are also defined in the script.

```bash
python scripts/dataset_download.py
```
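If you prefer not to go through the script, the same selective download can be done with the Hugging Face Hub client. This is a minimal sketch: the `sequence_patterns`/`download_sequences` helpers, the repo id, and the folder layout are illustrative assumptions, not the dataset's actual API.

```python
def sequence_patterns(sequences):
    """Turn sequence folder names into glob patterns for selective download."""
    return [f"{seq}/*" for seq in sequences]


def download_sequences(repo_id, sequences, local_dir):
    """Fetch only the matching folders from a Hugging Face dataset repo."""
    # Local import so the helper above works without huggingface_hub installed.
    from huggingface_hub import snapshot_download

    return snapshot_download(
        repo_id=repo_id,
        repo_type="dataset",
        allow_patterns=sequence_patterns(sequences),
        local_dir=local_dir,
    )


# Hypothetical call (repo id and sequence name are placeholders):
# download_sequences("ori-drs/oxford_spires_dataset",
#                    ["some-sequence-folder"], "./oxford_spires")
```

`allow_patterns` restricts the snapshot to matching paths, so only the selected sequences are transferred.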

#### Install Python Tools

Install `oxspires_tools` to access the Python utilities for working with the dataset:

```bash
pip install .
```

To enable the C++/Python bindings (requires PCL and Octomap):

```bash
BUILD_CPP=1 pip install .
```

Alternatively, use the provided Docker container:

```bash
docker compose -f .docker/oxspires/docker-compose.yml run --build oxspires_utils
```

#### Generate Depth Images

The following script downloads synchronised images and LiDAR data for a sequence from Hugging Face and generates depth images, LiDAR points overlaid on the camera images, and surface-normal images:

```bash
python scripts/generate_depth.py
```
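Conceptually, the depth images come from projecting LiDAR points into the camera through the calibrated intrinsics and keeping the nearest return per pixel. The sketch below assumes a plain pinhole model with points already in the camera frame; `project_depth` and its parameters are illustrative, not the script's API, and the real pipeline also handles distortion, synchronisation, and surface normals.

```python
import numpy as np


def project_depth(points_cam, K, h, w):
    """Project 3D points in the camera frame into an (h, w) depth image.

    points_cam: (N, 3) array in metres, z pointing forward.
    K: (3, 3) pinhole intrinsic matrix.
    Pixels with no LiDAR return stay 0.
    """
    pts = points_cam[points_cam[:, 2] > 0]     # keep points in front of the camera
    uv = (K @ pts.T).T                         # homogeneous pixel coordinates
    u = np.round(uv[:, 0] / uv[:, 2]).astype(int)
    v = np.round(uv[:, 1] / uv[:, 2]).astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    u, v, z = u[inside], v[inside], pts[inside, 2]

    depth = np.zeros((h, w))
    # Write far-to-near so the nearest return wins when points share a pixel.
    order = np.argsort(-z)
    depth[v[order], u[order]] = z[order]
    return depth
```

Points behind the camera are discarded, and occlusions are resolved by letting the smallest depth overwrite larger ones at the same pixel.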

### Citation

If you use the Oxford Spires Dataset in your research, please cite the following paper:

```bibtex
@article{tao2025spires,
  title={The Oxford Spires Dataset: Benchmarking Large-Scale LiDAR-Visual Localisation, Reconstruction and Radiance Field Methods},
  author={Tao, Yifu and Mu{\~n}oz-Ba{\~n}{\'o}n, Miguel {\'A}ngel and Zhang, Lintong and Wang, Jiahao and Fu, Lanke Frank Tarimo and Fallon, Maurice},
  journal={International Journal of Robotics Research},
  year={2025}
}
```