|
## Motivation |
|
|
|
Given the safety concerns and high costs associated with real-world autonomous driving testing, high-fidelity simulation techniques have become crucial for advancing the capabilities of autonomous systems. |
|
|
|
This workshop seeks to answer the following questions: |
|
1. How well can we render? While novel view synthesis (NVS) methods have made significant progress in generating photorealistic urban scenes, their performance still lags at extrapolated viewpoints when only limited viewpoints are available during training. However, extrapolated viewpoints are essential for closed-loop simulation. Improving the accuracy and consistency of NVS across diverse viewing angles is critical for ensuring that these simulators provide reliable environments for driving evaluation.

2. How well can we drive? Despite the challenges of extrapolated-viewpoint rendering, existing methods already enable photorealistic simulators with reasonable performance when trained on dense views. These NVS-based simulators allow autonomous driving models to be tested in a fully closed-loop manner, bridging the gap between real-world data and interactive evaluation. This shift allows autonomous driving algorithms to be benchmarked under realistic conditions, overcoming the limitations of static datasets.
|
|
|
This challenge focuses on the second question regarding the performance of autonomous driving algorithms in closed-loop evaluation. If you are interested in the first topic, please refer to the [competition of extrapolated novel view synthesis](https://huggingface.co/spaces/XDimLab/ICCV2025-RealADSim-NVS). |
|
|
|
## Task Description |
|
 |
|
The challenge focuses on evaluating autonomous driving algorithms using [HUGSIM](https://github.com/hyzhou404/HUGSIM), a photorealistic, closed-loop driving simulator. HUGSIM provides a diverse set of challenging, photorealistic urban scenarios, including oncoming traffic and cut-in behaviors.
|
|
|
Unlike typical challenges that accept submissions as static files, this competition requires participants to submit models and code, because closed-loop evaluation demands real-time interaction between the driving algorithm and the simulator at every timestep. Please note that your submissions are securely protected and will only be used for closed-loop evaluation. For more information on how we handle your data and ensure privacy, please refer to the Privacy Assurance section below.
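To illustrate what per-timestep interaction means in practice, below is a minimal sketch of a closed-loop evaluation loop. The names (`DrivingAgent`, `env.reset`, `env.step`, the action dictionary) are illustrative placeholders, not HUGSIM's actual interface; please follow the submission template provided with the challenge.

```python
# Hypothetical closed-loop evaluation sketch (illustrative only, not HUGSIM's actual API).

class DrivingAgent:
    def act(self, observation):
        # Your policy: map rendered sensor observations to a driving command.
        return {"steer": 0.0, "throttle": 0.1, "brake": 0.0}

def run_episode(env, agent, max_steps=400):
    obs = env.reset()                       # simulator renders the initial views
    info = {}
    for _ in range(max_steps):
        action = agent.act(obs)             # your model decides at every timestep
        obs, done, info = env.step(action)  # simulator re-renders from the new pose
        if done:
            break
    return info                             # per-episode statistics used for scoring
```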
|
|
|
## Evaluation Metric |
|
The primary metric of this challenge is the HD-Score (HUGSIM Driving Score), defined as:

![metric](./assets/metric.png)
|
The HD-Score at a single timestep consists of two categories of components:

- Driving policy items, including no collisions (NC) and drivable area compliance (DAC), which are critical for driving safety.

- Contributory items, including time-to-collision (TTC) and comfort (COM), which do not directly cause failure cases when low but still contribute to overall driving quality. The TTC and COM terms are weighted by 5 and 2, respectively.

The HD-Score is computed as a weighted average across all simulation timesteps and then multiplied by the global route completion score R<sub>c</sub>.
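Read literally, the description above suggests a per-timestep score gated by the safety-critical terms and then averaged over the episode. The following is only a sketch of that reading, assuming NC and DAC act multiplicatively and the contributory terms are normalized by their weights (5 and 2); the formula image above and the HUGSIM paper give the authoritative definition.

```latex
% Sketch of the HD-Score under the stated assumptions (not the authoritative formula).
\mathrm{HD\text{-}Score}
  = R_c \cdot \frac{1}{T} \sum_{t=1}^{T}
    \mathrm{NC}_t \cdot \mathrm{DAC}_t \cdot
    \frac{5\,\mathrm{TTC}_t + 2\,\mathrm{COM}_t}{5 + 2}
```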
|
|
|
## Privacy Assurance |
|
Your model checkpoints should be stored in a private Huggingface model repository.
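For example, checkpoints can be pushed to a private repository with the `huggingface_hub` Python client; the repository name and local folder below are placeholders.

```python
from huggingface_hub import HfApi

api = HfApi()  # assumes you are already logged in, e.g. via `huggingface-cli login`

# "your-team/your-model" is a placeholder repository name.
api.create_repo("your-team/your-model", repo_type="model", private=True, exist_ok=True)
api.upload_folder(
    folder_path="./checkpoints",   # local directory containing your model weights
    repo_id="your-team/your-model",
    repo_type="model",
)
```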
|
|
|
The tested algorithms are hosted on a Huggingface server instance, which is destroyed once the evaluation is completed. We do not have access to this server at any point.
|
|
|
To ensure transparency and prevent any form of cheating, the server’s behavior is fully defined in a Dockerfile, which is shared with all participants. |
|
|
|
## Timeline |
|
- Challenge Release: June 30, 2025 |
|
- Challenge Submission Due: August 31, 2025 |
|
- Release Results & Submit Technical Report: September 05, 2025 |
|
- Technical Report Due: September 20, 2025 |
|
|
|
## Award

Winners will be announced at the ICCV2025 Workshop.

- Innovation Award: $9,000

- Outstanding Champion: $9,000

- Honorable Runner-up: $3,000
|
|
|
## How to Participate |
|
1. Click the "Login with Huggingface" button. |
|
2. Click the "Register" button and complete the form. |
|
3. The "Submission Information" page will be available once you submit the form. |
|
4. We will review the submitted forms and grant authorization to submit results.
|
|
|
|
|
|
## Citation |
|
If you find our work useful, please kindly cite us via: |
|
```bibtex
@article{zhou2024hugsim,
  title={HUGSIM: A Real-Time, Photo-Realistic and Closed-Loop Simulator for Autonomous Driving},
  author={Zhou, Hongyu and Lin, Longzhong and Wang, Jiabao and Lu, Yichong and Bai, Dongfeng and Liu, Bingbing and Wang, Yue and Geiger, Andreas and Liao, Yiyi},
  journal={arXiv preprint arXiv:2412.01718},
  year={2024}
}
```