arxiv:2507.01634

Depth Anything at Any Condition

Published on Jul 2
· Submitted by BBBBCHAN on Jul 3
#3 Paper of the day
Abstract

We present Depth Anything at Any Condition (DepthAnything-AC), a foundation monocular depth estimation (MDE) model capable of handling diverse environmental conditions. Previous foundation MDE models achieve impressive performance in general scenes but do not perform well in complex open-world environments with challenging conditions such as illumination variations, adverse weather, and sensor-induced distortions. To overcome the challenges of data scarcity and the difficulty of generating high-quality pseudo-labels from corrupted images, we propose an unsupervised consistency-regularization finetuning paradigm that requires only a relatively small amount of unlabeled data. Furthermore, we propose the Spatial Distance Constraint to explicitly encourage the model to learn patch-level relative relationships, resulting in clearer semantic boundaries and more accurate details. Experimental results demonstrate the zero-shot capabilities of DepthAnything-AC across diverse benchmarks, including real-world adverse-weather benchmarks, synthetic corruption benchmarks, and general benchmarks. Project Page: https://ghost233lism.github.io/depthanything-AC-page Code: https://github.com/HVision-NKU/DepthAnythingAC
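
The finetuning recipe described in the abstract can be pictured as a teacher-student consistency loss over unlabeled images: a frozen pretrained MDE teacher pseudo-labels the clean image, and a student copy is finetuned to reproduce that prediction on a corrupted version of the same image. Below is a minimal PyTorch sketch of that idea; the `TinyMDE` stand-in model, the `corrupt` perturbation, the L1 loss, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import copy
import torch
import torch.nn.functional as F


class TinyMDE(torch.nn.Module):
    """Stand-in for a pretrained foundation MDE model (illustrative only)."""

    def __init__(self):
        super().__init__()
        self.net = torch.nn.Conv2d(3, 1, kernel_size=3, padding=1)

    def forward(self, x):
        return self.net(x)  # (B, 3, H, W) -> (B, 1, H, W) depth map


def corrupt(images: torch.Tensor) -> torch.Tensor:
    """Illustrative photometric perturbation, e.g. simulated low light."""
    return (images * 0.3).clamp(0.0, 1.0)


teacher = TinyMDE().eval()                # frozen: pseudo-labels clean images
student = copy.deepcopy(teacher).train()  # finetuned copy: sees corrupted images
for p in teacher.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.AdamW(student.parameters(), lr=1e-5)


def consistency_step(unlabeled: torch.Tensor) -> torch.Tensor:
    """One unsupervised finetuning step on a batch of unlabeled images."""
    with torch.no_grad():
        pseudo_depth = teacher(unlabeled)        # pseudo-label from the clean view
    pred_depth = student(corrupt(unlabeled))     # prediction on the corrupted view
    loss = F.l1_loss(pred_depth, pseudo_depth)   # consistency regularization
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.detach()


if __name__ == "__main__":
    batch = torch.rand(2, 3, 224, 224)           # stand-in unlabeled batch
    print(float(consistency_step(batch)))
```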

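In the same spirit, here is a hedged sketch of what a patch-level "Spatial Distance Constraint" could look like: rather than matching absolute depth values, it matches the pairwise differences between patch-wise depth statistics of the prediction and the pseudo-label. The patch size, average pooling, and L1 penalty are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F


def patch_means(depth: torch.Tensor, patch: int = 14) -> torch.Tensor:
    """Average depth inside non-overlapping patches: (B, 1, H, W) -> (B, N)."""
    pooled = F.avg_pool2d(depth, kernel_size=patch, stride=patch)
    return pooled.flatten(start_dim=1)


def spatial_distance_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Penalize differences in pairwise patch-to-patch depth relationships."""
    p, t = patch_means(pred), patch_means(target)
    dist_p = p.unsqueeze(2) - p.unsqueeze(1)   # (B, N, N) relative patch distances
    dist_t = t.unsqueeze(2) - t.unsqueeze(1)
    return F.l1_loss(dist_p, dist_t)


if __name__ == "__main__":
    pred = torch.rand(2, 1, 224, 224)
    target = torch.rand(2, 1, 224, 224)
    print(float(spatial_distance_loss(pred, target)))
```

In a training loop, such a term would typically be added to the consistency loss above with a weighting hyperparameter.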
Community

BBBBCHAN (paper author and submitter):

DepthAnything-AC is a robust monocular depth estimation (MDE) model designed for zero-shot depth estimation under diverse and challenging environmental conditions, including low light, adverse weather, and sensor distortions.

Project page: https://ghost233lism.github.io/depthanything-AC-page/
Github: https://github.com/HVision-NKU/DepthAnythingAC

Please feel free to reach out to us if there are any issues.

