Title: Density-Aware Diffusion Model for Efficient Image Dehazing
Authors: Zhang, Ling; Bai, Wenxu; Xiao, Chunxia
Editors: Chen, Renjie; Ritschel, Tobias; Whiting, Emily
Date: 2024-10-13
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.15221
URI: https://diglib.eg.org/handle/10.1111/cgf15221
Pages: 12
License: Attribution 4.0 International License
CCS Concepts: Computing methodologies → Image dehazing
Keywords: Density-aware; Diffusion model

Abstract: Existing image dehazing methods have made remarkable progress. However, they generally perform poorly on images with dense haze and often produce unsatisfactory results with detail degradation or color distortion. In this paper, we propose a density-aware diffusion model (DADM) for image dehazing. Guided by the haze density, our DADM can handle images with dense haze and complex environments. Specifically, we introduce a density-aware dehazing network (DADNet) in the reverse diffusion process, which helps DADM gradually recover a clear, haze-free image from a hazy image. To improve the performance of the network, we design a cross-feature density extraction module (CDEModule) to extract the haze density of the image and a density-guided feature fusion block (DFFBlock) to learn effective contextual features. Furthermore, we introduce an indirect sampling strategy in the test-time sampling process, which not only suppresses the accumulation of errors but also ensures the stability of the results. Extensive experiments on popular benchmarks validate the superior performance of the proposed method. The code is released at https://github.com/benchacha/DADM.
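The abstract describes recovering a haze-free image by running a reverse diffusion process conditioned on an estimated haze-density map. As a rough intuition for that pipeline (not the paper's DADNet, CDEModule, or indirect sampling strategy; `estimate_density` and the denoising update below are simplified stand-ins invented for illustration), a density-guided reverse sampling loop could be sketched as:

```python
import numpy as np

def estimate_density(hazy):
    # Toy stand-in for the paper's CDEModule: use local brightness
    # as a crude proxy for haze density (bright regions = denser haze).
    return np.clip(hazy.mean(axis=-1, keepdims=True), 0.0, 1.0)

def reverse_dehaze(hazy, steps=10, rng=None):
    """Toy density-guided reverse diffusion over a hazy image in [0, 1]."""
    rng = rng or np.random.default_rng(0)
    density = estimate_density(hazy)
    # Start from Gaussian noise, as in standard DDPM sampling.
    x = rng.standard_normal(hazy.shape)
    for t in range(steps, 0, -1):
        alpha = t / steps
        # Denoiser stand-in: pull the sample toward a dehazed estimate,
        # subtracting more from regions where estimated density is high.
        x = alpha * x + (1.0 - alpha) * (hazy - 0.5 * density)
    return np.clip(x, 0.0, 1.0)

hazy = np.full((4, 4, 3), 0.8)
clear = reverse_dehaze(hazy)
```

In the actual DADM, a learned network replaces the hand-crafted update at every step, and the density map is itself predicted from cross-level features rather than from brightness alone.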