Conventional infrared and visible image fusion (IVIF) methods often assume high-quality inputs and neglect real-world degradations such as low light and noise, which limits their practical applicability. To address this, we propose a Degradation-Decoupled Fusion (DDFusion) framework, which achieves degradation decoupling and jointly models degradation suppression and image fusion in a unified manner. Specifically, the Degradation-Decoupled Optimization Network (DDON) performs degradation-specific decomposition to separate inter-degradation and degradation-information components, followed by component-specific extraction paths that effectively suppress degradations and enhance informative features. The Interactive Local-Global Fusion Network (ILGFN) aggregates complementary features across multi-scale pathways and alleviates the performance loss caused by decoupling degradation optimization from image fusion. Extensive experiments demonstrate that DDFusion achieves superior fusion performance under both clean and degraded conditions. Our code is available at https://github.com/Lmmh058/DDFusion.