Neural signed distance functions (SDFs) have become a powerful representation for geometric reconstruction from point clouds, yet they often require both gradient- and curvature-based regularization to suppress spurious warping and preserve structural fidelity. FlatCAD introduced the Off-Diagonal Weingarten (ODW) loss as an efficient second-order prior for CAD surfaces, approximating full-Hessian regularization at roughly half the computational cost. However, FlatCAD applies a fixed ODW weight throughout training, which is suboptimal: strong regularization stabilizes early optimization but suppresses detail recovery in later stages. We present scheduling strategies for the ODW loss that assign a high initial weight to stabilize optimization and progressively decay it to permit fine-scale refinement. We investigate constant, linear, quintic, and step interpolation schedules, as well as an increasing warm-up variant. Experiments on the ABC CAD dataset demonstrate that time-varying schedules consistently outperform fixed weights. Our method achieves up to a 35% improvement in Chamfer Distance over the FlatCAD baseline, establishing scheduling as a simple yet effective extension of curvature regularization for robust CAD reconstruction.
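The schedules named above can be sketched as simple weight functions of normalized training progress. This is a minimal illustrative sketch, not the paper's exact formulation: the function name, parameter names, the halfway drop point for the step schedule, and the specific quintic form `(1 - t)^5` are all assumptions.

```python
def odw_weight(step, total_steps, w_init=1.0, w_final=0.0, mode="linear"):
    """Illustrative ODW-loss weight schedules (hypothetical formulas).

    Decaying modes start at w_init and anneal toward w_final;
    the warm-up variant increases instead.
    """
    t = min(step / total_steps, 1.0)  # normalized progress in [0, 1]
    if mode == "constant":
        return w_init                  # fixed weight, as in the FlatCAD baseline
    if mode == "linear":
        f = 1.0 - t                    # uniform decay
    elif mode == "quintic":
        f = (1.0 - t) ** 5             # fast early decay, long low-weight tail
    elif mode == "step":
        f = 1.0 if t < 0.5 else 0.0    # single drop (halfway point is an assumption)
    elif mode == "warmup":
        f = t                          # increasing variant
    else:
        raise ValueError(f"unknown schedule: {mode}")
    return w_final + (w_init - w_final) * f
```

In a training loop, the returned value would simply scale the ODW term of the loss at each iteration, e.g. `loss = loss_data + odw_weight(i, n_steps, mode="quintic") * loss_odw`.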