In recent years, there has been increasing interest in incorporating attention into deep learning architectures for biomedical image segmentation. The modular design of attention mechanisms enables flexible integration into convolutional neural network architectures such as the U-Net. Whether attention is appropriate to use, what type of attention to use, and where in the network to incorporate attention modules are all important considerations that are currently overlooked. In this paper, we investigate the role of the Focal parameter in modulating attention, revealing a link between attention in loss functions and attention in networks. By incorporating a Focal distance penalty term, we extend the Unified Focal loss framework to include boundary-based losses. Furthermore, we develop a simple, interpretable, dataset- and model-specific heuristic for integrating the Focal parameter into the Squeeze-and-Excitation block and Attention Gate, achieving optimal performance with fewer attention modules on three well-validated biomedical imaging datasets. This suggests that judicious use of attention modules yields better performance and efficiency.
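To illustrate the role of the Focal parameter referred to above, the following is a minimal NumPy sketch of the standard binary Focal loss, FL(p_t) = -(1 - p_t)^gamma log(p_t), which underlies the Unified Focal loss framework; the function name and signature are illustrative, not taken from the paper's code.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0):
    """Binary focal loss: FL(p_t) = -(1 - p_t)^gamma * log(p_t).

    gamma is the Focal parameter: it down-weights easy, well-classified
    examples, concentrating the loss (the "attention") on hard ones.
    With gamma = 0 this reduces to the ordinary cross-entropy loss.
    """
    # Probability assigned to the true class for each example
    p_t = np.where(y == 1, p, 1.0 - p)
    return -((1.0 - p_t) ** gamma) * np.log(p_t)

# An easy, confident prediction (p_t = 0.9) is suppressed far more
# strongly than a hard one (p_t = 0.1) once gamma > 0.
easy = focal_loss(np.array([0.9]), np.array([1]), gamma=2.0)
hard = focal_loss(np.array([0.1]), np.array([1]), gamma=2.0)
```

The same modulating factor (1 - p_t)^gamma acts as a form of attention within the loss function, which is the link to network attention modules that the paper investigates.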