Accurate and unbiased examination of skin lesions is critical for the early diagnosis and treatment of skin diseases. The visual features of skin lesions vary significantly because images are collected from patients with different lesion colours and morphologies using dissimilar imaging equipment. Recent studies have reported that ensembled convolutional neural networks (CNNs) are effective for classifying such images for the early diagnosis of skin disorders. However, the practical use of these ensembled CNNs is limited because the networks are heavyweight and inadequate for processing contextual information. Although lightweight networks (e.g., MobileNetV3 and EfficientNet) were developed to reduce parameters so that deep neural networks can be deployed on mobile devices, the insufficient depth of their feature representation restricts their performance. To address these limitations, we develop a new lightweight and effective neural network, HierAttn. HierAttn applies a novel deep supervision strategy to learn local and global features through multi-stage and multi-branch attention mechanisms with only one training loss. The efficacy of HierAttn was evaluated on the dermoscopy image dataset ISIC2019 and the smartphone photo dataset PAD-UFES-20 (PAD2020). The experimental results show that HierAttn achieves the best accuracy and area under the curve (AUC) among state-of-the-art lightweight networks. The code is available at https://github.com/anthonyweidai/HierAttn.
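The abstract describes fusing local and global features via multi-branch attention so that a single classifier head (and hence a single training loss) sees both. As a rough illustrative sketch only, not the authors' implementation, the idea of a "local" branch attending within small windows and a "global" branch attending over all tokens can be expressed with plain scaled dot-product attention; all shapes and the window size here are hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    return softmax(scores) @ v

rng = np.random.default_rng(0)
n, d = 8, 16                       # hypothetical token count and width
x = rng.standard_normal((n, d))

# Global branch: full self-attention over all tokens.
global_feat = attention(x, x, x)

# Local branch: attention restricted to non-overlapping windows of 2 tokens.
local_feat = np.concatenate(
    [attention(x[i:i + 2], x[i:i + 2], x[i:i + 2]) for i in range(0, n, 2)]
)

# Concatenate both branches so one head sees local and global context.
fused = np.concatenate([local_feat, global_feat], axis=-1)
print(fused.shape)  # (8, 32)
```

The single-loss aspect would then follow naturally: only the fused representation feeds the classification loss, rather than attaching an auxiliary loss to each branch.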