Deep convolutional neural networks (CNNs) have demonstrated dominant performance in person re-identification (Re-ID). Existing CNN-based methods utilize global average pooling (GAP) to aggregate intermediate convolutional features for Re-ID. However, this strategy only considers the first-order statistics of local features and treats local features at all locations as equally important, leading to sub-optimal feature representations. To deal with these issues, we propose a novel \emph{weighted bilinear coding} (WBC) model for local feature aggregation in CNN networks to pursue more representative and discriminative feature representations. Specifically, bilinear coding is used to encode the channel-wise feature correlations to capture richer feature interactions. Meanwhile, a weighting scheme is applied to the bilinear coding to adaptively adjust the weights of local features at different locations based on their importance for recognition, further improving the discriminability of the aggregated features. To handle the spatial misalignment issue, we use a salient part net to derive salient body parts and apply the WBC model to each part. The final representation, formed by concatenating the WBC-encoded features of each part, is both discriminative and resistant to spatial misalignment. Experiments on three benchmarks, Market-1501, DukeMTMC-reID and CUHK03, demonstrate the favorable performance of our method against other state-of-the-art methods.
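As a minimal sketch of the weighted bilinear coding idea, the snippet below aggregates local features via weighted second-order (channel-correlation) statistics. The per-location weight vector `w` and the signed-square-root / L2 normalization are illustrative assumptions common in bilinear pooling, not necessarily the paper's exact formulation.

```python
import numpy as np

def weighted_bilinear_coding(X, w):
    """Illustrative weighted bilinear coding (not the paper's exact model).

    X : (N, C) array of local features at N spatial locations, C channels.
    w : (N,)   importance weight for each location (e.g. from an attention map).
    Returns a C*C-dimensional vector encoding weighted channel correlations.
    """
    # Weighted second-order statistics: B = sum_n w_n * x_n x_n^T, shape (C, C)
    B = (X * w[:, None]).T @ X
    y = B.flatten()
    # Signed square-root then L2 normalization, as is standard in bilinear pooling
    y = np.sign(y) * np.sqrt(np.abs(y))
    y = y / (np.linalg.norm(y) + 1e-12)
    return y

# Toy example: 4 spatial locations, 3 channels
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))
w = np.array([0.1, 0.4, 0.4, 0.1])  # hypothetical location weights summing to 1
f = weighted_bilinear_coding(X, w)
print(f.shape)  # (9,) -- i.e. C*C-dimensional descriptor
```

In the full method, one such descriptor would be computed per salient body part and the results concatenated, with the weights learned rather than fixed.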