We present PYSKL: an open-source toolbox for skeleton-based action recognition built on PyTorch. The toolbox supports a wide variety of skeleton action recognition algorithms, including approaches based on GCNs and CNNs. In contrast to existing open-source skeleton action recognition projects that include only one or two algorithms, PYSKL implements six different algorithms under a unified framework, incorporating both original and up-to-date good practices to ease comparisons of efficacy and efficiency. We also provide an original GCN-based skeleton action recognition model named ST-GCN++, which achieves competitive recognition performance without any complicated attention schemes and serves as a strong baseline. Meanwhile, PYSKL supports the training and testing of nine skeleton-based action recognition benchmarks and achieves state-of-the-art recognition performance on eight of them. To facilitate future research on skeleton action recognition, we also provide a large number of trained models and detailed benchmark results to offer some insights. PYSKL is released at https://github.com/kennymckormick/pyskl and is actively maintained. We will update this report as we add new features or benchmarks. The current version corresponds to PYSKL v0.2.