Feature importance techniques have enjoyed widespread attention in the explainable AI literature as a means of determining how trained machine learning models make their predictions. We consider Shapley-value-based approaches to feature importance, applied in the context of time series data. We present closed-form solutions for the SHAP values of a number of time series models, including VARMAX. We also show how KernelSHAP can be applied to time series tasks, and how the resulting feature importances can be combined to perform "event detection". Finally, we explore the use of Time Consistent Shapley values for feature importance.
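To make the KernelSHAP step concrete, the sketch below shows one way KernelSHAP can be applied to a time series task; it is an illustrative assumption, not the paper's implementation. A model is fit on lagged copies of the series and `shap.KernelExplainer` attributes each one-step-ahead prediction to the individual lags (the `make_lagged` helper, `n_lags`, and the linear model are all hypothetical choices).

```python
# Minimal sketch (assumed setup, not the paper's code): KernelSHAP on a
# lag-based autoregressive model, attributing predictions to individual lags.
import numpy as np
import shap
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)

def make_lagged(y, n_lags):
    """Stack lagged values so row t holds [y_{t-n_lags}, ..., y_{t-1}]."""
    X = np.column_stack([y[i : len(y) - n_lags + i] for i in range(n_lags)])
    return X, y[n_lags:]

n_lags = 5
X, y = make_lagged(series, n_lags)
model = LinearRegression().fit(X, y)

# KernelSHAP needs a prediction function and background data; a small
# background sample keeps the kernel regression tractable.
background = shap.sample(X, 50)
explainer = shap.KernelExplainer(model.predict, background)
shap_values = explainer.shap_values(X[:10])  # importance of each lag, per prediction
print(shap_values.shape)  # (10, n_lags)
```

In this toy setup, the per-lag SHAP values for each prediction are the raw feature importances that a downstream step (such as the event detection described in the abstract) could aggregate over time.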