Out-of-distribution detection is one of the most critical issues in the deployment of machine learning. The data analyst must ensure that data seen in operation comply with the conditions of the training phase, and must recognize when the environment has changed in a way that makes autonomous decisions no longer safe. The method presented in the paper is based on eXplainable Artificial Intelligence (XAI): it relies on several metrics to quantify the resemblance between in-distribution and out-of-distribution data, as seen by the XAI model. The approach is non-parametric and free of distributional assumptions. Validation over complex scenarios (predictive maintenance, vehicle platooning, covert channels in cybersecurity) corroborates both the accuracy of detection and the evaluation of the proximity between training and operation conditions.
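To make the idea concrete, the following is a minimal sketch of this kind of pipeline, not the paper's exact method: attributions from an XAI backend (here SHAP is assumed) are computed on training data and on operation data, and the two attribution distributions are compared per feature with a non-parametric two-sample Kolmogorov-Smirnov test. The model, data, and decision threshold are all illustrative assumptions.

```python
# Illustrative sketch: XAI-based drift/OOD signal via attribution comparison.
# Assumptions (not from the paper): SHAP as the XAI backend, a random forest
# regressor as the model, and a KS test as the non-parametric metric.
import numpy as np
import shap                                # pip install shap
from scipy.stats import ks_2samp
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Toy in-distribution training data and a fitted model.
X_train = rng.normal(size=(500, 4))
y_train = X_train[:, 0] + 0.5 * X_train[:, 1] + 0.1 * rng.normal(size=500)
model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

# Operation data with one shifted feature, emulating a changed environment.
X_op = rng.normal(size=(500, 4))
X_op[:, 0] += 2.0

# Attributions "as seen by the XAI model": one SHAP value per sample/feature.
explainer = shap.TreeExplainer(model)
phi_train = explainer.shap_values(X_train)
phi_op = explainer.shap_values(X_op)

# Non-parametric, distribution-free comparison: per-feature KS statistic
# between training-time and operation-time attribution distributions.
for j in range(X_train.shape[1]):
    stat, pval = ks_2samp(phi_train[:, j], phi_op[:, j])
    flag = "SHIFT?" if pval < 0.01 else "ok"   # illustrative threshold
    print(f"feature {j}: KS={stat:.3f}  p={pval:.1e}  {flag}")
```

The KS statistic doubles as a proximity score between training and operation conditions: values near zero indicate that the XAI model explains operation data the same way it explains training data, while large values flag a shift.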