Outcomes with a natural order commonly occur in prediction tasks, and the available input data are often a mixture of complex data, such as images, and tabular predictors. Deep learning (DL) models are state-of-the-art for image classification tasks but frequently treat ordinal outcomes as unordered and lack interpretability. In contrast, classical ordinal regression models account for the outcome's order and yield interpretable predictor effects, but are limited to tabular data. We present ordinal neural network transformation models (ONTRAMs), which unite DL with classical ordinal regression approaches. ONTRAMs are a special case of transformation models and trade off flexibility and interpretability by additively decomposing the transformation function into terms for image and tabular data using jointly trained neural networks. The performance of the most flexible ONTRAM is by definition equivalent to that of a standard multi-class DL model trained with cross-entropy, while training is faster for ordinal outcomes. Lastly, we discuss how to interpret model components for both tabular and image data on two publicly available datasets.
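The additive decomposition described above can be sketched numerically. The following is a minimal, hypothetical illustration (not the authors' implementation): cumulative class probabilities follow a logistic link applied to monotone cutpoints shifted by an additive predictor, here a linear term for tabular data plus a stand-in scalar for the image network's output. The function names, the softplus reparameterization of the cutpoints, and the toy data are all assumptions made for this sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ordinal_probs(gamma, shift):
    """Class probabilities of a shift-type ordinal transformation model.

    gamma : unconstrained parameters; monotone cutpoints theta_1 < ... <
            theta_{K-1} are obtained via cumulative softplus increments.
    shift : additive predictor per observation, e.g. beta @ x_tab + eta_image.
    """
    theta = np.cumsum(np.concatenate([gamma[:1], np.log1p(np.exp(gamma[1:]))]))
    # Cumulative probabilities P(Y <= k | x) with a logistic link;
    # the same shift applies to every cutpoint.
    cdf = sigmoid(theta[None, :] - shift[:, None])          # shape (n, K-1)
    n = len(shift)
    cdf = np.concatenate([np.zeros((n, 1)), cdf, np.ones((n, 1))], axis=1)
    return np.diff(cdf, axis=1)                             # shape (n, K)

# Toy example with K = 4 classes and n = 5 observations.
rng = np.random.default_rng(0)
x_tab = rng.normal(size=(5, 2))
beta = np.array([0.5, -1.0])          # interpretable tabular effects
eta_img = 0.1 * rng.normal(size=5)    # stand-in for a CNN's scalar output
probs = ordinal_probs(np.array([-1.0, 0.0, 0.5]), x_tab @ beta + eta_img)
```

In a full model, `beta` and the image network producing `eta_img` would be trained jointly by maximizing the likelihood implied by `probs`; the tabular coefficients retain their log-odds interpretation because the shift enters the transformation function linearly.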