Outcomes with a natural order commonly occur in prediction tasks, and often the available input data are a mixture of complex data, such as images, and tabular predictors. Deep Learning (DL) methods are state-of-the-art for image classification tasks but frequently treat ordinal outcomes as unordered and lack interpretability. In contrast, classical ordinal regression models account for the outcome's order and yield interpretable predictor effects but are limited to tabular data. We present ordinal neural network transformation models (ONTRAMs), which unite DL with classical ordinal regression methods. ONTRAMs are a special case of transformation models and trade off flexibility against interpretability by additively decomposing the transformation function into terms for image and tabular data using jointly trained neural networks. We discuss how to interpret model components for both tabular and image data. The proposed ONTRAMs achieve on-par performance with common DL models while being directly interpretable and more efficient in training.
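To make the additive decomposition concrete, the following is a minimal sketch of an ONTRAM-style model in PyTorch. It is an illustrative assumption, not the paper's exact architecture: ordered cutpoints form the baseline transformation, a small CNN contributes a scalar shift for the image, and a linear term contributes an interpretable shift for the tabular predictors; all parts are trained jointly on the ordinal negative log-likelihood.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch: P(Y <= y_k | image, tabular) = sigmoid(theta_k - eta_img - eta_tab).
# Architecture sizes and the sign convention are assumptions for illustration only.
class SimpleONTRAM(nn.Module):
    def __init__(self, n_classes: int, n_tabular: int):
        super().__init__()
        # Unconstrained parameters mapped to ordered cutpoints theta_1 < ... < theta_{K-1}
        self.raw_cutpoints = nn.Parameter(torch.zeros(n_classes - 1))
        # Small CNN producing a scalar image term eta_img
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 1),
        )
        # Linear shift for tabular predictors; weights are interpretable as additive effects
        self.tabular = nn.Linear(n_tabular, 1, bias=False)

    def cutpoints(self) -> torch.Tensor:
        # Enforce ordering via a cumulative sum of positive increments
        increments = torch.cat([self.raw_cutpoints[:1],
                                F.softplus(self.raw_cutpoints[1:])])
        return torch.cumsum(increments, dim=0)

    def forward(self, image: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        eta = self.cnn(image) + self.tabular(tabular)   # (batch, 1) additive shift
        h = self.cutpoints().unsqueeze(0) - eta         # (batch, K-1) transformation
        cdf = torch.sigmoid(h)                          # P(Y <= y_k)
        ones = torch.ones_like(cdf[:, :1])
        zeros = torch.zeros_like(cdf[:, :1])
        # Class probabilities as differences of adjacent cumulative probabilities
        probs = torch.cat([cdf, ones], dim=1) - torch.cat([zeros, cdf], dim=1)
        return probs

# Usage sketch: minimize the negative log-likelihood of the observed ordinal class.
model = SimpleONTRAM(n_classes=5, n_tabular=3)
probs = model(torch.randn(4, 1, 28, 28), torch.randn(4, 3))
y = torch.tensor([0, 2, 4, 1])
nll = -torch.log(probs[torch.arange(4), y]).mean()
```

Because the tabular term enters the transformation function linearly, its weights can be read as shifts on the latent scale (log-odds under the sigmoid link), which is what makes this part of the model directly interpretable.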