Numerous applications of machine learning involve representing probability distributions over high-dimensional data. We propose autoregressive quantile flows, a flexible class of normalizing flow models trained using a novel objective based on proper scoring rules. Our objective does not require calculating computationally expensive determinants of Jacobians during training and supports new types of neural architectures, such as neural autoregressive flows from which it is easy to sample. We leverage these models in quantile flow regression, an approach that parameterizes predictive conditional distributions with flows, resulting in improved probabilistic predictions on tasks such as time series forecasting and object detection. Our novel objective functions and neural flow parameterizations also yield improvements on popular generation and density estimation tasks, and represent a step beyond maximum likelihood learning of flows.
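The central idea sketched in the abstract, replacing maximum likelihood (and its Jacobian determinants) with a proper scoring rule on quantiles, can be illustrated with the pinball (quantile) loss. The snippet below is a minimal, hypothetical one-dimensional example of training a conditional quantile model with that objective; it is not the paper's architecture, and all class and function names (`ConditionalQuantileModel`, `pinball_loss`) are illustrative.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: a one-dimensional conditional quantile model
# q_theta(tau | x) trained with the pinball (quantile) loss, a proper scoring
# rule, instead of maximum likelihood. Names and architecture are hypothetical.

class ConditionalQuantileModel(nn.Module):
    def __init__(self, x_dim, hidden=64):
        super().__init__()
        # Maps the conditioning input x and a quantile level tau in (0, 1)
        # to an estimate of the tau-th conditional quantile of y given x.
        self.net = nn.Sequential(
            nn.Linear(x_dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, tau):
        return self.net(torch.cat([x, tau], dim=-1))

def pinball_loss(y, y_hat, tau):
    # Asymmetric absolute error; its minimizer is the tau-th quantile.
    diff = y - y_hat
    return torch.maximum(tau * diff, (tau - 1.0) * diff).mean()

# Toy data: y = 2x plus heteroscedastic noise.
torch.manual_seed(0)
x = torch.randn(1024, 1)
y = 2.0 * x + (0.5 + 0.5 * x.abs()) * torch.randn(1024, 1)

model = ConditionalQuantileModel(x_dim=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    tau = torch.rand(x.shape[0], 1)  # random quantile levels per example
    loss = pinball_loss(y, model(x, tau), tau)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After training, `model(x, tau)` approximates the conditional quantile function, i.e. the inverse CDF of y given x, so sampling reduces to evaluating it at a uniform draw u ~ U(0, 1); no Jacobian determinant is ever computed during training, which is the property the abstract highlights.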