We propose a new framework that generalizes the parameters of neural network models to $C^*$-algebra-valued ones. A $C^*$-algebra is a generalization of the space of complex numbers; a typical example is the space of continuous functions on a compact space. This generalization enables us to combine multiple models continuously and to apply tools for functions, such as regression and integration, to the parameters. As a result, we can learn features of data efficiently and adapt the models continuously to the problem at hand. We apply our framework to practical problems such as density estimation and few-shot learning, and show that it enables us to learn features of data even from a limited number of samples. Our framework highlights the potential of applying the theory of $C^*$-algebras to general neural network models.
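To make the idea concrete, here is a minimal sketch of a function-valued parameter, assuming the $C^*$-algebra $C([0,1])$ of continuous functions on $[0,1]$ and a simple interpolation between two ordinary weight matrices; the names `W0`, `W1`, `model`, and the averaging step are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two ordinary weight matrices for a toy one-layer model (hypothetical).
W0 = rng.standard_normal((3, 2))
W1 = rng.standard_normal((3, 2))

def W(t):
    """A C([0,1])-valued parameter: for each t in [0,1], an ordinary
    weight matrix. Linear interpolation is an assumption made here for
    illustration; the framework allows more general function-valued
    parameters."""
    return (1.0 - t) * W0 + t * W1

def model(x, t):
    """Evaluating the function-valued parameter at t yields one concrete
    neural network, so t indexes a continuum of models."""
    return np.tanh(W(t) @ x)

x = rng.standard_normal(2)

# One member of the continuum of models.
y_half = model(x, 0.5)

# Combining the continuum of models via integration over t,
# approximated here by averaging on a grid.
ts = np.linspace(0.0, 1.0, 101)
y_avg = np.mean([model(x, t) for t in ts], axis=0)
print(y_half.shape, y_avg.shape)
```

The point of the sketch is only that a single function-valued parameter packages an entire family of models, and that tools for functions (here, integration over $t$) act on the whole family at once.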