We study the relationship between the eluder dimension of a function class and a generalized notion of rank, defined for any monotone "activation" $\sigma : \mathbb{R} \to \mathbb{R}$, which corresponds to the minimal dimension required to represent the class as a generalized linear model. When $\sigma$ has derivatives bounded away from $0$, it is known that $\sigma$-rank gives rise to an upper bound on the eluder dimension for any function class; we show, however, that eluder dimension can be exponentially smaller than $\sigma$-rank. We also show that the condition on the derivative is necessary; namely, when $\sigma$ is the $\mathrm{relu}$ activation, eluder dimension can be exponentially larger than $\sigma$-rank.
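As an illustrative sketch of the notion described above (the symbols $\mathcal{F}$, $\mathcal{X}$, $d$, $\phi$, and $w_f$ are not fixed by the abstract and are introduced here only for concreteness), the $\sigma$-rank of a function class can be formalized as the smallest embedding dimension admitting a generalized linear representation:
\[
% Hedged sketch: one standard way to write the sigma-rank described above.
% All symbols below (\mathcal{F}, \mathcal{X}, d, \phi, w_f) are illustrative placeholders.
\mathrm{rank}_\sigma(\mathcal{F})
\;=\;
\min\Bigl\{ d \in \mathbb{N} \;:\;
  \exists\, \phi : \mathcal{X} \to \mathbb{R}^d,\;
  \{w_f\}_{f \in \mathcal{F}} \subseteq \mathbb{R}^d
  \text{ such that } f(x) = \sigma\bigl(\langle w_f, \phi(x) \rangle\bigr)
  \;\; \forall f \in \mathcal{F},\, x \in \mathcal{X}
\Bigr\}.
\]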