We consider low-distortion embeddings for subspaces under \emph{entrywise nonlinear transformations}. In particular, we seek embeddings that preserve the norm of all vectors in the set $S = \{y: y = f(x)\text{ for }x \in Z\}$, where $Z$ is a $k$-dimensional subspace of $\mathbb{R}^n$ and $f(x)$ is a nonlinear activation function applied entrywise to $x$. When $f$ is the identity, and so $S$ is just a $k$-dimensional subspace, it is known that, with high probability, a random embedding into $O(k/\epsilon^2)$ dimensions preserves the norm of all $y \in S$ up to $(1\pm \epsilon)$ relative error. Such embeddings are known as \emph{subspace embeddings}, and have found widespread use in compressed sensing and approximation algorithms. We give the first low-distortion embeddings for a wide class of nonlinear functions $f$. In particular, we give additive $\epsilon$-error embeddings into $O(\frac{k\log (n/\epsilon)}{\epsilon^2})$ dimensions for a class of nonlinearities that includes the popular Sigmoid, SoftPlus, and Gaussian functions. We strengthen this result to give relative error embeddings under some further restrictions, which are satisfied, e.g., by the Tanh, SoftSign, Exponential Linear Unit, and many other `soft' step functions and rectifying units. Understanding embeddings for subspaces under nonlinear transformations is a key step towards extending random sketching and compressed sensing techniques for linear problems to nonlinear ones. We discuss example applications of our results to improved bounds for compressed sensing via generative neural networks.
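For intuition, the following is a minimal numerical sketch in Python/NumPy, with illustrative (not theorem-tuned) parameter choices; it is not the construction from the paper. It applies a scaled Gaussian sketch $\Pi \in \mathbb{R}^{m \times n}$ with $m \approx k/\epsilon^2$ to sampled vectors $y = f(x)$, $x \in Z$, taking $f$ to be the entrywise sigmoid. Note that checking finitely many sampled points is already handled by the Johnson--Lindenstrauss lemma; the guarantee developed in the paper holds uniformly over all of $S$, which is where the extra $\log(n/\epsilon)$ factor in the target dimension enters.

\begin{verbatim}
import numpy as np

# Minimal illustrative sketch (not the paper's construction): empirically check
# that a scaled Gaussian embedding into m ~ k / eps^2 dimensions approximately
# preserves the norms of sampled vectors y = f(x), x in Z, where Z is a random
# k-dimensional subspace of R^n and f is the entrywise sigmoid.

rng = np.random.default_rng(0)
n, k, eps = 1000, 10, 0.25            # illustrative sizes, not tuned to any theorem
m = int(np.ceil(k / eps**2))          # target dimension, ignoring constant factors

Z = np.linalg.qr(rng.standard_normal((n, k)))[0]  # orthonormal basis of a k-dim subspace
Pi = rng.standard_normal((m, n)) / np.sqrt(m)     # Gaussian sketch; E||Pi y||^2 = ||y||^2

sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

worst = 0.0
for _ in range(1000):
    x = Z @ rng.standard_normal(k)    # random vector in the subspace Z
    y = sigmoid(x)                    # image under the entrywise nonlinearity
    ratio = np.linalg.norm(Pi @ y) / np.linalg.norm(y)
    worst = max(worst, abs(ratio - 1.0))

print(f"worst observed relative distortion over sampled y = f(x): {worst:.3f}")
\end{verbatim}

The Gaussian sketch is chosen here only because it is the simplest embedding with the $(1 \pm \epsilon)$ norm-preservation property for a fixed vector; any distribution with the Johnson--Lindenstrauss property would serve equally well in this demonstration.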