We present TropNNC, a framework for compressing neural networks with linear and convolutional layers and ReLU activations using tropical geometry. By representing a network's output as a tropical rational function, TropNNC enables structured compression via reduction of the corresponding tropical polynomials. Our method refines the geometric approximation of previous work by adaptively selecting the weights of retained neurons. Key contributions include the first application of tropical geometry to convolutional layers and the tightest known theoretical compression bound. TropNNC requires access only to the network's weights, with no training data, and achieves competitive performance on MNIST, CIFAR, and ImageNet, matching strong baselines such as ThiNet and CUP.
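To make the abstract's tropical viewpoint concrete, the sketch below (our own illustration, not the authors' code; all array names are assumptions) shows how a one-hidden-layer ReLU network f(x) = c^T max(Ax + b, 0) + d can be written as a difference p(x) - q(x) of two convex piecewise-linear functions, each a tropical (max-plus) polynomial, which is the representation the compression operates on.

```python
# Minimal sketch: decompose a ReLU network into a tropical rational function p - q.
# This is an illustrative assumption-based example, not TropNNC itself.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 6
A = rng.standard_normal((n_hidden, n_in))   # hidden-layer weights
b = rng.standard_normal(n_hidden)           # hidden-layer biases
c = rng.standard_normal(n_hidden)           # output weights (mixed signs)
d = rng.standard_normal()                   # output bias

def forward(x):
    """Standard ReLU network forward pass."""
    return c @ np.maximum(A @ x + b, 0.0) + d

def tropical_pair(x):
    """Split c = c_plus - c_minus with c_plus, c_minus >= 0.
    Since c_plus_j >= 0, c_plus_j * max(z_j, 0) = max(c_plus_j * z_j, 0),
    so p and q are sums of maxima of affine functions, i.e. tropical polynomials."""
    z = A @ x + b
    c_plus, c_minus = np.maximum(c, 0.0), np.maximum(-c, 0.0)
    p = np.sum(np.maximum(c_plus * z, 0.0)) + max(d, 0.0)
    q = np.sum(np.maximum(c_minus * z, 0.0)) + max(-d, 0.0)
    return p, q

x = rng.standard_normal(n_in)
p, q = tropical_pair(x)
assert np.isclose(forward(x), p - q)   # f(x) = p(x) - q(x)
```

Under this view, pruning hidden neurons amounts to dropping terms from p and q, so a structured compression can be phrased as approximating these tropical polynomials with fewer monomials.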