Reposted from: 爱可可-爱生活
There has been a tremendous surge in the amount of data generated. New types of data, such as genomic data [1], 3D/360-degree VR data, and autonomous-driving point-cloud data, are emerging. Significant human effort goes into analyzing the statistics of these new data formats to design good compressors. We know from information theory that good predictors make good compressors [2]. We also know that recurrent neural network (LSTM/GRU) models are good at capturing long-term dependencies [3] and can predict the next character or word very well. Can RNNs, then, be used efficiently for compression? We analyze the use of recurrent neural networks for the problem of data compression. The DeepZip compressor consists of two major blocks: an RNN-based probability estimator and an arithmetic coding block [4]. In the first section, we discuss the existing literature and the basic model framework. We then present experimental results on synthetic as well as real text and genomic datasets. Finally, we conclude by discussing our observations and further work.
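The predictor-plus-arithmetic-coding structure described above can be sketched in a few lines. This is a minimal, hedged illustration: in place of DeepZip's RNN probability estimator it uses a simple order-0 adaptive frequency model (my simplification, not the paper's method), and exact rational arithmetic instead of a practical fixed-precision arithmetic coder. The point is only to show the interface between the two blocks: the model supplies a probability interval for each symbol, and the coder narrows its working interval accordingly.

```python
from fractions import Fraction

class AdaptiveModel:
    """Order-0 adaptive frequency model: a stand-in for the RNN-based
    probability estimator (a hypothetical simplification for illustration)."""
    def __init__(self, alphabet):
        self.alphabet = list(alphabet)
        # Start every count at 1 (Laplace smoothing) so no symbol has probability 0.
        self.counts = {s: 1 for s in self.alphabet}

    def intervals(self):
        """Map each symbol to its half-open subinterval of [0, 1)."""
        total = sum(self.counts.values())
        low, out = Fraction(0), {}
        for s in self.alphabet:
            p = Fraction(self.counts[s], total)
            out[s] = (low, low + p)
            low += p
        return out

    def update(self, s):
        self.counts[s] += 1  # adapt after each symbol, as the decoder will too

def encode(msg, alphabet):
    """Arithmetic-encode msg into a single Fraction in the final interval."""
    model = AdaptiveModel(alphabet)
    low, high = Fraction(0), Fraction(1)
    for s in msg:
        a, b = model.intervals()[s]
        low, high = low + (high - low) * a, low + (high - low) * b
        model.update(s)
    return (low + high) / 2  # any number strictly inside the interval works

def decode(code, n, alphabet):
    """Replay the same model, picking the subinterval that contains `code`."""
    model = AdaptiveModel(alphabet)
    low, high = Fraction(0), Fraction(1)
    out = []
    for _ in range(n):
        for s, (a, b) in model.intervals().items():
            lo2, hi2 = low + (high - low) * a, low + (high - low) * b
            if lo2 <= code < hi2:
                out.append(s)
                low, high = lo2, hi2
                model.update(s)
                break
    return "".join(out)
```

Because encoder and decoder update identical models in lockstep, a better predictor (e.g. an RNN emitting next-symbol probabilities) can be dropped in for `AdaptiveModel` without changing the coder, which is exactly the modularity the two-block design exploits.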
Link:
https://web.stanford.edu/class/cs224n/reports/2761006.pdf
Code link:
https://github.com/kedartatwawadi/NN_compression
Original post link:
https://m.weibo.cn/1402400261/4191242302774294