## [Recommended] Understanding LSTM with TensorFlow

September 11, 2017 · 机器学习研究会 (Machine Learning Research Group)

Long Short Term Memory (LSTM) networks are the most common type of Recurrent Neural Network in use today. They are mostly used with sequential data. An in-depth look at LSTMs can be found in this incredible blog post.

### Our Aim

As the title suggests, the main aim of this blog post is to make the reader comfortable with the implementation details of a basic LSTM network in TensorFlow.

To fulfil this aim, we will use MNIST as our dataset.

### The MNIST dataset

The MNIST dataset consists of images of handwritten digits and their corresponding labels. We can download and read the data in TensorFlow with the help of the following built-in functionality:

```python
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
```

The data is split into three parts:

- Training data (`mnist.train`): 55,000 images
- Test data (`mnist.test`): 10,000 images
- Validation data (`mnist.validation`): 5,000 images

### Shape of the data

Let us discuss the shape with respect to the training split of the MNIST dataset. The shapes of all three splits are identical.

The training set consists of 55,000 images of 28 pixels x 28 pixels each. These 784 (28x28) pixel values are flattened into a single vector of dimensionality 784. The collection of all 55,000 such pixel vectors (one per image) is stored as a numpy array of shape (55000, 784) and is referred to as `mnist.train.images`.

Each of these 55,000 training images is associated with a label representing the class to which that image belongs. There are 10 such classes (0, 1, 2, ..., 9). Class labels are represented in one-hot encoded form, so the labels are stored as a numpy array of shape (55000, 10) and are referred to as `mnist.train.labels`.
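The one-hot label encoding described above can be reproduced with plain NumPy (a minimal sketch, independent of TensorFlow; the three example labels are made up for illustration):

```python
import numpy as np

# Class labels for three hypothetical images: the digits 7, 0 and 4.
labels = np.array([7, 0, 4])

# One-hot encode: row i of np.eye(10) is the encoding of digit i,
# so indexing with the label array yields a (num_images, 10) matrix.
one_hot = np.eye(10)[labels]

print(one_hot.shape)  # (3, 10)
print(one_hot[0])     # [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
```

Each row contains a single 1 at the position of its class, which is exactly the layout of `mnist.train.labels`.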

### Why MNIST?

LSTMs are generally used for complex sequence-related problems like language modelling, which involves NLP concepts such as word embeddings, encoders, etc. These topics themselves require a lot of understanding. It would be nice to set them aside so we can concentrate on the implementation details of LSTMs in TensorFlow, such as input formatting, LSTM cells and network design.

MNIST gives us such an opportunity. The input data here is just a set of pixel values. We can easily format these values and concentrate on the implementation details.
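Concretely, to feed an MNIST image to an LSTM, the flattened 784-pixel vector is typically reinterpreted as a sequence of 28 time steps with 28 pixels each (one image row per step). A sketch of that formatting, with a random batch standing in for real MNIST images:

```python
import numpy as np

batch_size, time_steps, num_inputs = 32, 28, 28

# A stand-in for a real MNIST batch: random flat 784-pixel vectors.
batch_x = np.random.rand(batch_size, 784)

# Reshape each flat image into 28 rows of 28 pixels,
# treating each row as one time step of the sequence.
batch_x = batch_x.reshape(batch_size, time_steps, num_inputs)

print(batch_x.shape)  # (32, 28, 28)
```

The resulting (batch, time, features) layout is the shape an LSTM expects for its input sequence.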

### Implementation

Before getting our hands dirty with code, let us first draw an outline of our implementation. This will make the coding part more intuitive.

### A vanilla RNN

A Recurrent Neural Network, when unrolled through time, can be visualised as follows.

Here,

1. `x_t` refers to the input at time step t.

2. `s_t` refers to the hidden state at time step t. It can be visualised as the "memory" of our network.

3. `o_t` refers to the output at time step t.

4. `U`, `V` and `W` are parameters that are shared across all time steps. The significance of this parameter sharing is that our model performs the same task at each time step, only with different inputs.

What we have achieved by unrolling the RNN is that, at each time step, the network can be visualised as a feed-forward network that also takes into account the output of the previous time step (signified by the connections between the time steps).
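The unrolled computation above can be sketched in plain NumPy. This is a minimal illustration of the standard vanilla-RNN recurrence (`s_t = tanh(U·x_t + W·s_{t-1})`, `o_t = V·s_t`), with made-up dimensions and random weights rather than trained parameters:

```python
import numpy as np

def rnn_forward(xs, U, V, W):
    """Unrolled forward pass of a vanilla RNN.

    xs: sequence of input vectors, shape (time_steps, input_dim).
    Returns the output o_t and hidden state s_t for every time step.
    """
    hidden_dim = W.shape[0]
    s = np.zeros(hidden_dim)      # initial hidden state s_0
    states, outputs = [], []
    for x in xs:                  # one loop iteration per time step
        # The new memory depends on the current input and the previous
        # state; U, V, W are identical at every step (parameter sharing).
        s = np.tanh(U @ x + W @ s)
        o = V @ s                 # raw output (logits) at this step
        states.append(s)
        outputs.append(o)
    return np.array(outputs), np.array(states)

# Toy dimensions: 5 time steps, 3 input features, 4 hidden units, 2 outputs.
rng = np.random.default_rng(0)
time_steps, input_dim, hidden_dim, output_dim = 5, 3, 4, 2
xs = rng.standard_normal((time_steps, input_dim))
U = rng.standard_normal((hidden_dim, input_dim))
W = rng.standard_normal((hidden_dim, hidden_dim))
V = rng.standard_normal((output_dim, hidden_dim))

outputs, states = rnn_forward(xs, U, V, W)
print(outputs.shape, states.shape)  # (5, 2) (5, 4)
```

Note how the same `U`, `V` and `W` are applied at every step, and how each state `s_t` feeds into the next iteration: that is exactly the "memory" connection between time steps described above.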

https://jasdeep06.github.io/posts/Understanding-LSTM-in-Tensorflow-MNIST/

