CS224d: Deep Learning for Natural Language Processing from Stanford
Course homepage. A complete survey of the field with videos, lecture slides, and sample student projects.
Course Lectures. Video playlist.
Course notes. Probably the best "book" on DL for NLP.
Neural Networks for NLP from Carnegie Mellon University
Neural Network Methods in Natural Language Processing by Yoav Goldberg (Graeme Hirst, series editor)
Deep Learning in Natural Language Processing by Li Deng and Yang Liu
Natural Language Processing in Action by Hobson Lane, Cole Howard, and Hannes Hapke
Deep Learning for Natural Language Processing (without Magic)
A Primer on Neural Network Models for Natural Language Processing
Deep Learning for Natural Language Processing: Theory and Practice (Tutorial)
Practical Neural Networks for NLP, a tutorial from EMNLP 2016 using the DyNet framework
Recurrent Neural Networks with Word Embeddings
LSTM Networks for Sentiment Analysis
TensorFlow demo using the Large Movie Review Dataset
LSTMVis: Visual Analysis for Recurrent Neural Networks
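The LSTM tutorials above all build on the same per-timestep cell update (input, forget, and output gates plus a candidate cell). A minimal NumPy sketch of one LSTM step; the shapes, initialization, and function names are illustrative assumptions, not taken from any of the tutorials:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x: input vector (D,); h_prev, c_prev: previous hidden/cell states (H,).
    W: (4H, D), U: (4H, H), b: (4H,) stack the parameters of all four gates.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:H])            # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    o = sigmoid(z[2 * H:3 * H])   # output gate
    g = np.tanh(z[3 * H:])        # candidate cell values
    c = f * c_prev + i * g        # new cell state
    h = o * np.tanh(c)            # new hidden state
    return h, c

# Toy dimensions and random parameters, purely for demonstration.
rng = np.random.default_rng(0)
D, H = 3, 4
h, c = lstm_step(rng.normal(size=D), np.zeros(H), np.zeros(H),
                 rng.normal(size=(4 * H, D)), rng.normal(size=(4 * H, H)),
                 np.zeros(4 * H))
print(h.shape, c.shape)
```

Because h is an output gate times tanh of the cell state, every component of h stays in (-1, 1) regardless of the inputs.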
Ali Ghodsi's lecture on word2vec part 1 and part 2
Richard Socher's talk on sentiment analysis, question answering, and sentence-image embeddings
Deep Learning, an interactive introduction for NLP-ers
Deep Natural Language Understanding
Deep Learning Summer School, Montreal 2016. Includes state-of-the-art language modeling.
Keras - The Python Deep Learning library. Emphasizes user friendliness, modularity, easy extensibility, and Pythonic design.
TensorFlow - A cross-platform, general-purpose Machine Intelligence library with Python and C++ APIs.
Gensim: Topic modeling for humans - A Python package that includes word2vec and doc2vec implementations.
DyNet - The Dynamic Neural Network Toolkit, designed to "work well with networks that have dynamic structures that change for every training instance".
Google’s original word2vec implementation
Deeplearning4j’s NLP framework - Java implementation.
deepnl - A Python library for NLP based on deep neural network architectures.
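Several of the toolkits above (Gensim, Google's word2vec, deepnl) learn embeddings from (center, context) word pairs drawn from a sliding window. A minimal sketch of how skip-gram extracts those pairs; the toy corpus and function name are my own illustration, not code from any of these libraries:

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) training pairs from a token list."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

corpus = "the cat sat on the mat".split()
pairs = skipgram_pairs(corpus, window=1)
print(pairs[:3])  # [('the', 'cat'), ('cat', 'the'), ('cat', 'sat')]
```

Real implementations feed these pairs to a shallow network (with negative sampling or hierarchical softmax) rather than materializing them all in memory.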
Deep or shallow, NLP is breaking out - General overview of how Deep Learning is impacting NLP.
Natural Language Processing from Research at Google - Not all Deep Learning (but mostly).
Distributed Representations of Words and Phrases and their Compositionality - The original word2vec paper.
word2vec Parameter Learning Explained
Distributed Representations of Sentences and Documents
Context Dependent Recurrent Neural Network Language Model
Translation Modeling with Bidirectional Recurrent Neural Networks
Contextual LSTM (CLSTM) models for Large scale NLP tasks
LSTM Neural Networks for Language Modeling
Exploring the Limits of Language Modeling
Conversational Contextual Cues - Models context and participants in conversations.
Sequence to sequence learning with neural networks
Efficient Estimation of Word Representations in Vector Space
Learning Character-level Representations for Part-of-Speech Tagging
Representation Learning for Text-level Discourse Parsing
Fast and Robust Neural Network Joint Models for Statistical Machine Translation
Parsing With Compositional Vector Grammars
Smart Reply: Automated Response Suggestion for Email
Neural Architectures for Named Entity Recognition - State-of-the-art NER performance from bidirectional LSTMs with a sequential conditional random field (CRF) layer, and from transition-based parsing with stack LSTMs.
GloVe: Global Vectors for Word Representation - A "count-based"/co-occurrence model to learn word embeddings.
Grammar as a Foreign Language - State-of-the-art syntactic constituency parsing using generic sequence-to-sequence approach.
Skip-Thought Vectors - "unsupervised learning of a generic, distributed sentence encoder" (Paper & Code)
the morning paper: The amazing power of word vectors - Overview of word vectors.
Deep Learning, NLP, and Representations
The Unreasonable Effectiveness of Recurrent Neural Networks
Machine Learning for Emoji Trends
Teaching Robots to Feel: Emoji & Deep Learning
Computational Linguistics and Deep Learning - Opinion piece on how Deep Learning fits into the broader picture of text processing.
Dataset from "One Billion Word Language Modeling Benchmark" - Almost 1B words, already pre-processed text.
word2vec analogy demo
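Analogy demos like the one above rest on vector arithmetic over embeddings: v(king) - v(man) + v(woman) lands near v(queen), with nearness measured by cosine similarity. A toy illustration with hand-made 3-d vectors; the numbers are fabricated purely to make the arithmetic visible, not real word2vec outputs:

```python
import math

# Hand-made toy embeddings: dimension 0 ~ "royalty", dimension 1 ~ "gender".
vecs = {
    "king":  [0.9, 0.9, 0.1],
    "queen": [0.9, -0.9, 0.1],
    "man":   [0.1, 0.9, 0.2],
    "woman": [0.1, -0.9, 0.2],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# king - man + woman, component-wise
target = [k - m + w for k, m, w in zip(vecs["king"], vecs["man"], vecs["woman"])]

# Nearest word by cosine similarity, excluding the three query words.
best = max((w for w in vecs if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, vecs[w]))
print(best)  # queen
```

Real demos do the same arithmetic over vocabularies of hundreds of thousands of learned vectors, where the nearest neighbor is genuinely informative rather than forced.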