Stanford CS224n 2019: Deep Learning for Natural Language Processing, Course Videos and Related Materials

March 16, 2019 · AINLP

Stanford's new 2019 edition of the CS224n deep learning NLP course (CS224n: Natural Language Processing with Deep Learning, Stanford / Winter 2019) started in January, but the lecture videos had not been released publicly until a few days ago, when the official channel posted the first five lectures on YouTube: CS224n: Natural Language Processing with Deep Learning | Winter 2019.

This is an NLP course worth studying for every NLP practitioner. It is taught by Christopher Manning, is aimed at Stanford students, and has been offered at Stanford for many years. The 2019 edition brings many updates; beyond the new content, the biggest change is probably that the course code has migrated from TensorFlow to PyTorch.

In recent years, as deep learning and artificial intelligence have become widely known, NLP, a jewel of the AI field, has also gained broad recognition, and many students have entered or switched into the field as a result. At the beginning of the first lecture video, Manning helps students find seats (apparently quite a few were still standing), joking that during his first decade teaching NLP at Stanford, an average offering drew only about 45 students.

The course's main goals are for students to: learn modern deep learning methods, especially those relevant to NLP; gain a big-picture understanding of human language and of why understanding and producing it is difficult; and be able to understand and implement in code (PyTorch) solutions to some of NLP's major problems and tasks, such as word meaning, dependency parsing, machine translation, and question answering.
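
As a taste of what the PyTorch side of the course looks like, here is a minimal sketch of my own (a made-up toy vocabulary and randomly initialized vectors, not the official assignment code) that looks up word vectors with nn.Embedding and compares two words by cosine similarity:

```python
# A minimal, illustrative PyTorch sketch: look up word vectors with
# nn.Embedding and compare two words by cosine similarity.
# The vocabulary and vectors here are toy stand-ins.
import torch
import torch.nn as nn

vocab = {"king": 0, "queen": 1, "banana": 2}
emb = nn.Embedding(num_embeddings=len(vocab), embedding_dim=50)  # random init

def similarity(w1: str, w2: str) -> float:
    v1 = emb(torch.tensor(vocab[w1]))
    v2 = emb(torch.tensor(vocab[w2]))
    return torch.cosine_similarity(v1, v2, dim=0).item()

print(similarity("king", "queen"))  # meaningless until the vectors are trained
```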

As for the course videos, only the first five lectures have been released officially so far. I downloaded a copy and put it on Baidu Netdisk; interested readers can follow AINLP and reply "cs224n" to get it. The copy will be updated continuously until the full set is available; you are welcome to follow along.


Below are links to the related slides and other reading materials, which can be downloaded directly from the course website:

http://web.stanford.edu/class/cs224n/index.html

Schedule (each entry lists the date, lecture topic, course materials, and any events or deadlines):
Tue Jan 8 Introduction and Word Vectors
[slides] [notes]
Gensim word vectors example:
[zip] [preview] (a minimal usage sketch follows at the end of this entry)
Suggested Readings:


  1. Word2Vec Tutorial - The Skip-Gram Model

  2. Efficient Estimation of Word Representations in Vector Space (original word2vec paper)

  3. Distributed Representations of Words and Phrases and their Compositionality (negative sampling paper)

Assignment 1 out
[zip] [preview]
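
For a rough idea of what the Gensim word vectors example above involves, here is a minimal sketch; note that "glove-wiki-gigaword-100" is one of Gensim's stock downloadable models and an assumption on my part, not necessarily the vectors the class notebook loads from the zip:

```python
# Minimal sketch of exploring pretrained word vectors with Gensim.
# "glove-wiki-gigaword-100" is a stock gensim-downloader model (assumed here);
# the actual class notebook may load different vectors.
import gensim.downloader as api

wv = api.load("glove-wiki-gigaword-100")           # returns KeyedVectors
print(wv.most_similar("king", topn=5))             # nearest neighbors
print(wv.most_similar(positive=["king", "woman"],  # classic analogy:
                      negative=["man"], topn=1))   # king - man + woman ~ queen
```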

Thu Jan 10 Word Vectors 2 and Word Senses
[slides] [notes]
Suggested Readings:


  1. GloVe: Global Vectors for Word Representation (original GloVe paper)

  2. Improving Distributional Similarity with Lessons Learned from Word Embeddings

  3. Evaluation methods for unsupervised word embeddings

Additional Readings:

  1. A Latent Variable Model Approach to PMI-based Word Embeddings

  2. Linear Algebraic Structure of Word Senses, with Applications to Polysemy

  3. On the Dimensionality of Word Embedding



Fri Jan 11 Python review session
[slides]
1:30 - 2:50pm
Skilling Auditorium [map]


Tue Jan 15 Word Window Classification, Neural Networks, and Matrix Calculus
[slides] [matrix calculus notes]
[notes (lectures 3 and 4)]
Suggested Readings:


  1. CS231n notes on backprop

  2. Review of differential calculus

Additional Readings:

  1. Natural Language Processing (Almost) from Scratch

Assignment 2 out
[zip] [handout]
Assignment 1 due
Thu Jan 17 Backpropagation and Computation Graphs
[slides]
[notes (lectures 3 and 4)]
Suggested Readings:


  1. CS231n notes on network architectures

  2. Learning Representations by Backpropagating Errors

  3. Derivatives, Backpropagation, and Vectorization

  4. Yes you should understand backprop
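
The backprop readings above pair nicely with PyTorch's autograd. Here is a minimal sketch of my own (toy values, not assignment code) that builds a tiny computation graph and lets backpropagation fill in the gradients:

```python
# Minimal PyTorch autograd sketch: a tiny computation graph and its gradients.
import torch

x = torch.tensor(2.0, requires_grad=True)
w = torch.tensor(3.0, requires_grad=True)
b = torch.tensor(1.0, requires_grad=True)

y = torch.tanh(w * x + b)   # forward pass builds the graph
y.backward()                # reverse-mode autodiff (backprop)

# e.g. dy/dw = (1 - tanh^2(w*x + b)) * x
print(x.grad, w.grad, b.grad)
```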



Tue Jan 22 Linguistic Structure: Dependency Parsing
[slides]
[scrawled-on slides]
[notes]
Suggested Readings:


  1. Incrementality in Deterministic Dependency Parsing

  2. A Fast and Accurate Dependency Parser using Neural Networks

  3. Dependency Parsing

  4. Globally Normalized Transition-Based Neural Networks

  5. Universal Stanford Dependencies: A cross-linguistic typology

  6. Universal Dependencies website

Assignment 3 out
[zip] [handout]
Assignment 2 due
Thu Jan 24 The probability of a sentence? Recurrent Neural Networks and Language Models
[slides]
[notes (lectures 6 and 7)]
Suggested Readings:


  1. N-gram Language Models (textbook chapter)

  2. The Unreasonable Effectiveness of Recurrent Neural Networks (blog post overview)

  3. Sequence Modeling: Recurrent and Recursive Neural Nets (Sections 10.1 and 10.2)

  4. On Chomsky and the Two Cultures of Statistical Learning
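
For a concrete picture of the RNN language models this lecture covers, here is a minimal, untrained sketch in PyTorch (toy sizes, my own illustration, not course code): embed tokens, run an RNN, and project each hidden state to a distribution over the next word.

```python
# Minimal RNN language model sketch (untrained, toy sizes).
import torch
import torch.nn as nn

vocab_size, emb_dim, hidden_dim = 1000, 64, 128

embed = nn.Embedding(vocab_size, emb_dim)
rnn = nn.RNN(emb_dim, hidden_dim, batch_first=True)
proj = nn.Linear(hidden_dim, vocab_size)

tokens = torch.randint(0, vocab_size, (1, 5))      # one 5-word "sentence"
hidden_states, _ = rnn(embed(tokens))              # (1, 5, hidden_dim)
logits = proj(hidden_states)                       # (1, 5, vocab_size)
next_word_probs = torch.softmax(logits[:, -1], dim=-1)  # P(next word | prefix)
```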



Tue Jan 29 Vanishing Gradients and Fancy RNNs
[slides] [notes (lectures 6 and 7)]
Suggested Readings:


  1. Sequence Modeling: Recurrent and Recursive Neural Nets (Sections 10.3, 10.5, 10.7-10.12)

  2. Learning long-term dependencies with gradient descent is difficult (one of the original vanishing gradient papers)

  3. On the difficulty of training Recurrent Neural Networks (proof of vanishing gradient problem)

  4. Vanishing Gradients Jupyter Notebook (demo for feedforward networks)

  5. Understanding LSTM Networks (blog post overview)

Assignment 4 out
[zip] [handout] [Azure Guide] [Practical Guide to VMs]
Assignment 3 due
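
The vanishing-gradient effect from the Jan 29 lecture is easy to reproduce in a few lines. A toy demo of my own (unrelated to the Jupyter notebook linked in reading 4 above): push a signal through many tanh layers and watch the gradient at the input shrink.

```python
# Toy vanishing-gradient demo: repeated tanh squashing shrinks gradients.
import torch

depth = 50
x = torch.randn(10, requires_grad=True)
h = x
for _ in range(depth):
    h = torch.tanh(h * 0.9)   # each layer multiplies gradients by < 1

h.sum().backward()
print(x.grad.norm())          # typically tiny for large depth
```
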
Thu Jan 31 Machine Translation, Seq2Seq and Attention
[slides] [notes]
Suggested Readings:


  1. Statistical Machine Translation slides, CS224n 2015 (lectures 2/3/4)

  2. Statistical Machine Translation (book by Philipp Koehn)

  3. BLEU (original paper)

  4. Sequence to Sequence Learning with Neural Networks (original seq2seq NMT paper)

  5. Sequence Transduction with Recurrent Neural Networks (early seq2seq speech recognition paper)

  6. Neural Machine Translation by Jointly Learning to Align and Translate (original seq2seq+attention paper)

  7. Attention and Augmented Recurrent Neural Networks (blog post overview)

  8. Massive Exploration of Neural Machine Translation Architectures (practical advice for hyperparameter choices)
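
To make the attention readings concrete, here is a minimal sketch of scaled dot-product attention with toy tensors (my own illustration; the Bahdanau et al. paper in reading 6 uses additive attention, but the softmax-weighted averaging at the core is the same idea):

```python
# Minimal scaled dot-product attention over a batch of encoder states.
import torch
import torch.nn.functional as F

d = 64
query = torch.randn(1, 1, d)    # one decoder state
keys = torch.randn(1, 10, d)    # ten encoder states
values = keys                   # simplest case: keys double as values

scores = query @ keys.transpose(1, 2) / d ** 0.5   # (1, 1, 10)
weights = F.softmax(scores, dim=-1)                # attention distribution
context = weights @ values                         # (1, 1, d) weighted summary
```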



Tue Feb 5 Practical Tips for Final Projects
[slides] [notes]
Suggested Readings:


  1. Practical Methodology (Deep Learning book chapter)



Thu Feb 7 Question Answering and the Default Final Project
[slides]

Project Proposal out
[instructions]


Default Final Project out [handout] [github repo]

Assignment 4 due
Tue Feb 12 ConvNets for NLP
[slides]
Suggested Readings:


  1. Convolutional Neural Networks for Sentence Classification

  2. A Convolutional Neural Network for Modelling Sentences
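
As a concrete companion to reading 1 (Kim 2014-style sentence classification), here is a minimal PyTorch sketch of my own (toy sizes, not course code): convolve over word embeddings, max-pool over time, then classify.

```python
# Minimal CNN-for-sentence-classification sketch (in the spirit of Kim 2014).
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, emb_dim, n_filters, n_classes = 1000, 50, 100, 2

embed = nn.Embedding(vocab_size, emb_dim)
conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
clf = nn.Linear(n_filters, n_classes)

tokens = torch.randint(0, vocab_size, (1, 8))   # one 8-word sentence
x = embed(tokens).transpose(1, 2)               # (1, emb_dim, seq_len) for Conv1d
features = F.relu(conv(x)).max(dim=2).values    # max-pool over time
logits = clf(features)                          # (1, n_classes)
```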



Thu Feb 14 Information from parts of words: Subword Models
[slides]

Assignment 5 out
[zip (requires Stanford login)] [handout]
Project Proposal due
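
Byte Pair Encoding (BPE) is one of the core subword techniques covered in the Feb 14 lecture. Here is a minimal sketch of the BPE merge loop on a toy word-frequency corpus (my own illustration): repeatedly merge the most frequent adjacent symbol pair.

```python
# Minimal Byte Pair Encoding (BPE) sketch on a toy corpus.
from collections import Counter

# Words as tuples of symbols (with an end-of-word marker) and their counts.
corpus = {("l", "o", "w", "</w>"): 5,
          ("l", "o", "w", "e", "r", "</w>"): 2,
          ("n", "e", "w", "e", "s", "t", "</w>"): 6}

def merge_step(corpus):
    pairs = Counter()
    for word, freq in corpus.items():
        for pair in zip(word, word[1:]):
            pairs[pair] += freq
    best = max(pairs, key=pairs.get)   # most frequent adjacent pair
    merged = {}
    for word, freq in corpus.items():
        out, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                out.append(word[i] + word[i + 1])  # merge the pair
                i += 2
            else:
                out.append(word[i])
                i += 1
        merged[tuple(out)] = freq
    return merged, best

for _ in range(3):                     # learn three merges
    corpus, pair = merge_step(corpus)
    print("merged:", pair)
```
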
Tue Feb 19 Modeling contexts of use: Contextual Representations and Pretraining
[slides]
Suggested readings:


  1. Smith, Noah A. Contextual Word Representations: A Contextual Introduction. (Published just in time for this lecture!)



Thu Feb 21 Transformers and Self-Attention For Generative Models
(guest lecture by Ashish Vaswani and Anna Huang)
[slides]
Suggested readings:


  1. Attention is all you need

  2. Image Transformer

  3. Music Transformer: Generating music with long-term structure



Fri Feb 22

Project Milestone out
[instructions]
Assignment 5 due
Tue Feb 26 Natural Language Generation
[slides]



Thu Feb 28 Reference in Language and Coreference Resolution
[slides]



Tue Mar 5 Multitask Learning: A general model for NLP? (guest lecture by Richard Socher)
[slides]


Project Milestone due
Thu Mar 7 Constituency Parsing and Tree Recursive Neural Networks
[slides]
Suggested Readings:


  1. Parsing with Compositional Vector Grammars

  2. Constituency Parsing with a Self-Attentive Encoder



Tue Mar 12 Safety, Bias, and Fairness (guest lecture by Margaret Mitchell)
[slides]



Thu Mar 14 Future of NLP + Deep Learning
[slides]



Sun Mar 17


Final Project Report due [instructions]
Wed Mar 20 Final project poster session
[details]
5:15 - 8:30pm
McCaw Hall at the Alumni Center [map]

Project Poster/Video due [instructions]

Click "Read the original" to go straight to the source link, which makes downloading easier.

