Here's a recommended new resource for learning NLP: the Natural Language Processing course from the University of San Francisco.
GitHub link:
https://github.com/fastai/course-nlp
This course is being taught in the University of San Francisco's Master of Science in Data Science program, summer 2019. The course is taught in Python with Jupyter Notebooks, using libraries such as sklearn, nltk, pytorch, and fastai.
The following topics will be covered:
1. What is NLP?
A changing field
Resources
Tools
Python libraries
Example applications
Ethics issues
2. Topic Modeling with NMF and SVD
Stop words, stemming, & lemmatization
Term-document matrix
Term Frequency-Inverse Document Frequency (TF-IDF)
Singular Value Decomposition (SVD)
Non-negative Matrix Factorization (NMF)
Truncated SVD, Randomized SVD
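To make the topic-2 pipeline concrete, here is a minimal sketch of TF-IDF plus NMF and truncated/randomized SVD using sklearn, which the course relies on. The tiny corpus, the number of topics, and the parameter choices are illustrative placeholders, not the course's own notebooks.

```python
# Minimal topic-modeling sketch: TF-IDF term-document matrix, then NMF and
# truncated (randomized) SVD. Corpus and n_components are toy placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF, TruncatedSVD

docs = [
    "the cat sat on the mat",
    "dogs and cats are popular pets",
    "stock markets fell sharply today",
    "investors worry about market volatility",
]

# Term-document matrix weighted by TF-IDF, with English stop words removed.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)          # sparse matrix, shape (n_docs, n_terms)

# Non-negative Matrix Factorization: X is approximated by W @ H; topics are rows of H.
nmf = NMF(n_components=2, random_state=0)
W = nmf.fit_transform(X)

# Truncated SVD with the randomized solver is the LSA-style alternative.
svd = TruncatedSVD(n_components=2, algorithm="randomized", random_state=0)
U = svd.fit_transform(X)

terms = vectorizer.get_feature_names_out()  # sklearn >= 1.0; older: get_feature_names()
for i, topic in enumerate(nmf.components_):
    top = topic.argsort()[-3:][::-1]        # indices of the 3 strongest terms
    print(f"NMF topic {i}:", [terms[j] for j in top])
```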
3. Sentiment classification with Naive Bayes, Logistic regression, and ngrams
Sparse matrix storage
Counters
the fastai library
Naive Bayes
Logistic regression
Ngrams
Logistic regression with Naive Bayes features, with trigrams
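As a rough illustration of the classifiers in topic 3, here is a small sklearn-only sketch of bag-of-ngrams sentiment classification with Naive Bayes and logistic regression. The fastai pieces and the Naive Bayes-features-plus-trigrams variant are omitted, and the labelled sentences are toy placeholders rather than course data.

```python
# Bag-of-ngrams sentiment classification with Naive Bayes and logistic regression.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression

texts = [
    "I loved this movie, wonderful acting",
    "great film, would watch again",
    "terrible plot and awful dialogue",
    "boring, a complete waste of time",
]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

# Sparse term-document matrix over unigrams and bigrams (ngrams).
vectorizer = CountVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(texts)     # stored as a scipy sparse matrix

nb = MultinomialNB().fit(X, labels)
lr = LogisticRegression(max_iter=1000).fit(X, labels)

test = vectorizer.transform(["what a wonderful film"])
print("Naive Bayes:", nb.predict(test))
print("Logistic regression:", lr.predict(test))
```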
4. Regex (and re-visiting tokenization)
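For topic 4, a minimal example of regex-driven tokenization; the pattern below is one simple illustrative choice, not the tokenizer the course notebooks actually use.

```python
# Tokenize by matching word characters (keeping contractions) or single
# punctuation marks, after lowercasing.
import re

TOKEN_RE = re.compile(r"\w+(?:'\w+)?|[^\w\s]")

def tokenize(text):
    """Split text into word tokens and punctuation tokens."""
    return TOKEN_RE.findall(text.lower())

print(tokenize("Don't over-think it: regexes re-visit tokenization!"))
# ["don't", 'over', '-', 'think', 'it', ':', 'regexes', 're', '-', 'visit', 'tokenization', '!']
```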
5. Language modeling & sentiment classification with deep learning
Language model
Transfer learning
Sentiment classification
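Topic 5 pairs a language model with transfer learning for sentiment classification, in the spirit of ULMFiT. Below is a toy, library-agnostic PyTorch sketch of that idea: a language-model head and a sentiment head share one encoder, so weights pretrained on next-word prediction can be reused for classification. All sizes and tensors are placeholders; the course itself works with fastai's pretrained models rather than this tiny network.

```python
# Toy sketch of the transfer-learning idea: pretrain an encoder as a language
# model, then reuse it under a classification head.
import torch
import torch.nn as nn

VOCAB, EMB, HID, CLASSES = 1000, 64, 128, 2

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.LSTM(EMB, HID, batch_first=True)

    def forward(self, x):                  # x: (batch, seq_len) token ids
        out, _ = self.rnn(self.emb(x))
        return out                         # (batch, seq_len, HID)

class LanguageModel(nn.Module):
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(HID, VOCAB)  # predicts the next token at each step

    def forward(self, x):
        return self.head(self.encoder(x))

class SentimentClassifier(nn.Module):
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder             # the transferred, pretrained encoder
        self.head = nn.Linear(HID, CLASSES)

    def forward(self, x):
        return self.head(self.encoder(x)[:, -1])  # classify from the last state

encoder = Encoder()
lm = LanguageModel(encoder)                # step 1: pretrain on unlabeled text
clf = SentimentClassifier(encoder)         # step 2: fine-tune on labeled reviews
tokens = torch.randint(0, VOCAB, (4, 12))
print(lm(tokens).shape, clf(tokens).shape)  # (4, 12, 1000) and (4, 2)
```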
6. Translation with RNNs
Review of embeddings
BLEU metric
Teacher Forcing
Bidirectional
Attention
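To illustrate teacher forcing from the list above: during training, the decoder is fed the ground-truth previous target token rather than its own prediction. Here is a toy PyTorch decoder loop showing only that idea; the encoder, attention, and bidirectionality are omitted, and all tensors are random placeholders.

```python
# Teacher forcing in an RNN decoder: feed the gold previous token at each step.
import torch
import torch.nn as nn

VOCAB_TGT, EMB, HID = 500, 32, 64
batch, tgt_len = 4, 7

emb = nn.Embedding(VOCAB_TGT, EMB)
decoder = nn.GRUCell(EMB, HID)
out_proj = nn.Linear(HID, VOCAB_TGT)
loss_fn = nn.CrossEntropyLoss()

# Stand-ins for the encoder summary and a gold translation from a parallel corpus.
hidden = torch.zeros(batch, HID)
target = torch.randint(0, VOCAB_TGT, (batch, tgt_len))

loss = 0.0
inp = target[:, 0]                    # start-of-sentence tokens
for t in range(1, tgt_len):
    hidden = decoder(emb(inp), hidden)
    logits = out_proj(hidden)
    loss = loss + loss_fn(logits, target[:, t])
    inp = target[:, t]                # teacher forcing: feed the gold token
print("total loss:", float(loss))
```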
7. Translation with the Transformer architecture
Transformer Model
Multi-head attention
Masking
Label smoothing
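To illustrate the masking step inside the Transformer's attention, here is a toy PyTorch sketch of scaled dot-product attention with a causal mask. A full multi-head layer would split the model dimension across several heads and add learned projections; the tensor sizes here are placeholders.

```python
# Scaled dot-product attention with a causal mask: position i may only attend
# to positions <= i.
import math
import torch

batch, seq_len, d_k = 2, 5, 16
q = torch.randn(batch, seq_len, d_k)
k = torch.randn(batch, seq_len, d_k)
v = torch.randn(batch, seq_len, d_k)

mask = torch.tril(torch.ones(seq_len, seq_len)).bool()   # lower-triangular mask

scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)        # (batch, seq, seq)
scores = scores.masked_fill(~mask, float("-inf"))        # masking future positions
weights = torch.softmax(scores, dim=-1)
output = weights @ v                                     # (batch, seq, d_k)
print(output.shape)
```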
8. Bias & ethics in NLP
Bias in word embeddings
Types of bias
The attention economy
Drowning in fraudulent/fake info
This course is structured with a top-down teaching method, which is different from how most math courses operate. Typically, in a bottom-up approach, you first learn all the separate components you will be using, and then you gradually build them up into more complex structures. The problems with this are that students often lose motivation, don't have a sense of the "big picture", and don't know what they'll need.
Harvard Professor David Perkins has a book, Making Learning Whole, in which he uses baseball as an analogy. We don't require kids to memorize all the rules of baseball and understand all the technical details before we let them play the game. Rather, they start playing with just a general sense of it, and then gradually learn more rules/details as time goes on.
If you took the fast.ai deep learning course, that is the approach we used. You can hear more about my teaching philosophy in this blog post, or in this talk I gave at the San Francisco Machine Learning meetup.
All that to say, don't worry if you don't understand everything at first! You're not supposed to. We will start by using some "black boxes", and then we'll dig into the lower-level details later.
To start, focus on what things DO, not what they ARE.