Deep Learning for Natural Language Processing
Lecturers: Fahim Dalvi and Hassan Sajjad
In this lecture series, we cover the basics of machine learning, neural networks, and deep neural networks. We look at several deep neural network architectures from the perspective of applying them to various tasks, such as classification, sequence prediction, and generation. Every lecture is accompanied by practice problems implemented in Keras, a popular Python framework for deep learning.
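To give a flavor of the style of the practice problems, here is a minimal, hedged sketch (not taken from the course notebooks) of a small Keras classifier trained on hypothetical toy data; the actual exercises use the datasets and model definitions provided in the notebooks below.

```python
# Minimal illustrative sketch of a Keras feed-forward classifier.
# The toy two-blob dataset below is a hypothetical stand-in, not course data.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, size=(100, 2)),
               rng.normal(+1.0, 0.5, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)

# A small two-layer network with a sigmoid output for binary classification.
model = keras.Sequential([
    keras.Input(shape=(2,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=20, batch_size=16, verbose=0)
print(model.evaluate(X, y, verbose=0))
```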
Background reading
- Python Numpy Tutorial
- IPython Tutorial
- Linear Algebra for Machine Learning
Slides
- Lecture 0 - Introduction & Roadmap slides
- Lecture 1 - Introduction to Machine Learning slides
  - Practical session slides
  - Learning Rate and Optimization Demo Jupyter Notebook
  - Introduction to Jupyter Notebooks Jupyter Notebook
  - Introduction to Python and Numpy Jupyter Notebook
  - Linear Classification by Regression Jupyter Notebook
  - Binary Linear Classification Jupyter Notebook
  - Linear Classification on Spiral Data Jupyter Notebook
  - Supplementary Material slides
- Lecture 2 - Neural Networks slides
  - Practical session slides
  - Neural Networks on Spiral Data Jupyter Notebook
  - Data (.zip)
  - Sentiment Classification using Neural Networks Jupyter Notebook
  - Neural Network Language Model Jupyter Notebook
- Lecture 3 - Recurrent Neural Networks slides
  - Practical Session Jupyter Notebook
  - Recurrent Neural Networks Jupyter Notebook
  - Hybrid Model Jupyter Notebook
- Lecture 4 - Sequence to Sequence Models and Practical Considerations
  - Sequence to Sequence Models slides
  - Practical Considerations slides
  - Per Timestep Prediction Jupyter Notebook
  - Pretrained Embeddings Jupyter Notebook
  - Imbalanced Classes Jupyter Notebook
- Lecture 5 - Advanced Topics (CNNs, Multitask Learning, GANs, RL, etc.) slides