# Course Schedule
## Bahman & Esfand 1400
Session | Date | Topic | Notes |
---|---|---|---|
1 | 23 Bahman | Introduction | [slides] |
2 | 25 Bahman | Semantic representation | [slides] (classes 2-5) |
3 | 30 Bahman | Word embeddings (Word2vec) | |
4 | 2 Esfand | Word embeddings (Evaluation, cross-lingual space, ambiguity and sense embeddings, sub-word embeddings, retrofitting, bias) | |
5 | 7 Esfand | Language modeling (n-gram, probability computation, feedforward NN for LM) | Deep Learning Quiz |
6 | 9 Esfand | PyTorch tutorial | HW#1 [class notebook] |
7 | 14 Esfand | Language modeling with RNNs (backprop through time, text generation, perplexity) | [slides] (classes 7-9) |
8 | 16 Esfand | Vanishing gradients and fancy RNNs (LSTMs, bidirectional and stacked RNNs) | |
9 | 21 Esfand | Attention mechanism (seq2seq attention, attention variants) | |
10 | 23 Esfand | Transformers (self-attention, multi-head, positional encoding) | [slides] (classes 10-11) |
## Farvardin 1401
Session | Date | Topic | Notes |
---|---|---|---|
11 | 15 Farvardin | More about Transformers (contextualised embeddings, MLM, BERT, and pretrain/finetune) | |
12 | 20 Farvardin | Transformers: derivatives of BERT and architecture types (subwords and tokenization, decoders, encoders, and encoder-decoders) | [slides] |
13 | 22 Farvardin | Midterm exam | |
14 | 27 Farvardin | PyTorch tutorial | [class notebook] |
15 | 29 Farvardin | PyTorch tutorial | Project Proposal [class notebook] |
## Ordibehesht 1401
Session | Date | Topic | Notes |
---|---|---|---|
16 | 5 Ordibehesht | Multilingual Learning (Data balancing) | [slides] (classes 16 & 19) |
17 | 10 Ordibehesht | Project Proposal | |
18 | 17 Ordibehesht | *Isotropicity of Semantic Spaces (Sara Rajaee) | HW#3 [slides] |
19 | 19 Ordibehesht | Multilingual Learning (Adapters and MAD-X, TLM, cross-lingual transfer learning, zero-shot, and active learning) | HW#2 |
20 | 24 Ordibehesht | Question Answering (reading comprehension, SQuAD, LSTM-based and BERT models, BiDAF) | [slides] |
21 | 26 Ordibehesht | Progress Report I | |
22 | 31 Ordibehesht | *Few-shot, Zero-shot, and Prompt-based learning (Mohsen Tabasi) | [slides] |
## Khordad 1401
Session | Date | Topic | Notes |
---|---|---|---|
23 | 2 Khordad | *Ethical Considerations and Bias in NLP (Mahdi Zakizadeh & Kaveh Eskanadari) | [slides] |
24 | 7 Khordad | Skipped | |
25 | 9 Khordad | *Interpretability (Ali Modaressi & Hosein Mohebbi) | HW#4 [slides] |
26 | 16 Khordad | Neural Language Generation (applications, maximum likelihood training, teacher forcing, greedy and random sampling, top-k and nucleus sampling, unlikelihood training, exposure bias, evaluating NLG, bias and ethical concerns) | [slides] |
27 | 21 Khordad | Progress Report II | |