Seminar on Math of NLP

multigrid.org
#NLP #seminar

Stage 1

  1. Introduction to NLP, Benyou Wang (CUHK-Shenzhen)
  2. Word Vectors and Word Window Classification
  3. Dependency Parsing
  4. Recurrent Neural Networks and Language Models
  5. Vanishing Gradients, Fancy RNNs, Seq2Seq
  6. Machine Translation, Attention, Subword Models
  7. Transformers
  8. More about Transformers and Pretraining
  9. Pretrained Models: GPT, Llama, …

Stage 2

  1. Natural Language Generation
  2. Integrating Knowledge into Language Models
  3. Bias, Toxicity, and Fairness
  4. Retrieval-Augmented Models + Knowledge
  5. ConvNets, Tree Recursive Neural Networks, and Constituency Parsing
  6. Scaling Laws for Large Models
  7. Editing Neural Networks