Reading recommendations for RNNs

  1. These lecture notes at https://github.com/CompPhysics/MachineLearning/blob/master/doc/pub/week47/ipynb/week47.ipynb
  2. See also lecture notes from week 46 at https://github.com/CompPhysics/MachineLearning/blob/master/doc/pub/week46/ipynb/week46.ipynb. The lecture on Monday begins with a recap of recurrent neural networks. The second lecture starts with the basics of autoencoders.
  3. For RNNs, see Goodfellow et al., chapter 10, at https://www.deeplearningbook.org/contents/rnn.html.
  4. Reading suggestions for implementing RNNs in PyTorch: see Raschka et al.'s chapter 15 and the accompanying GitHub site at https://github.com/rasbt/machine-learning-book/tree/main/ch15.
  5. RNN video at https://youtu.be/PCgrgHgy26c?feature=shared
  6. For the new xLSTM, see Beck et al., https://arxiv.org/abs/2405.04517. Exponential gating and modified memory structures allow xLSTM to compete favorably with state-of-the-art Transformers and State Space Models, in both performance and scaling.
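Before turning to the PyTorch material above, the core recurrence developed in chapter 10 of Goodfellow et al., h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h), can be sketched in a few lines of NumPy. This is a minimal illustration only; the variable names and dimensions are made up for the example, not taken from any of the references.

```python
import numpy as np

# Minimal sketch of a vanilla RNN forward pass (cf. Goodfellow et al., ch. 10):
# h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h), with weights shared across time steps.
# All sizes and names here are illustrative assumptions.

rng = np.random.default_rng(0)
n_input, n_hidden, n_steps = 3, 5, 4

W_xh = 0.1 * rng.standard_normal((n_hidden, n_input))   # input-to-hidden weights
W_hh = 0.1 * rng.standard_normal((n_hidden, n_hidden))  # recurrent hidden-to-hidden weights
b_h = np.zeros(n_hidden)                                # hidden bias

x_seq = rng.standard_normal((n_steps, n_input))  # toy input sequence
h = np.zeros(n_hidden)                           # initial hidden state h_0

hidden_states = []
for x_t in x_seq:
    # Same weights applied at every time step: this sharing is what makes it recurrent.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    hidden_states.append(h)

hidden_states = np.stack(hidden_states)  # shape (n_steps, n_hidden)
```

The PyTorch `torch.nn.RNN` module discussed in Raschka et al.'s chapter 15 implements essentially this loop (plus batching and backpropagation through time), so the sketch may help in mapping the book's equations onto the library's tensors.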