Implementation of Using Fast Weights to Attend to the Recent Past

Using Fast Weights to Attend to the Recent Past
Jimmy Ba, Geoffrey Hinton, Volodymyr Mnih, Joel Z. Leibo, Catalin Ionescu. NIPS 2016, https://arxiv.org/abs/1610.06258

Physiological Motivations: How do we store memories? We don't store memories by keeping track of the exact neural activity that occurred at the time of the memory. Instead, we try to recreate …
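The core mechanism in the paper is a "fast weight" matrix updated as a decaying sum of outer products of recent hidden states, A(t) = λ·A(t−1) + η·h(t)h(t)ᵀ, which lets the network attend to its recent past. A minimal NumPy sketch of that update rule (sizes, variable names, and the decay/learning-rate values here are illustrative, not taken from the post):

```python
import numpy as np

rng = np.random.default_rng(0)
hidden = 8

# Fast weight matrix A starts at zero.
A = np.zeros((hidden, hidden))
lam, eta = 0.95, 0.5  # decay rate and fast learning rate (illustrative values)

def fast_weight_step(h):
    """One fast-weight update: A <- lam * A + eta * h h^T (outer product)."""
    global A
    A = lam * A + eta * np.outer(h, h)
    return A

h = rng.standard_normal(hidden)
fast_weight_step(h)

# Multiplying by A retrieves a mixture of recent hidden states, weighted by
# their similarity to the query h -- attention to the recent past.
retrieved = A @ h
```

In the full model this retrieval runs inside an inner loop at each RNN time step, with layer normalization applied to the result; the sketch above only shows the memory update and read-out.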

Recurrent Neural Network (RNN) – Part 4: Attentional Interfaces

In this post, we will cover the encoder-decoder architecture with attention for seq-to-seq tasks, loosely following the implementation from the paper, which I have simplified here. First we will look at the entire model and discuss some interesting parts, then we will break down attention, and then we will start …