# Gated recurrent unit

The gated recurrent unit (GRU) is a variant of the [[Long short-term memory]] unit with fewer parameters and no output gate.

![Gated recurrent unit](https://i.imgur.com/jbniHJP.png)

- $x_t$ is the input vector
- $z_t$ is the update gate vector
- $r_t$ is the reset gate vector
- $\tilde{h}_t$ is the candidate activation vector
- $h_t$ is the output vector

For the fully gated unit, these vectors are computed as

$$
\begin{aligned}
z_t &= \sigma\left(W_z x_t + U_z h_{t-1} + b_z\right) \\
r_t &= \sigma\left(W_r x_t + U_r h_{t-1} + b_r\right) \\
\tilde{h}_t &= \tanh\left(W_h x_t + U_h \left(r_t \odot h_{t-1}\right) + b_h\right) \\
h_t &= \left(1 - z_t\right) \odot h_{t-1} + z_t \odot \tilde{h}_t
\end{aligned}
$$

where $\sigma$ is the logistic sigmoid and $\odot$ is element-wise multiplication. The reset gate $r_t$ controls how much of the previous state is exposed when forming the candidate $\tilde{h}_t$, and the update gate $z_t$ interpolates between the previous state and the candidate. A small NumPy sketch of one step appears after the references.

---

## 📚 References

- Cho, Kyunghyun, et al. ["Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation."](https://arxiv.org/abs/1406.1078v3) arXiv, 3 June 2014.
- ["Understanding LSTM Networks."](https://colah.github.io/posts/2015-08-Understanding-LSTMs) colah's blog, 27 Aug. 2015.
- ["Gated recurrent unit."](https://en.wikipedia.org/wiki/Gated_recurrent_unit) Wikipedia, 21 Sept. 2021.
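
To make the equations concrete, here is a minimal NumPy sketch of a single GRU step. The function name `gru_step`, the weight shapes, and the toy initialization are illustrative assumptions, not taken from the sources above; real implementations typically fuse the gate matrices for speed.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, params):
    """One GRU step following the equations above (hypothetical helper).

    x_t:    input vector, shape (input_dim,)
    h_prev: previous hidden state, shape (hidden_dim,)
    params: dict of weights W_*, U_* and biases b_*
    """
    # Update gate: how much of the candidate replaces the old state.
    z_t = sigmoid(params["W_z"] @ x_t + params["U_z"] @ h_prev + params["b_z"])
    # Reset gate: how much of the old state feeds the candidate.
    r_t = sigmoid(params["W_r"] @ x_t + params["U_r"] @ h_prev + params["b_r"])
    # Candidate activation, with the reset gate applied to the old state.
    h_tilde = np.tanh(params["W_h"] @ x_t + params["U_h"] @ (r_t * h_prev) + params["b_h"])
    # Interpolate between the previous state and the candidate.
    return (1.0 - z_t) * h_prev + z_t * h_tilde

# Toy usage: random weights, one step over a random input.
rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 3
params = {}
for gate in ("z", "r", "h"):
    params[f"W_{gate}"] = rng.normal(size=(hidden_dim, input_dim))
    params[f"U_{gate}"] = rng.normal(size=(hidden_dim, hidden_dim))
    params[f"b_{gate}"] = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)
x = rng.normal(size=input_dim)
print(gru_step(x, h, params))
```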