A recurrent neural network (RNN) is a class of artificial neural network that can use information from arbitrarily long sequences. It represents history through neurons with recurrent connections and can learn to compress an unbounded history into a low-dimensional space. This contrasts with the traditional feedforward network, in which all inputs and outputs are independent of each other.
Recurrent networks can form a short-term memory, which lets them handle position invariance in a way that feedforward networks cannot. RNNs are called recurrent because they perform the same task for every element of a sequence, with each output depending on the previous computations.
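The idea of performing the same task at every step while a hidden state carries the previous computations can be sketched as a minimal vanilla RNN cell. This is an illustrative sketch, not a trained model: the layer sizes and random weights are hypothetical, and a real network would learn the weights by backpropagation through time.

```python
import numpy as np

# Hypothetical sizes for illustration only.
rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 3, 4, 5

W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden (recurrent)
b_h = np.zeros(hidden_size)

xs = rng.standard_normal((seq_len, input_size))  # a toy input sequence
h = np.zeros(hidden_size)                        # initial short-term memory

for x in xs:
    # The same weights are applied at every time step; h compresses
    # the history seen so far into a low-dimensional vector.
    h = np.tanh(W_xh @ x + W_hh @ h + b_h)

print(h.shape)  # the final hidden state summarizes the whole sequence
```

The recurrent connection `W_hh @ h` is what distinguishes this loop from a feedforward layer: each step's output depends on all earlier inputs through `h`.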
- Attention and Augmented Recurrent Neural Networks
- Recurrent Neural Networks: Design and Applications, L. R. Medsker and L. C. Jain
- Sequence Modeling: Recurrent and Recursive Nets, Ian Goodfellow, Yoshua Bengio, Aaron Courville
Documentaries, videos and podcasts
Stanford University School of Engineering: Lecture 10 | Recurrent Neural Networks
- Convolutional neural network: A type of neural network that uses overlapping input neurons, modeled on the behavior of the human visual cortex. Convolutional neural networks are best known for their use in image analysis, specifically object recognition.
- Long short-term memory: A variation of the recurrent neural network designed so that error signals can be backpropagated through many time steps and layers.
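The long short-term memory entry above can be illustrated with a single LSTM step. This is a minimal sketch with hypothetical random weights, not a library implementation: it shows the gating structure in which the cell state is updated additively, the property that makes backpropagation through many time steps practical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes and weights for illustration only.
rng = np.random.default_rng(1)
input_size, hidden_size = 3, 4

# One weight matrix and bias per gate:
# forget (f), input (i), output (o), candidate (g).
W = {k: rng.standard_normal((hidden_size, input_size + hidden_size)) * 0.1
     for k in "fiog"}
b = {k: np.zeros(hidden_size) for k in "fiog"}

x = rng.standard_normal(input_size)
h = np.zeros(hidden_size)   # hidden state
c = np.zeros(hidden_size)   # cell state (the long-term memory)

z = np.concatenate([x, h])
f = sigmoid(W["f"] @ z + b["f"])   # forget gate: what to keep from c
i = sigmoid(W["i"] @ z + b["i"])   # input gate: what to write to c
o = sigmoid(W["o"] @ z + b["o"])   # output gate: what to expose as h
g = np.tanh(W["g"] @ z + b["g"])   # candidate values

c = f * c + i * g                  # additive cell-state update
h = o * np.tanh(c)                 # new hidden state
```

The additive update `c = f * c + i * g` is the key design choice: it gives gradients a path through time that is not repeatedly squashed by a nonlinearity, unlike the vanilla RNN's `tanh` recurrence.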