$\color{black}\rule{365px}{3px}$
“Sequence to Sequence Learning with Neural Networks” - 2014
Link to Paper:
https://arxiv.org/pdf/1409.3215
Table of Contents
1. Introduction
$\color{black}\rule{365px}{3px}$

Recurrent Neural Networks (RNNs) are a class of neural networks designed to recognize patterns in sequential data such as time series, text, or audio. Unlike traditional feedforward networks, RNNs incorporate a feedback loop: the hidden state computed at one time step is fed back in at the next, allowing information to persist across the sequence.
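
To make the hidden-state feedback concrete, below is a minimal sketch of a vanilla (Elman) RNN forward pass in NumPy. The function name, weight shapes, and random initialization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Run a vanilla RNN over a sequence, carrying the hidden state forward.

    inputs: list of input vectors, one per time step.
    W_xh:   input-to-hidden weights, shared across all time steps.
    W_hh:   hidden-to-hidden (recurrent) weights -- the feedback loop.
    b_h:    hidden bias.
    """
    h = np.zeros(W_hh.shape[0])  # initial hidden state
    hidden_states = []
    for x_t in inputs:
        # The same weights are reused at every step (weight sharing);
        # h carries information from all previous inputs (memory).
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        hidden_states.append(h)
    return hidden_states

# Illustrative shapes: input dim 4, hidden dim 3, sequence length 5.
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 4)) * 0.1
W_hh = rng.normal(size=(3, 3)) * 0.1
b_h = np.zeros(3)
seq = [rng.normal(size=4) for _ in range(5)]
states = rnn_forward(seq, W_xh, W_hh, b_h)
print(states[-1])  # final hidden state summarizing the whole sequence
```

The final hidden state acts as a fixed-size summary of the entire input sequence, which is exactly the role the encoder plays in the sequence-to-sequence setup described by the paper.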
Key Characteristics of RNNs
- Sequential Data Processing: RNNs handle data where previous inputs influence the current output.
- Memory Retention: Maintain a hidden state to store information about past inputs.
- Weight Sharing: Reuse the same weights at every time step, which keeps the parameter count independent of sequence length.
- Variants to Improve Performance (a gating sketch follows this list):
  - LSTM: Tackles the vanishing gradient problem with gating mechanisms.
  - GRU: A simplified variant of the LSTM that uses fewer parameters.
  - Bidirectional RNNs: Leverage both past and future context for better predictions.
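
The gating mechanism mentioned under LSTM can be summarized in a few lines. Below is a minimal sketch of a single LSTM step, assuming the four gates' parameters are stacked into one input matrix `W`, one recurrent matrix `U`, and one bias `b`; this layout and the names are illustrative assumptions, not from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step. W, U, b hold the four gates' parameters stacked
    in the order (forget, input, output, candidate), each of size H."""
    H = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b      # all gate pre-activations at once
    f = sigmoid(z[0:H])               # forget gate: what to keep from c_prev
    i = sigmoid(z[H:2*H])             # input gate: what to write
    o = sigmoid(z[2*H:3*H])           # output gate: what to expose
    g = np.tanh(z[3*H:4*H])           # candidate cell update
    c = f * c_prev + i * g            # additive cell update
    h = o * np.tanh(c)
    return h, c

# Illustrative usage: hidden dim 3, input dim 4, sequence length 5.
H, D = 3, 4
rng = np.random.default_rng(1)
W = rng.normal(size=(4 * H, D)) * 0.1
U = rng.normal(size=(4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x_t in [rng.normal(size=D) for _ in range(5)]:
    h, c = lstm_step(x_t, h, c, W, U, b)
print(h)
```

The additive cell update `c = f * c_prev + i * g` is what mitigates vanishing gradients: information can flow through the cell state across many steps without repeatedly passing through squashing nonlinearities.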