Recurrent Neural Networks (RNNs) are designed for sequential or time-series data. A regular feedforward neural network treats all inputs and outputs as independent of each other, but for a task such as predicting the next word in a sentence, it helps to know which words came before it. RNNs are called recurrent because the same task is performed for every element of the sequence, with each output depending on the previous computations. An RNN can be thought of as having a memory that captures information about what has been computed so far.
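The recurrence described above can be sketched in a few lines. This is a minimal illustration, not a trained model: the weight names (`W_xh`, `W_hh`, `b_h`) and the tanh nonlinearity are conventional choices for a vanilla RNN cell, and the weights here are just random values.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # The new hidden state depends on the current input AND the previous
    # hidden state: this recurrence is the network's "memory".
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 4, 8, 5
W_xh = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)  # memory starts empty
for x_t in rng.normal(size=(seq_len, input_dim)):
    # The same computation is applied at every element of the sequence.
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h.shape)  # final hidden state summarizes the whole sequence
```

Note that `h` after the loop is a function of every input in the sequence, which is what the "memory" intuition refers to.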
In going from feedforward to recurrent neural networks, we rely on the idea of sharing parameters across different parts of the model. Parameter sharing makes it possible to extend and apply the model to examples of different forms (here, sequences of different lengths) and to generalize across them.
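To make the parameter-sharing point concrete, here is a small sketch (with arbitrary random weights, not a real trained model) in which one fixed set of parameters processes sequences of two different lengths:

```python
import numpy as np

def run_rnn(xs, W_xh, W_hh, b_h):
    # One set of parameters is reused at every timestep, so the same
    # function works for a sequence of any length.
    h = np.zeros(W_hh.shape[0])
    for x_t in xs:
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
    return h

rng = np.random.default_rng(1)
W_xh = rng.normal(scale=0.1, size=(3, 6))
W_hh = rng.normal(scale=0.1, size=(6, 6))
b_h = np.zeros(6)

# The same parameters handle a length-2 and a length-9 sequence.
short_seq = rng.normal(size=(2, 3))
long_seq = rng.normal(size=(9, 3))
h_short = run_rnn(short_seq, W_xh, W_hh, b_h)
h_long = run_rnn(long_seq, W_xh, W_hh, b_h)
print(h_short.shape, h_long.shape)  # both are (6,)
```

A feedforward network with separate weights per position would need a fixed input length; sharing the weights across timesteps is what lets the same model generalize across lengths.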