There are many cases in which a neural network needs memory. For instance, when modeling natural language, context is important: the meaning of a word late in a sentence is affected by the meaning of words earlier in the sentence. Compare this to the approach used by Naive Bayes classifiers, which consider only the bag of words, not their order. Similarly, time series data may require some memory to make accurate predictions, since a future value may depend on current or past values.
Recurrent neural networks (RNNs) are a family of artificial neural network (ANN) topologies in which information does not necessarily flow in only one direction. In contrast to feedforward neural networks, RNNs allow the output of a neuron to be fed back into its input, creating a feedback loop. Recurrent networks are almost always time-dependent. The concept of time is flexible, however...
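To make the feedback loop concrete, the following is a minimal sketch of a single recurrent step in NumPy. The hidden state h is what carries information from one time step to the next; the function name rnn_step and the weight names W_xh, W_hh, and b_h are illustrative choices, not taken from the text, and the tanh activation is one common option among several.

import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """Compute the new hidden state from the current input and the
    previous hidden state -- the feedback loop described above."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Toy dimensions: 3 input features, 4 hidden units.
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 4)) * 0.1   # input-to-hidden weights
W_hh = rng.normal(size=(4, 4)) * 0.1   # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)

# Process a short sequence; h persists across iterations, which is
# what gives the network its "memory" of earlier inputs.
h = np.zeros(4)
for x_t in rng.normal(size=(5, 3)):    # a sequence of 5 time steps
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h)

Note that the same weights are reused at every time step; only the hidden state changes, so the network's response to an input depends on everything it has seen so far in the sequence.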