An LSTM is a special type of recurrent neural network (RNN). An RNN is a neural network architecture that handles sequential data by keeping a memory of the sequence: at each time step it feeds its hidden state back into itself, so information about earlier inputs is carried forward while the same weights are applied at every position. By contrast, a typical feed-forward network keeps no information about the sequence and works with fixed-size inputs and outputs. In our case we stack two recurrent layers; the additional layer lets the network learn a richer representation and helps with accuracy.
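The idea of a hidden state carried across time steps can be sketched in a few lines. This is a toy, scalar-valued recurrent step (the weights `w_x`, `w_h`, and `b` are made-up illustrative values, not learned parameters):

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    # One vanilla RNN step: the new hidden state mixes the current
    # input with the previous hidden state (the network's "memory").
    return math.tanh(w_x * x + w_h * h + b)

# Hypothetical scalar weights, for illustration only.
w_x, w_h, b = 0.5, 0.8, 0.0

h = 0.0  # initial hidden state
for x in [1.0, 0.0, -1.0]:  # a short input sequence
    h = rnn_step(x, h, w_x, w_h, b)
# h now summarizes the whole sequence, not just the last input.
```

Stacking a second layer simply means feeding each step's hidden state into another `rnn_step` with its own weights before producing an output.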
LSTMs address the vanishing gradient problem that afflicts vanilla RNNs. As the error gradient is propagated backward through many time steps, repeated multiplication by small factors shrinks it toward zero, so the network effectively stops learning dependencies between distant parts of the sequence. The LSTM mitigates this by using gating functions: learned forget, input, and output gates control what is erased from, written to, and read out of an additive cell state, which lets gradients flow across long sequences.
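The gating mechanism can be sketched with a scalar toy cell. This is a minimal illustration, not a real implementation: a single shared weight `w` is used for every gate to keep it short, whereas a real LSTM learns separate weight matrices per gate.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h, c, w=0.5, b=0.0):
    # Toy scalar LSTM cell; sharing one weight w across gates is a
    # deliberate simplification for illustration.
    f = sigmoid(w * x + w * h + b)    # forget gate: what to erase from c
    i = sigmoid(w * x + w * h + b)    # input gate: what to write to c
    o = sigmoid(w * x + w * h + b)    # output gate: what to expose as h
    g = math.tanh(w * x + w * h + b)  # candidate values for the cell state
    c = f * c + i * g   # additive update keeps gradient paths open
    h = o * math.tanh(c)  # hidden state read out through the output gate
    return h, c

h = c = 0.0
for x in [1.0, 0.0, -1.0]:  # a short input sequence
    h, c = lstm_step(x, h, c)
```

Because the cell state `c` is updated additively (`f * c + i * g`) rather than squashed through a nonlinearity at every step, the backward gradient along `c` avoids the repeated shrinking that causes vanishing gradients in a vanilla RNN.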