In this tutorial, we’re going to learn about a more advanced type of RNN: the bidirectional LSTM. It’s all about information flowing both left to right and right to left.

Sentiment classification is the task where you have some input sentence, such as “The movie was terribly exciting!”, and you want to classify it as expressing a positive or a negative sentiment. In this example, the sentence should be classified as positive. Here is an example of how you might try to solve sentiment classification using a fairly simple RNN model.
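A minimal sketch of such a simple unidirectional RNN classifier in tf.keras might look like the following. The vocabulary size, embedding dimension, and hidden size here are illustrative values, not numbers from this tutorial:

```python
import tensorflow as tf

# Illustrative hyperparameters (assumptions, not from the tutorial).
vocab_size, embed_dim, rnn_units = 10_000, 64, 64

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.SimpleRNN(rnn_units),           # reads the sentence left to right only
    tf.keras.layers.Dense(1, activation="sigmoid")  # positive vs. negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Because the `SimpleRNN` layer processes tokens strictly left to right, its state at any position has seen only the left context, which is exactly the limitation discussed next.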

The idea is that the hidden state is a representation of the word “terribly” in the context of the sentence. Notice that this contextual representation only contains information about the left context: it hasn’t yet seen the word “exciting” or the exclamation mark. What about the right context?

In this example, the right context is important because we’ve got the phrase “terribly exciting”. The word “terribly” in isolation usually means something bad, but “terribly exciting” means something good, because it just means very exciting. Knowing the right context, here the word “exciting”, can quite significantly modify your perception of what “terribly” means in the context of the sentence.

## Using a Bidirectional RNN in Practice

The idea is that you have two RNNs going on. You have the forward RNN as before, which encodes the sentence from left to right. Then, separately, you also have a backward RNN, with completely separate weights from the forward RNN. The backward RNN does the same thing, except that it encodes the sequence from right to left, so each of its hidden states is computed based on the one to its right. Finally, you take the hidden states from the two RNNs and concatenate them together, and you’ve got your final contextual representations.

If we think about the contextual representation of the word “terribly”, this vector now has information from both the left and the right, because the forward and backward RNNs respectively carried information from the left and from the right.

```python
model_multi_bi = tf.keras.Sequential()
```