Batch Normalization is a technique, introduced in 2015, that normalizes a layer's activations over each mini-batch; beyond stabilizing training, it also has a regularizing effect.
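As a minimal sketch of the idea (the function name and toy input below are illustrative, not from any particular library), each feature is standardized over the batch dimension and then rescaled with learnable parameters `gamma` and `beta`:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: (batch_size, num_features)
    # Standardize each feature over the batch, then rescale
    # with the learnable shift/scale parameters beta and gamma.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Toy batch of 3 examples with 2 features each.
x = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
out = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

With `gamma=1` and `beta=0`, each column of `out` has (approximately) zero mean and unit variance; during training the network is free to learn other values of `gamma` and `beta`.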
Rather than using one-hot encoding, we tend to represent words with shorter, dense vectors of continuous values; such a vector is called an embedding.
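A tiny sketch of the lookup, assuming a toy three-word vocabulary and a randomly initialized embedding matrix (in practice these vectors are learned during training):

```python
import numpy as np

# Hypothetical toy vocabulary mapping each word to a row index.
vocab = {"cat": 0, "dog": 1, "fish": 2}
embedding_dim = 4  # much smaller than the vocabulary size in real models

rng = np.random.default_rng(0)
# One dense, continuous-valued row per word.
embedding_matrix = rng.normal(size=(len(vocab), embedding_dim))

def embed(word):
    # Look up the word's row: a short dense vector, instead of a
    # sparse one-hot vector of length len(vocab).
    return embedding_matrix[vocab[word]]

vec = embed("cat")  # shape: (4,)
```

The contrast with one-hot encoding is the dimensionality: a one-hot vector here would have length 3 (the vocabulary size) with a single 1, while the embedding is a length-4 dense vector; with a realistic vocabulary of tens of thousands of words, the embedding stays a few hundred dimensions at most.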