If you want to apply BatchNormalization to the linear outputs of an LSTM layer, you can do it like this:
from keras.models import Sequential
from keras.layers import LSTM, Bidirectional, BatchNormalization

model = Sequential()
# activation=None makes the LSTM output linear; the gate activations
# (recurrent_activation, sigmoid by default) are unaffected
model.add(Bidirectional(LSTM(128, activation=None), input_shape=(256, 10)))
# Normalize the linear LSTM outputs
model.add(BatchNormalization())
Essentially, passing activation=None removes the non-linear LSTM output activation (but not the gate activations), so BatchNormalization operates on the raw linear outputs.
This is how you can combine BatchNormalization with an LSTM in Keras.