CNTK: sequence loss function for sequence processing

I am building a sequence-to-sequence model to align phonemes. In particular, my training data consists of paired sequences (phoneme, length), where each phoneme is a one-hot vector and each length is a float. I want to feed the model a sequence of phonemes and get back the corresponding sequence of lengths.
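For concreteness, a minimal sketch of what such paired sequences could look like in CNTK Text Format (the stream names `phoneme` and `length` and all indices/values are made up for illustration; the first column is the sequence id, and `i:1` is the sparse one-hot encoding):

    0 |phoneme 3:1  |length 0.21
    0 |phoneme 17:1 |length 0.09
    0 |phoneme 5:1  |length 0.34
    1 |phoneme 11:1 |length 0.15
    1 |phoneme 3:1  |length 0.27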

My network is built as follows:

    model = Sequential( EmbeddingLayer{embeddingSize} : RecurrentLSTMLayerStack{lstmDims} : LinearLayer{1} )

LinearLayer{1} should project from lstmDims down to 1, if I understand everything correctly. So when I feed the model a sequence of length N, I should also get an output sequence of length N.
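If that understanding is right, the model with concrete (made-up) dimensions would look something like this; every layer is applied per time step, so the sequence length is preserved end to end:

    # hypothetical dimensions, just for illustration
    embeddingSize = 50
    lstmDims = (256 : 256)                     # two stacked LSTM layers

    model = Sequential (
        EmbeddingLayer {embeddingSize} :       # one-hot phoneme -> dense vector
        RecurrentLSTMLayerStack {lstmDims} :   # per-step hidden states, length N preserved
        LinearLayer {1}                        # hidden state -> one scalar length per step
    )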

Now I want to set up a suitable loss function, which, in my opinion, should be the average absolute difference between the elements of the known target sequence and the model output. The averaging has to happen along the time axis, so that sequences of different lengths contribute comparably.
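Concretely, with a target sequence y = (y_1, ..., y_T) and model output ŷ = (ŷ_1, ..., ŷ_T), the criterion I have in mind is

    \mathrm{loss} = \frac{1}{T} \sum_{t=1}^{T} \lvert y_t - \hat{y}_t \rvert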

I planned to do something like

    objectives = Input(1)             # actually a sequence here, as declared in the reader
    result = model(features)
    errs = Abs(objectives - result)
    loss_function = ReduceMean(errs)
    criterionNodes = (loss_function)

but the documentation on reduction operations clearly states that

These operations do not support reduction over sequences. Instead, you can achieve this with a recurrence.

I am not sure how to use a recurrence for my task, and I am also not sure that the overall approach is sound.

2 answers

You need two recurrences, neither of which is too complicated (for the second one we use a "built-in" whose implementation is in the cntk.core.bs file):

    # running sum of the errors, expressed as a recurrence over the sequence
    sum = errs + PastValue(0, sum, defaultHiddenActivation=0)
    # number of elements in the sequence (the built-in recurrence from cntk.core.bs)
    count = BS.Loop.Count(errs)
    loss_function = sum / count
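Note that `sum` defined this way is a running sum, so `sum / count` is a per-step running mean rather than a single number per sequence. If you want the mean error exactly once per sequence, one option (my assumption, not part of the answer above) would be to pick out the value at the final step, e.g. with `BS.Sequences.Last`:

    # assumption: BS.Sequences.Last selects the final time step of a sequence;
    # at the last step the running mean equals the mean over the whole sequence
    loss_function = BS.Sequences.Last(sum / count)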

GitHub has a dedicated tutorial on sequence-to-sequence models that works with data similar to yours. You can see there how such a network is defined:

https://github.com/Microsoft/CNTK/blob/master/Tutorials/CNTK_204_Sequence_To_Sequence.ipynb

