Applying a hidden Markov model to multiple simultaneous bit sequences

This excellent article on implementing a hidden Markov model in C# does a fair job of classifying a single bit sequence based on training data.

How should the algorithm be changed (or should several HMMs be created?) to support classifying several simultaneous bit sequences?

Example

Instead of classifying just a single bit stream:

double t1 = hmm.Evaluate(new int[] { 0,1 });      // 0.49999423004045024  
double t2 = hmm.Evaluate(new int[] { 0,1,1,1 });  // 0.11458685045803882

Instead, classify a stream of two-bit observations:

double t1 = hmm.Evaluate(new int[][] { new[] { 0, 0 }, new[] { 0, 1 } });
double t2 = hmm.Evaluate(new int[][] { new[] { 0, 0 }, new[] { 1, 1 }, new[] { 0, 1 }, new[] { 1, 1 } });

Or, even better, three simultaneous streams:

double t1 = hmm.Evaluate(new int[][] { new[] { 0, 0, 1 }, new[] { 0, 0, 1 } });
double t2 = hmm.Evaluate(new int[][] { new[] { 0, 0, 1 }, new[] { 1, 1, 0 }, new[] { 0, 1, 1 }, new[] { 1, 1, 1 } });

Obviously, the training data would also be expanded accordingly.

2 answers

The straightforward approach: given n simultaneous bit streams, use a single HMM whose alphabet has 2^n symbols, where n is the number of bit streams.

For example, with 3 streams there are 8 symbols: 000, 001, 010, 011, 100, 101, 110, 111. Each observation then becomes a single "megavariable" instead of an individual (0/1) bit.
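This "megavariable" encoding can be sketched as follows (Python here, since the idea is language-independent; the helper name `encode_columns` is my own, not from the article):

```python
def encode_columns(streams):
    """Pack n parallel bit streams into one stream of 2^n symbols.

    streams: a list of n equal-length bit lists.
    Returns one integer in [0, 2^n) per time step, formed by reading
    the n bits at that step as a binary number (first stream = MSB).
    """
    n = len(streams)
    length = len(streams[0])
    assert all(len(s) == length for s in streams), "streams must be equal length"
    symbols = []
    for t in range(length):
        value = 0
        for s in streams:
            value = (value << 1) | s[t]
        symbols.append(value)
    return symbols

# Three streams observed over four time steps:
a = [0, 1, 0, 1]
b = [0, 1, 1, 1]
c = [1, 0, 1, 1]
print(encode_columns([a, b, c]))  # [1, 6, 3, 7], i.e. symbols 001, 110, 011, 111
```

The resulting symbol stream can then be fed to an ordinary discrete HMM with an alphabet of size 2^n.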


The Accord.NET Framework can handle this case. Its generic HiddenMarkovModel class supports multivariate observations; for jointly distributed discrete variables there is the JointDistribution class.

If, on the other hand, you prefer to model the observations as continuous, you can assume they follow a multivariate "Normal" (Gaussian) distribution instead.

Example:

// Specify an initial multivariate Normal distribution for the samples.
var density = new MultivariateNormalDistribution(3); // 3 dimensions

// Create a continuous hidden Markov model with two states organized in an
// ergodic topology and an underlying multivariate Normal distribution
// as the probability density.
var model = new HiddenMarkovModel<MultivariateNormalDistribution>(new Ergodic(2), density);

// Configure the learning algorithm to train the model until the change
// in the average log-likelihood is as small as 0.0001
var teacher = new BaumWelchLearning<MultivariateNormalDistribution>(model)
{
    Tolerance = 0.0001,
    Iterations = 0,
};

// Fit the model
double likelihood = teacher.Run(sequences);

Source: https://habr.com/ru/post/1767845/