Hidden Markov model for three-sided dice

We covered HMMs in class and I was given this homework problem. I understand part of it, but I'm not sure my approach is correct. The problem is this:

Consider another game in which the dealer does not flip a coin, but instead rolls a three-sided die with faces marked 1, 2, and 3. (Try not to think about what a three-sided die would look like.) The dealer has two loaded dice, D1 and D2. For each die Di, the probability of rolling the number i is 1/2, and the probability of each of the other two outcomes is 1/4. After each roll, the dealer decides whether to (1) keep the same die, (2) switch to the other die, or (3) end the game. He chooses (1) with probability 1/2 and each of the other two options with probability 1/4. At the start, the dealer picks one of the two dice with equal probability.

  • Give an HMM for this situation. Specify the alphabet, the states, the transition probabilities, and the emission probabilities. Include a begin state and assume the HMM starts in the begin state with probability 1. Also include an end state.

  • Suppose you observe the following sequence of rolls: 1 1 2 1 2 2. Find the sequence of states that best explains this sequence of rolls. What is the probability of that state sequence? Find the answer by filling in the Viterbi table. Include backpointers in the cells so you can recover the state sequence. Some of the following facts may be helpful:

    log2 (0) = -∞
    log2 (1/4) = -2
    log2 (1/2) = -1
    log2 (1) = 0

  • In fact, there are two optimal state sequences for this sequence of rolls. What is the other state sequence?

If I'm not mistaken, for the first part I need to do something like http://en.wikipedia.org/wiki/Hidden_Markov_model#A_concrete_example but I didn't quite understand what "starts with probability 1" means.

Also, I'm not sure what I need to do for the Viterbi table in the second part of the question. If anybody can give me a hint, I'd be grateful.

1 answer

Starting probability of 1: in an HMM you either have a fixed initial state, or a probability distribution over all states that says how likely the chain is to start in state X. Being told that the starting probability for one state is 1 amounts to the first alternative: a fixed begin state.
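Concretely, the two conventions could be written down like this (a sketch; the state names `Begin`, `D1`, `D2` are my naming, not fixed by the exercise):

```python
# Alternative 1: a dedicated Begin state that the chain occupies with
# probability 1 before the first emission (what the exercise asks for).
start_fixed = {"Begin": 1.0, "D1": 0.0, "D2": 0.0}

# Alternative 2: a start distribution directly over the emitting states.
# Since Begin -> D1 and Begin -> D2 each have probability 1/2, the Begin
# state can be folded into this equivalent distribution:
start_dist = {"D1": 0.5, "D2": 0.5}

# Either way, the start probabilities must sum to 1.
assert sum(start_fixed.values()) == 1.0
assert sum(start_dist.values()) == 1.0
```

For filling in the Viterbi table by hand, the second form is usually more convenient, since the first column of the table is just the start probability times the first emission probability.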

The Viterbi algorithm: in the Viterbi matrix, the i-th row usually corresponds to the i-th state, and the j-th column corresponds to the length-j prefix of your emitted symbols. Entry (i, j) holds the maximum probability of having emitted the first j symbols and ending up in state i.
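In symbols, writing $a_{ki}$ for the transition probabilities, $e_i$ for the emission probabilities, and $\pi_i$ for the start probabilities (my notation, not from the exercise), the table is filled column by column:

```latex
V(i, 1) = \pi_i \, e_i(x_1),
\qquad
V(i, j) = e_i(x_j) \, \max_{k} \bigl[\, V(k, j-1) \, a_{ki} \,\bigr]
```

Taking $\log_2$ of everything turns the products into sums, which is exactly why the problem hands you those $\log_2$ values: every entry in the table becomes a small negative integer.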

For the backtracking, you need to record, for each (i, j)-cell, which predecessor cell achieved the maximum in the computation of that cell. With this information you can walk backwards from the highest-valued cell in the last column to the beginning. Reverse that walk and you have your Viterbi path.
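The whole procedure can be sketched in a few lines. This is my reading of the problem's parameters (check them against your own HMM from part 1), working in log2 space, with the mandatory transition to the End state folded in at the end. Note that `max` breaks ties by taking the first state in the list; the tie it silently resolves here is exactly where the second optimal path hides.

```python
import math

states = ["D1", "D2"]
start = {"D1": 0.5, "D2": 0.5}             # Begin -> Di, each 1/2
trans = {                                   # Di -> Dj (End handled below)
    ("D1", "D1"): 0.5, ("D1", "D2"): 0.25,
    ("D2", "D2"): 0.5, ("D2", "D1"): 0.25,
}
end = 0.25                                  # Di -> End
emit = {                                    # loaded dice: P(i | Di) = 1/2
    "D1": {1: 0.5, 2: 0.25, 3: 0.25},
    "D2": {1: 0.25, 2: 0.5, 3: 0.25},
}

def viterbi(rolls):
    lg = math.log2
    # V[j][s]: best log2-probability of a path emitting rolls[:j+1], ending in s
    V = [{s: lg(start[s]) + lg(emit[s][rolls[0]]) for s in states}]
    back = [{}]                             # back[j][s]: argmax predecessor
    for x in rolls[1:]:
        col, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda k: V[-1][k] + lg(trans[(k, s)]))
            col[s] = V[-1][prev] + lg(trans[(prev, s)]) + lg(emit[s][x])
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    # Fold in the mandatory transition to the End state, then backtrack.
    last = max(states, key=lambda s: V[-1][s] + lg(end))
    score = V[-1][last] + lg(end)
    path = [last]
    for ptr in reversed(back[1:]):
        path.append(ptr[path[-1]])
    return list(reversed(path)), score

path, score = viterbi([1, 1, 2, 1, 2, 2])
print(path, score)
# → ['D1', 'D1', 'D1', 'D1', 'D2', 'D2'] -16.0
```

A score of -16.0 means the best state sequence has probability 2^-16. For the pencil-and-paper version, each `col[s]` is one cell of your table and each `ptr[s]` is the backpointer you write in it.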


Source: https://habr.com/ru/post/903265/
