
python - Viterbi Decoding Returns Incorrect State Sequence with One-Hot Observations in MultinomialHMM (Tried v0.3.0, v0.3.2, …)


I'm experiencing unexpected behavior with `MultinomialHMM` in hmmlearn. When using one-hot encoded observations (with `n_trials=1`), the Viterbi algorithm returns an incorrect state sequence.

In my minimal reproducible example, the decoded state sequence consists entirely of state 0, even though state 2 has the highest start probability (0.4), which I expected to pull the decoded path toward state 2.
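As a quick sanity check, the log-probabilities of the two constant-state candidate paths (all state 0 vs. all state 2) can be computed by hand from these parameters (a pure-NumPy sketch, using the numbers from the script below):

```python
import numpy as np

# 20 observations, all of symbol 2.
# All-zeros path: start in state 0 (prob 0.3), emit symbol 2 twenty times
# (prob 0.3 each), and take the 0 -> 0 self-transition nineteen times (prob 0.8).
logp_all0 = np.log(0.3) + 20 * np.log(0.3) + 19 * np.log(0.8)  # ≈ -29.523

# All-twos path: start in state 2 (prob 0.4), emit symbol 2 twenty times
# (prob 0.25 each), and take the 2 -> 2 self-transition nineteen times (prob 0.5).
logp_all2 = np.log(0.4) + 20 * np.log(0.25) + 19 * np.log(0.5)  # ≈ -41.812

print(logp_all0, logp_all2)
```

Comparing these two numbers against the log-probability that `decode` reports makes it easier to see which path the model actually favors under these parameters.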

Steps to Reproduce:

Use the following script as a minimal reproducible example:

import numpy as np
from hmmlearn import hmm

def main():
    # Define HMM parameters
    start_prob = np.array([0.3, 0.3, 0.4])
    trans_mat = np.array([
        [0.8, 0.1, 0.1],
        [0.1, 0.8, 0.1],
        [0.1, 0.4, 0.5]
    ])
    emission_mat = np.array([
        [0.3,  0.3,  0.3,  0.1],
        [0.25, 0.25, 0.25, 0.25],
        [0.25, 0.25, 0.25, 0.25]
    ])

    # Create an observation sequence:
    # Here, we create 20 observations, all of which are symbol 2.
    obs_int = np.array([2] * 20)
    # Convert to one-hot encoded observations (required by hmmlearn with n_trials=1)
    observations = np.eye(4)[obs_int]
    print(observations)

    # Initialize the HMM model.
    model = hmm.MultinomialHMM(n_components=3, n_trials=1, init_params="")
    model.startprob_ = start_prob
    model.transmat_ = trans_mat
    model.emissionprob_ = emission_mat

    # Decode the observation sequence using the Viterbi algorithm.
    logprob, state_seq = model.decode(observations, algorithm="viterbi")
    print("Log probability:", logprob)
    print("State sequence:", state_seq)

if __name__ == "__main__":
    main()

The output:

[[0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]
 [0. 0. 1. 0.]]
hmmlearn also prints this warning (the reference and issue links it mentions are truncated in my capture):

MultinomialHMM has undergone major changes. The previous version was implementing a CategoricalHMM (a special case of MultinomialHMM). This new implementation follows the standard definition for a Multinomial distribution (e.g. as in ). See these issues for details:
Log probability: -29.52315636581463
State sequence: [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]
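To check the decoder independently of hmmlearn, the Viterbi recursion can be re-implemented in a few lines of log-space NumPy (a sketch using the same parameters and observation sequence as the script above):

```python
import numpy as np

# Same parameters as in the question; the observation is symbol 2 repeated 20 times.
start_prob = np.array([0.3, 0.3, 0.4])
trans_mat = np.array([
    [0.8, 0.1, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.4, 0.5],
])
emission_mat = np.array([
    [0.3,  0.3,  0.3,  0.1],
    [0.25, 0.25, 0.25, 0.25],
    [0.25, 0.25, 0.25, 0.25],
])
obs = [2] * 20

log_start = np.log(start_prob)
log_trans = np.log(trans_mat)
log_emit = np.log(emission_mat)

# delta[i] = best log-probability of any state path ending in state i
delta = log_start + log_emit[:, obs[0]]
back = []
for o in obs[1:]:
    scores = delta[:, None] + log_trans  # scores[i, j]: best path into i, then i -> j
    back.append(scores.argmax(axis=0))   # best predecessor of each state j
    delta = scores.max(axis=0) + log_emit[:, o]

# Backtrack from the best final state.
path = [int(delta.argmax())]
for ptr in reversed(back):
    path.append(int(ptr[path[-1]]))
path.reverse()

print("best log-prob:", delta.max())
print("best path:", path)
```

If this hand-rolled recursion reproduces `model.decode`'s log-probability and state sequence, the discrepancy is in the parameter intuition rather than in hmmlearn's Viterbi implementation; if it disagrees, that would point at the library.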
