Context-Sensitive Hidden Markov Models for Modeling Long-Range Dependencies in Symbol Sequences
Abstract: The hidden Markov model (HMM) has been widely used in signal processing and digital communication applications. It is well known for its efficiency in modeling short-term dependencies between adjacent symbols. However, it cannot model long-range interactions between symbols that lie far apart in the sequence. In this paper, we introduce the concept of a context-sensitive HMM. The proposed model is capable of modeling strong pairwise correlations between distant symbols. Based on this model, we propose dynamic programming algorithms for finding the optimal state sequence and for computing the probability of an observed symbol string. Furthermore, we introduce a parameter re-estimation algorithm for optimizing the model parameters based on given training sequences.
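As a point of reference for the limitation mentioned in the abstract, the following is a minimal sketch (in Python, with made-up toy parameters; not part of the paper) of the forward recursion for a conventional first-order HMM. Each step conditions the state only on its immediate predecessor, which is why dependencies between adjacent symbols are captured but pairwise correlations between distant symbols are not.

```python
import numpy as np

# Toy first-order HMM with hypothetical parameters (illustrative only).
# States: 0, 1; observable symbols: 'a', 'b'.
pi = np.array([0.6, 0.4])                 # initial state distribution
A = np.array([[0.7, 0.3],                 # A[i, j] = P(next state j | state i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],                 # B[i, k] = P(symbol k | state i)
              [0.2, 0.8]])
symbols = {'a': 0, 'b': 1}

def forward_probability(obs):
    """P(observation sequence) under the toy HMM via the forward recursion.

    The recursion couples each state only to its immediate predecessor,
    so the model expresses adjacent-symbol dependencies but has no
    mechanism for enforcing a correlation between distant positions.
    """
    o = [symbols[s] for s in obs]
    alpha = pi * B[:, o[0]]               # initialization
    for t in range(1, len(o)):
        alpha = (alpha @ A) * B[:, o[t]]  # local, adjacent-only update
    return alpha.sum()

print(forward_probability("abba"))
```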
Keywords: