### Sequence Prediction – Hidden Markov Model and Recurrent Neural Networks

#### Abstract

Machine learning is a rapidly growing field, and sequence prediction is one of its subfields. Sequence prediction is the task of predicting an event, or a series of events, from past observations, using models developed specifically for sequence learning. Two of the most popular and prevalent models are the Hidden Markov Model (HMM) and the Recurrent Neural Network (RNN). Hidden Markov Models are primarily probabilistic models. They perform well in scenarios where information from earlier time steps does not need to be retained, but are at an inherent disadvantage on sequences whose data and order must be preserved. Recurrent Neural Networks address that problem through several methods, one of which is Long Short-Term Memory (LSTM). Long Short-Term Memory efficiently keeps track of all the data that came before. We give a brief overview of these two models and compare them with each other.
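As a minimal sketch of the probabilistic computation behind HMMs, the forward algorithm below scores an observation sequence by summing over all hidden-state paths. The two-state parameters and observation symbols are hypothetical, chosen purely for illustration; the paper itself does not specify a model.

```python
def hmm_forward(pi, A, B, obs):
    """Return P(obs) under an HMM with initial probabilities pi,
    transition matrix A, and emission matrix B."""
    n_states = len(pi)
    # Initialise the forward variables with the first observation.
    alpha = [pi[i] * B[i][obs[0]] for i in range(n_states)]
    # Recurse: propagate probability mass through the transitions,
    # then weight by the emission probability of the next symbol.
    for o in obs[1:]:
        alpha = [
            sum(alpha[i] * A[i][j] for i in range(n_states)) * B[j][o]
            for j in range(n_states)
        ]
    return sum(alpha)

# Hypothetical two-state HMM over two observation symbols (0 and 1).
pi = [0.6, 0.4]
A = [[0.7, 0.3],
     [0.4, 0.6]]
B = [[0.5, 0.5],
     [0.1, 0.9]]

p = hmm_forward(pi, A, B, [0, 1, 0])
```

Note that each step depends only on the previous forward variables (the Markov property); this is the memorylessness the abstract contrasts with the explicit long-range memory of an LSTM.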


#### References

A. Graves and N. Jaitly, "Towards end-to-end speech recognition with recurrent neural networks," in Proceedings of the 31st International Conference on Machine Learning (ICML-14), 2014, pp. 1764-1772.

M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), 2005, pp. 192-196.

O. Vinyals, A. Toshev, S. Bengio, and D. Erhan, "Show and tell: A neural image caption generator," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015, pp. 3156-3164.

B. Alipanahi, A. Delong, M. T. Weirauch, and B. J. Frey, "Predicting the sequence specificities of DNA- and RNA-binding proteins by deep learning," Nature Biotechnology, vol. 33, pp. 831-838, 2015.

O. J. Räsänen and J. P. Saarinen, "Sequence prediction with sparse distributed hyperdimensional coding applied to the analysis of mobile phone use patterns," IEEE Transactions on Neural Networks and Learning Systems, vol. 27, pp. 1878-1889, 2016.

R. Sun and C. L. Giles, "Sequence learning: from recognition and prediction to sequential decision making," IEEE Intelligent Systems, vol. 16, pp. 67-70, 2001.

L. Rabiner and B. Juang, "An introduction to hidden Markov models," IEEE ASSP Magazine, vol. 3, pp. 4-16, 1986.

A. Krogh, B. Larsson, G. Von Heijne, and E. L. Sonnhammer, "Predicting transmembrane protein topology with a hidden Markov model: application to complete genomes," Journal of molecular biology, vol. 305, pp. 567-580, 2001.

L. R. Rabiner, "A tutorial on hidden Markov models and selected applications in speech recognition," Proceedings of the IEEE, vol. 77, pp. 257-286, 1989.

C. Chakraborty and P. Talukdar, "Issues and Limitations of HMM in Speech Processing: A Survey," International Journal of Computer Applications, vol. 141, 2016.

Y. Bengio, P. Simard, and P. Frasconi, "Learning long-term dependencies with gradient descent is difficult," IEEE Transactions on Neural Networks, vol. 5, pp. 157-166, 1994.

S. Hochreiter, Y. Bengio, P. Frasconi, and J. Schmidhuber, "Gradient flow in recurrent nets: the difficulty of learning long-term dependencies," in A Field Guide to Dynamical Recurrent Neural Networks. IEEE Press, 2001.

S. Hochreiter and J. Schmidhuber, "LSTM can solve hard long time lag problems," in Advances in Neural Information Processing Systems, 1997, pp. 473-479.

M. Panzner and P. Cimiano, "Comparing Hidden Markov Models and Long Short-Term Memory Neural Networks for Learning Action Representations," in International Workshop on Machine Learning, Optimization and Big Data, 2016, pp. 94-105.

X. Wang, S. Takaki, and J. Yamagishi, "A Comparative Study of the Performance of HMM, DNN, and RNN based Speech Synthesis Systems Trained on Very Large Speaker-Dependent Corpora," in 9th ISCA Speech Synthesis Workshop, 2016, pp. 118-121.


This work is licensed under a Creative Commons Attribution 3.0 License.