In contrast with training algorithms such as Baum-Welch, which produce solutions that
are only a local optimum of the objective function, this thesis describes the attempt to develop
a training algorithm which delivers the globally optimal Discrete Hidden Markov Model
for a given training sequence.
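As a concrete illustration of this local-optimum behaviour (not code from the thesis), here is a minimal Python sketch: Baum-Welch for a discrete HMM, hand-rolled with NumPy and run from several random initialisations on a toy binary sequence. The sequence, model sizes and seeds are all assumptions chosen for demonstration; different seeds typically settle at different final log-likelihoods, i.e. different local optima of the same objective.

```python
import numpy as np

def baum_welch(obs, n_states, n_symbols, n_iter=200, seed=0):
    """Baum-Welch for a discrete HMM; returns the final log-likelihood."""
    rng = np.random.default_rng(seed)
    pi = rng.dirichlet(np.ones(n_states))                 # initial-state probs
    A = rng.dirichlet(np.ones(n_states), size=n_states)   # transition matrix
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)  # emission matrix
    T = len(obs)
    for _ in range(n_iter):
        # Forward pass with per-step scaling (Rabiner-style) to avoid underflow.
        alpha = np.zeros((T, n_states))
        c = np.zeros(T)
        alpha[0] = pi * B[:, obs[0]]
        c[0] = alpha[0].sum()
        alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            c[t] = alpha[t].sum()
            alpha[t] /= c[t]
        # Backward pass, reusing the forward scale factors.
        beta = np.ones((T, n_states))
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
        # E-step: state occupancy and transition posteriors.
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi = np.zeros((n_states, n_states))
        for t in range(T - 1):
            x = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])
            xi += x / x.sum()
        # M-step: re-estimate pi, A and B from the expected counts.
        pi = gamma[0]
        A = xi / gamma[:-1].sum(axis=0)[:, None]
        for k in range(n_symbols):
            B[:, k] = gamma[obs == k].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]
    # Log-likelihood under the last iteration's parameters.
    return np.log(c).sum()

obs = np.array([0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1])
for seed in range(5):
    print(seed, round(baum_welch(obs, n_states=2, n_symbols=2, seed=seed), 4))
# Different seeds usually converge to different log-likelihoods,
# i.e. different local optima of the same training objective.
```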
Four different methods of attack on the problem are presented. First, after
building the necessary analytical tools, the thesis presents a direct, calculus-based assault
featuring Matrix Derivatives. Next, the dual analytic approach known as Geometric
Programming is examined and then adapted to the task. After that, a hill-climbing
formula is developed and applied.
These first three methods reveal a number of interesting and useful insights into the
problem. However, it is the fourth method which produces an algorithm that is then
used for direct comparison with the Baum-Welch algorithm: examples of global optima
are collected, examined for common features and patterns, and then a rule is induced.
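The abstract does not detail how those example global optima were collected. Purely as a hypothetical sketch of what such a collection step might look like for very small cases, the Python snippet below brute-forces the five free parameters of a 2-state, 2-symbol model over a coarse grid and keeps the likelihood-maximising one; the sequence, grid resolution and `log_likelihood` helper are all assumptions, not the thesis's procedure.

```python
import itertools
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | pi, A, B) for a discrete HMM."""
    alpha = pi * B[:, obs[0]]
    s = alpha.sum()
    ll = np.log(s)
    alpha /= s
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        ll += np.log(s)
        alpha /= s
    return ll

obs = np.array([0, 0, 1, 0, 1, 1, 0, 0])
grid = np.linspace(0.1, 0.9, 9)  # keep probabilities away from 0 and 1
best_ll, best_model = -np.inf, None
# Five free parameters: pi[0], the two transition rows, the two emission rows.
for p, a0, a1, b0, b1 in itertools.product(grid, repeat=5):
    pi = np.array([p, 1 - p])
    A = np.array([[a0, 1 - a0], [a1, 1 - a1]])
    B = np.array([[b0, 1 - b0], [b1, 1 - b1]])
    ll = log_likelihood(obs, pi, A, B)
    if ll > best_ll:
        best_ll, best_model = ll, (pi, A, B)
print(best_ll)  # grid approximation to the global optimum for this sequence
```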
The resulting rule is implemented in 'C' and tested against a battery of Baum-Welch-based
programs. In the limited range of tests carried out to date, the models produced
by the new algorithm yield optima which have not been surpassed by (and are typically
much better than) the Baum-Welch models. However, far more analysis and testing are
required, and in its current form the algorithm is not fast enough for realistic application.
| Date of Award | 1999 |
| --- | --- |
| Original language | English |
| Awarding Institution | |
Optimal Hidden Markov Models
McKee, B. F. (Author). 1999
Student thesis: PhD