Course Notes on Bayesian Networks
This note contains one installment of the course notes for
MIT's graduate course on the foundations of artificial intelligence.
In this course Bayesian networks are taught as the second part of a
three-part sequence. First come hidden Markov models for speech
recognition, then Bayesian networks, and finally parsing with
stochastic context-free grammars and the inside/outside algorithm.
These three subjects are closely related: both Bayesian networks and
stochastic grammars can be viewed as generalizations of hidden Markov
models. Presenting the material this way reduces the effort required
of the students, since the equations for each of these subjects are
directly analogous to one another.
Unfortunately, course notes have not been prepared for hidden Markov
models or the inside/outside algorithm.
David McAllester, February, 1995