The Helmholtz Machine
Peter Dayan, Dept. of Computer Science, University of Toronto
Geoffrey E. Hinton, Dept. of Computer Science, University of Toronto
Radford M. Neal, Dept. of Computer Science, University of Toronto
Richard S. Zemel, The Salk Institute
Discovering the structure inherent in a set of patterns is a
fundamental aim of statistical inference or learning. One fruitful approach is to build a
parameterised stochastic generative model, independent draws from which are likely to
produce the patterns. For all but the simplest generative models, each pattern can be
generated in exponentially many ways. It is thus intractable to adjust the parameters to
maximize the probability of the observed patterns. We describe a way of finessing this
combinatorial explosion by maximising an easily computed lower bound on the probability of
the observations. Our method can be viewed as a form of hierarchical self-supervised
learning that may relate to the function of bottom-up and top-down cortical processing
pathways.
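(For concreteness, the lower bound referred to above is the standard variational free-energy bound; the notation here is our own, not quoted from the paper. With d an observed pattern, alpha a hidden explanation, theta the generative parameters and phi the recognition parameters,

\log p(d \mid \theta) \;\ge\; \sum_{\alpha} Q(\alpha \mid d; \phi)\,\bigl[\log p(\alpha, d \mid \theta) - \log Q(\alpha \mid d; \phi)\bigr],

with equality when Q matches the true posterior p(alpha | d, theta), which is why maximising the bound with respect to both sets of parameters is a sensible surrogate for maximising the likelihood itself.)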
In Neural Computation, vol. 7, pp. 1022-1037 (1995).
Download: [ps.gz] [pdf]
Associated reference:
A stochastic algorithm related to the Helmholtz Machine is discussed in the
following paper: Hinton, G. E., Dayan, P., Frey, B. J., and Neal, R. M. (1995) The
wake-sleep algorithm for unsupervised neural networks, Science, vol. 268, pp.
1158-1161. Download: [abstract] [ps] [pdf]
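For readers who want a concrete picture of the wake-sleep idea, the sketch below is a minimal NumPy implementation of one wake-sleep step for a tiny two-layer binary Helmholtz machine. The layer sizes, learning rate, variable names, and the omission of visible-unit biases are illustrative assumptions of ours, not details taken from either paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sample(p):
        # Draw binary states with the given Bernoulli probabilities.
        return (rng.random(p.shape) < p).astype(float)

    # Tiny two-layer binary Helmholtz machine: d visible units, h hidden units.
    d, h, lr = 6, 3, 0.05
    R = np.zeros((d, h))     # recognition (bottom-up) weights
    G = np.zeros((h, d))     # generative (top-down) weights
    g_bias = np.zeros(h)     # generative bias on the hidden layer

    def wake_sleep_step(x):
        # Wake phase: recognise x, then train the generative weights to
        # reconstruct the observed pattern from the inferred hidden state.
        s = sample(sigmoid(x @ R))
        G[:] += lr * np.outer(s, x - sigmoid(s @ G))
        g_bias[:] += lr * (s - sigmoid(g_bias))

        # Sleep phase: dream a (hidden, visible) pair from the generative
        # model, then train the recognition weights to recover the hidden
        # state from the dreamed pattern.
        s_dream = sample(sigmoid(g_bias))
        x_dream = sample(sigmoid(s_dream @ G))
        R[:] += lr * np.outer(x_dream, s_dream - sigmoid(x_dream @ R))

    # Example: fit two prototype patterns.
    data = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]], dtype=float)
    for _ in range(5000):
        wake_sleep_step(data[rng.integers(len(data))])

After training, sampling a hidden state from sigmoid(g_bias) and then a visible pattern from sigmoid(s @ G) should mostly produce noisy versions of the two prototypes; only local, layer-wise delta-rule updates are used, which is the point of the wake-sleep scheme.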