SPEAKER: Alan Yuille

Learning Recursive Hierarchical Models of Objects.

This talk describes work on learning object models. Objects are
represented by recursive compositional models (RCMs) which are constructed
from hierarchical dictionaries of more elementary RCMs. These dictionaries
are learnt in an unsupervised manner using principles such as suspicious
coincidence and competitive exclusion. For multiple objects, we learn
hierarchical dictionaries which encourage part-sharing (i.e. sharing of
elementary RCMs). This gives an efficient representation of multiple
objects and yields efficient inference and learning. We demonstrate
results on benchmarked real images. We will discuss similarities and
differences between this work and more traditional deep belief networks.
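The part-sharing idea can be illustrated with a toy sketch (a hypothetical example, not the authors' implementation): each higher-level part is a composition of parts from the level below, and multiple object models reuse the same lower-level parts, so the dictionary stores each shared part only once. All names here (`bar`, `L_junction`, the object models) are invented for illustration.

```python
# Toy hierarchical dictionary with part-sharing (illustrative only).
# Level-0 parts are primitives; level-1 parts are compositions of them.

level0 = ["edge_h", "edge_v", "corner"]

# Hypothetical level-1 parts, each a composition of level-0 parts.
level1 = {
    "bar": ("edge_h", "edge_h"),
    "L_junction": ("edge_h", "edge_v"),
}

# Two hypothetical object models built from level-1 parts;
# both reuse "bar" and "L_junction".
objects = {
    "table": ("bar", "bar", "L_junction"),
    "chair": ("bar", "L_junction", "L_junction"),
}

def dictionary_cost(objects, level1):
    """Number of level-1 parts stored with sharing vs. without."""
    with_sharing = len(level1)  # each distinct part stored once
    without_sharing = sum(len(parts) for parts in objects.values())
    return with_sharing, without_sharing

with_sharing, without_sharing = dictionary_cost(objects, level1)
print(with_sharing, without_sharing)  # 2 vs 6: sharing shrinks the dictionary
```

The saving grows with the number of objects, which is why shared dictionaries make both representation and inference cheaper as more objects are added.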