Student research opportunities
Convex Relaxation for Latent Models
Project Code: CECS_978
This project is available at the following levels:
Honours, Masters, PhD
Keywords:
Convex optimization, graphical models
Supervisor:
Dr Xinhua Zhang
Outline:
Latent variables play a key role in machine learning by encoding the underlying regularities in the data. Examples include PCA, mixture models, and the restricted Boltzmann machine (RBM). Learning these models from data usually leads to a non-convex optimization problem, for which only locally optimal solutions can be found. Convex relaxation designs a convex approximation of the problem, e.g. by relaxing discrete constraints, while retaining the salient regularities in the data. In many problems it delivers state-of-the-art learning performance. Recently, many novel latent models, such as RBMs and deep networks, have been shown to be effective.
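As a concrete illustration of the technique (a minimal sketch, not part of the project itself), the code below shows one classical convex relaxation: the Peng-Wei semidefinite relaxation of k-means clustering, in which the discrete cluster-assignment matrix is replaced by a positive semidefinite variable. The choice of Python with numpy and cvxpy, and all names in the code, are illustrative assumptions; an SDP-capable solver (e.g. SCS) is assumed to be installed.

    import numpy as np
    import cvxpy as cp

    def kmeans_sdp_relaxation(X, k):
        # Pairwise squared Euclidean distances, D[i, j] = ||x_i - x_j||^2.
        n = X.shape[0]
        diffs = X[:, None, :] - X[None, :, :]
        D = np.sum(diffs ** 2, axis=2)

        # Z relaxes the discrete matrix H H^T, where H is the normalized
        # 0/1 cluster-indicator matrix; PSD=True drops the combinatorial
        # constraint and keeps a convex (semidefinite) feasible set.
        Z = cp.Variable((n, n), PSD=True)
        constraints = [
            Z >= 0,                           # entrywise nonnegative
            cp.sum(Z, axis=1) == np.ones(n),  # each row sums to one
            cp.trace(Z) == k,                 # trace equals the number of clusters
        ]
        # k-means cost equals (1/2) tr(D Z) on the discrete set; minimizing
        # it over the relaxed set above is a semidefinite program.
        prob = cp.Problem(cp.Minimize(0.5 * cp.trace(D @ Z)), constraints)
        prob.solve()
        # Round Z (e.g. by clustering its top-k eigenvectors) to recover
        # a discrete clustering.
        return Z.value

The tightness of such a relaxation (how close the relaxed optimum is to the discrete one) and the cost of solving it are exactly the trade-offs this project studies.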
Goals of this project
The goal of this project is to develop convex relaxations for these models that are both tight and efficiently optimizable. This will enable more accurate learning of latent models that have conventionally been trained by heuristic methods, and will have a profound impact on a number of machine learning applications such as clustering and feature extraction.
Requirements/Prerequisites
Good knowledge of linear algebra.
Ability to program and willingness to learn.
Background Literature
See my publication page.