Student research opportunities
Unifying Probability and Logic for Learning (UPL)
Project Code: CECS_856
This project is available at the following levels:
Engn R&D, Honours, Summer Scholar, Masters, PhD
Keywords:
probability; logic; learning; induction; confirmation; prior; knowledge; entropy; unification.
Supervisor:
Professor Marcus Hutter
Outline:
Automated reasoning about uncertain knowledge has many applications. One difficulty when developing such systems is the lack of a completely satisfactory integration of logic and probability for learning. The main question is the following: given a set of sentences, each having some probability of being true, what probability should be ascribed to other (query) sentences? A natural wish-list, among others, is that the probability distribution (i) is consistent with the knowledge base, (ii) allows for a consistent inference procedure and in particular (iii) reduces to deductive logic in the limit of probabilities being 0 and 1, (iv) allows (Bayesian) inductive reasoning and (v) learning in the limit, and in particular (vi) allows confirmation of universally quantified hypotheses/sentences. In [HLNU12] this wish-list is converted into technical requirements for a prior probability, and it is shown that probabilities satisfying all of them exist. The theory is a step towards a globally consistent and empirically satisfactory unification of probability, logic, and learning.
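Point (vi) is the subtlest item on the wish-list. The following minimal Python sketch (purely illustrative, not taken from [HLNU12]) shows how a prior over propositional "worlds" induces probabilities on sentences, and how the choice of prior decides whether a universally quantified hypothesis can be confirmed by observing positive instances. The mixture prior, the predicate Q, and all other names are assumptions made for this example only.

    # Illustrative sketch: probabilities on sentences induced by a prior over worlds,
    # and confirmation of a universal hypothesis by Bayesian conditioning.
    from itertools import product

    N = 10                                       # objects x_1..x_N with one predicate Q
    worlds = list(product([0, 1], repeat=N))     # a world fixes the truth value of every Q(x_i)

    def prior(world, eps=0.5):
        # Mixture prior (an assumption for this example): with weight eps put all mass
        # on the "all Q true" world, otherwise treat the Q(x_i) as i.i.d. fair coins.
        iid = 0.5 ** N
        return eps * (1.0 if all(world) else 0.0) + (1 - eps) * iid

    def prob(sentence, evidence=lambda w: True):
        # P(sentence | evidence): sum the prior over worlds consistent with the evidence.
        num = sum(prior(w) for w in worlds if evidence(w) and sentence(w))
        den = sum(prior(w) for w in worlds if evidence(w))
        return num / den

    universal = lambda w: all(w)                 # the sentence "for all x: Q(x)"
    for k in range(6):
        seen_k = lambda w, k=k: all(w[:k])       # evidence: Q(x_1), ..., Q(x_k) observed true
        print(k, prob(universal, seen_k))

With the i.i.d. part of the prior alone, the universal sentence starts with mass 2^-N and observing k positive instances only raises it to 2^-(N-k), which stays negligible until almost every object has been checked; the explicit prior mass on the all-true world is what lets a few observations already push the posterior towards 1. Choosing priors that achieve this kind of confirmation in expressive logics, without breaking the other wish-list items, is exactly the sort of issue the project addresses.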
Goals of this project
While [HLNU12] provides a consistent theory of probabilities on sentences ready for induction, much is left to be explored from a philosophical, logical, probabilistic, and practical perspective. The logic possesses a novel semantics which has yet to be explored, especially its connections to existing logics. The major ideas work in many logics, but there are important and subtle pitfalls to be avoided, so adaptations are not straightforward. Integration with the well-developed theory of universal induction for binary sequences [RH11] might make the theory unique and lead to optimal priors. The theory should enable one to study open problems and paradoxes in the philosophy of induction. A major challenge is to develop reasonable approximation schemes for the various currently incomputable aspects of the general theory, suitable for autonomous reasoning agents. One approach is to replace the model-theoretic constructions by a to-be-developed (incomplete, approximate, asymptotic) reasoning calculus. Another is to develop novel constructions and characterizations of probabilities with additional desirable properties.
Requirements/Prerequisites
The required background depends on the chosen direction (philosophical, logical, probabilistic, approximations), but students should meet at least two of the following criteria:
- good background in logic and probability theory,
- creativity in finding and constructing proofs,
- excellent abstract thinking ability,
- good/excellent writing skills,
- interest in philosophical foundations.
Student Gain
- getting acquainted with the interaction of logic, probability, and learning.
- acquiring experience in proving non-trivial theorems.
- getting experience in writing research papers.
- acquiring a deep understanding of fundamental questions relevant for induction, science, and AI.
Background Literature
- [HLNU12] M. Hutter, J.W. Lloyd, K.S. Ng, and W.T.B. Uther.
Probabilities on sentences in an expressive logic.
Technical Report, http://arxiv.org/abs/1209.2620.
- [RH11] S. Rathmanner and M. Hutter.
A philosophical treatise of universal induction.
Entropy, 13(6):1076–1136, 2011.



