Online Learning

Graduate course to be given Winter 2006. Tue, Thu 11:00 - 12:20.
HSS 2305A, Humanities & Social Science Building in Muir College.
Registered students will receive a grade of Satisfactory / Unsatisfactory


Please drop me a line (yfreund at ucsd.edu) if you plan to attend this class.

Description

Suppose we observe a binary sequence, one bit at a time. Our task, at each step, is to predict the next bit in the sequence. At our disposal we have the predictions of N "experts". At each step each of the experts generates its own prediction, in the form of a bit. Suppose we know that one of the experts is perfect, i.e. all of its predictions are correct. Can you find a prediction algorithm that would make at most log2(N) mistakes?

Try it, try it, and you may, try and you may I say!       [ Dr. Seuss, Green Eggs and Ham, 1960 ]

This course is about online prediction methods of the type you have just discovered (if you have not found it yet, here is a clue: it is called the Halving algorithm). At first blush, it might seem that this algorithm is just a nice little trick that works only because we assume that one of the experts is perfect.
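To make the trick concrete, here is a minimal sketch of the Halving algorithm in Python (the function names are mine, not standard). It keeps the set of experts still consistent with every bit seen so far and predicts their majority vote. Each mistake eliminates at least half of the surviving experts, and the perfect expert is never eliminated, so the number of mistakes is at most log2(N).

```python
def majority_vote(active, predictions):
    """Predict the majority bit among the currently consistent experts.

    Ties are broken in favor of 1; the log2(N) bound holds either way,
    because a wrong prediction still removes at least half of `active`.
    """
    votes = sum(predictions[i] for i in active)
    return 1 if 2 * votes >= len(active) else 0

def run_halving(expert_preds, outcomes):
    """Run the Halving algorithm and count the learner's mistakes.

    expert_preds[t][i] is expert i's predicted bit at step t;
    outcomes[t] is the true bit revealed after the prediction.
    """
    n = len(expert_preds[0])
    active = set(range(n))          # experts with no mistakes so far
    mistakes = 0
    for preds, y in zip(expert_preds, outcomes):
        if majority_vote(active, preds) != y:
            mistakes += 1
        # Drop every expert that erred this round; the perfect
        # expert always survives, so `active` never becomes empty.
        active = {i for i in active if preds[i] == y}
    return mistakes
```

For example, with N = 8 experts, one of them perfect, the algorithm makes at most log2(8) = 3 mistakes on any sequence, no matter how the other seven experts behave.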

In fact, this trick can be, and has been, generalized to most imaginable situations.
Moreover, I will argue that this methodology provides a mathematical foundation for prediction that is more complete and more relevant than existing foundations.

Come check it out!

Topics I hope to cover (Not in chronological order)

Prerequisites

You should have a good foundation in math, i.e. linear algebra, probability, integrals, Taylor expansions. You will also need to devote a significant amount of time between classes to reading papers from a variety of fields. Notation and terminology in these papers vary widely, which makes the reading a challenge. However, mastering these different terminologies and understanding the relationships between them will provide you with invaluable insight into this exciting and evolving research area.