Seminar in Computational Learning: The Exact Learning Model

Dr. Amos Beimel

Fall 1999

The ability of a computer to learn by interacting with its environment is a fascinating subject. There are several theoretical models of the learning process. Among them, the exact learning model represents situations in which a learner actively collects information. Specifically, a learner trying to learn some target function f is allowed to make two kinds of queries regarding f:

  1. The learner may ask for the value f(x) on points x of its choice, or
  2. It may suggest a hypothesis h, to which it either gets the answer "the hypothesis is equivalent to f" or gets a counterexample.

The exact learning model was introduced by Angluin in 1988, and since then many interesting algorithms have been presented in this model for various classes of functions. In this seminar we will focus on three topics: learning automata, learning sub-classes of DNF, and learning geometric objects.
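The two query types above can be illustrated with a small sketch. The class and function names, the toy domain, and the naive "patch the hypothesis on each counterexample" learner below are illustrative assumptions, not taken from any of the seminar papers; real exact-learning algorithms (e.g., Angluin's automaton learner) are far more structured.

```python
class Teacher:
    """Answers the two kinds of queries about a hidden target function f."""

    def __init__(self, f, domain):
        self.f = f
        self.domain = domain

    def membership_query(self, x):
        # Query type 1: return the value f(x) on a point x of the learner's choice.
        return self.f(x)

    def equivalence_query(self, h):
        # Query type 2: return None if the hypothesis h agrees with f
        # everywhere, otherwise return a counterexample point.
        for x in self.domain:
            if h(x) != self.f(x):
                return x
        return None


def exact_learn(teacher):
    """A naive exact learner: start with the all-zero hypothesis and patch
    it on each counterexample using a membership query."""
    table = {}  # points whose value is already known

    def hypothesis(x):
        return table.get(x, 0)

    while True:
        cex = teacher.equivalence_query(hypothesis)
        if cex is None:
            return dict(table)  # hypothesis is now equivalent to f
        table[cex] = teacher.membership_query(cex)


if __name__ == "__main__":
    domain = range(8)
    target = lambda x: x % 3 == 0  # hidden target function (toy example)
    learned = exact_learn(Teacher(target, domain))
    print(all(learned.get(x, 0) == target(x) for x in domain))  # True
```

On a finite domain this learner always terminates, since each counterexample adds one new point to the table; the interesting question, which the seminar papers address, is how to learn with a number of queries polynomial in the size of a representation of f rather than in the size of the domain.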

Requirements:

This is a 2-credit course, consisting of weekly 2-hour meetings. I will give the first two lectures. Each student will present one or two papers from the list below. In addition, each student will submit a summary of the papers they presented. Students are required to attend (and hopefully understand) all lectures. There may be one or two homework assignments (if students are not active in the lectures).

Prerequisites:

The course is intended for graduate students and third-year undergraduate students. A course in automata and formal languages and a course in the design of algorithms are required.

List of papers

Lectures:

Lecture 1: Introducing the exact learning model.
Lecture 2: Comparison of Learning Models, and Learning Monotone DNF.
Lecture 3: Learning Deterministic Automata.
Lecture 4: Learning Deterministic Automata without Reset.
Lecture 5: Learning Multiplicity Automata.
Lecture 6: Using Multiplicity Automata to Learn Polynomials, Decision Trees, and Disjoint DNF.
Lecture 7: Learning read-one formulae.
Lecture 8: The monotone theory.
Lecture 9: Learning boxes in high dimension.
Lecture 10: The composition theorem - learning geometric objects in constant number of dimensions.
Lecture 11: The on-line model, learning decision lists, and the winnow algorithm.
Lecture 12: Efficient learning with virtual threshold gates.

Files:

Include file - An include file to use when you prepare your lecture notes

Sample file - a skeleton file for using the include file

LaTeX file of Lecture 1 - The LaTeX file of the first lecture (figures omitted).

Information:

Lecture hours: Tuesday 10:00-12:00, Building 28, Room 103.
Reception hours: Thursday 12:30-14:30, Building 58 (Math), Room 205.
E-mail: beimel at cs.bgu.ac.il
Phone: 647 7858
Course home page: http://www.cs.bgu.ac.il/~beimel/Courses/learning.html