DIMACS Theory Seminar
Title:
Learning noisy halfspaces and weak linear programming
Speaker:
- Avrim Blum
- CMU
Place:
- Room 402, Computer Science Building,
- 35 Olden Street, Princeton University
Time:
- Lunch served at 11:45 AM, Room 402
- Talk: Room 402, at 12 Noon
- Friday, October 6, 1995
Abstract:
The problem of learning a linear threshold function (a halfspace in n
dimensions, also called a "perceptron") is one of the oldest in machine
learning. Methods for solving it generally fall into two categories. Greedy
algorithms are simple and can be made noise tolerant, but their running time
depends on a separation parameter that may be exponentially small. On the
other hand, linear programming algorithms such as the Ellipsoid Algorithm run
in polynomial time, but seem to be intolerant of noise. We show how greedy
methods can be used to find weak hypotheses (hypotheses that classify
noticeably more than half of the examples) in polynomial time, without
dependence on any separation parameter. This results in a polynomial-time
algorithm for learning halfspaces in the presence of classification noise.
This is joint work with Alan Frieze, Ravi Kannan, and Santosh Vempala.
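The "greedy" method the abstract contrasts with linear programming is, in its classic form, the perceptron update rule. The sketch below (a standard textbook version, not the speaker's new algorithm) illustrates the separation-parameter dependence: the perceptron's mistake bound scales as 1/margin^2, so a tiny margin can make it exponentially slow. The toy data set here is an illustrative assumption.

```python
# Minimal sketch of the classic perceptron (the "greedy" method above),
# assuming linearly separable data; convergence time depends on the margin
# (the "separation parameter" discussed in the abstract).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def perceptron(examples, max_passes=100):
    """examples: list of (x, y) with x a tuple of floats, y in {-1, +1}."""
    n = len(examples[0][0])
    w = [0.0] * n
    for _ in range(max_passes):
        mistakes = 0
        for x, y in examples:
            if y * dot(w, x) <= 0:  # misclassified (or on the boundary)
                # Greedy additive update toward the misclassified example
                w = [wi + y * xi for wi, xi in zip(w, x)]
                mistakes += 1
        if mistakes == 0:  # a full pass with no mistakes: data separated
            break
    return w

# Hypothetical toy data, separable by the first coordinate's sign
data = [((2.0, 1.0), 1), ((1.5, -1.0), 1),
        ((-2.0, 0.5), -1), ((-1.0, -2.0), -1)]
w = perceptron(data)
```

With classification noise, no separating hyperplane exists, which is where this simple loop breaks down and the weak-hypothesis approach of the talk comes in.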
Host: Sanjeev Arora
Document last modified on October 3, 1995