DIMACS Theoretical Computer Science Seminar

Title: Every linear threshold function has a low-weight approximator

Speaker: Rocco Servedio, Columbia University

Date: Wednesday, March 28, 2007 11:00am - 12:00pm

Location: CoRE Bldg, CoRE A, Room 301, Rutgers University, Busch Campus, Piscataway, NJ


A linear threshold function, or halfspace, is defined by a hyperplane w \cdot x = \theta in n-dimensional Euclidean space; it assigns a binary label to each input point according to which side of the hyperplane the point lies on. Linear threshold functions are well studied in areas such as learning theory and complexity theory; in particular, linear threshold functions with small integer weights are often of special interest.
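The definition above can be made concrete with a small sketch (an illustrative toy, not anything from the talk): a linear threshold function over Boolean inputs in {-1, +1}^n outputs +1 or -1 depending on whether the weighted sum clears the threshold.

```python
# Toy illustration (not from the talk): a linear threshold function
# f(x) = sign(w . x - theta) on inputs x in {-1, +1}^n.
def ltf(w, theta, x):
    """Return +1 if w . x >= theta, else -1."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s >= theta else -1

# Majority on 3 variables: all weights 1, threshold 0.
print(ltf([1, 1, 1], 0, [1, 1, -1]))   # two +1's outweigh one -1: prints 1
print(ltf([1, 1, 1], 0, [-1, -1, 1]))  # prints -1
```

Here all weights are the small integer 1; the talk concerns how small the integer weights can be made in general while changing the function's value on only a small fraction of inputs.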

Given any linear threshold function f on n Boolean variables, we construct a linear threshold function g which disagrees with f on at most an \epsilon fraction of inputs and has integer weights each of magnitude at most \sqrt{n} \cdot 2^{\tilde{O}(1/\epsilon^2)}. The construction is optimal in terms of its dependence on n.
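The quantity being bounded is the disagreement fraction between f and g over the Boolean cube. A brute-force check of that quantity for tiny n can be sketched as follows (the weights and rounding below are illustrative assumptions, not the paper's construction, which achieves the \sqrt{n} \cdot 2^{\tilde{O}(1/\epsilon^2)} bound):

```python
from itertools import product

def ltf(w, theta, x):
    """+1 if w . x >= theta, else -1."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else -1

def disagreement(w_f, theta_f, w_g, theta_g, n):
    """Fraction of x in {-1, +1}^n on which the two LTFs disagree."""
    bad = sum(
        ltf(w_f, theta_f, x) != ltf(w_g, theta_g, x)
        for x in product([-1, 1], repeat=n)
    )
    return bad / 2 ** n

# f has arbitrary real weights; g naively rounds them to small integers.
# (Naive rounding is just for illustration; it is NOT the paper's method
# and in general gives much weaker guarantees.)
w_f = [0.9, 1.1, 2.05, 0.48]
w_g = [1, 1, 2, 0]
eps = disagreement(w_f, 0.5, w_g, 0.5, 4)
```

In this particular toy instance the two functions disagree on exactly one of the 16 inputs, so eps = 1/16; the theorem says a suitable integer-weight g always exists with disagreement at most \epsilon and weights of magnitude \sqrt{n} \cdot 2^{\tilde{O}(1/\epsilon^2)}.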

We give two applications. The first is a deterministic algorithm for approximately counting the fraction of satisfying assignments to zero-one knapsack problems to within an additive \pm \epsilon. The algorithm runs in time polynomial in n for any constant \epsilon. In our second application, we show that any linear threshold function f is specified to within error \epsilon by estimates of its Chow parameters (degree 0 and 1 Fourier coefficients) which are accurate to within an additive \pm 1/(n \cdot 2^{\tilde{O}(1/\epsilon^2)}). This is the first such accuracy bound which is inverse polynomial in n, and gives the first polynomial bound (in terms of n) on the number of examples required for learning linear threshold functions in the "restricted focus of attention" framework.
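The Chow parameters mentioned in the second application are the degree-0 and degree-1 Fourier coefficients of f, i.e. E[f(x)] and E[f(x) x_i] under a uniform random x in {-1, +1}^n. For small n they can be computed exactly by enumeration (a toy sketch, not the paper's algorithm, which works from approximate estimates):

```python
from itertools import product

# Toy illustration: compute the Chow parameters (degree-0 and degree-1
# Fourier coefficients) of f: {-1, +1}^n -> {-1, +1} by brute force.
def chow_parameters(f, n):
    total = 2 ** n
    c0 = 0.0            # E[f(x)]
    c = [0.0] * n       # E[f(x) * x_i] for each coordinate i
    for x in product([-1, 1], repeat=n):
        fx = f(x)
        c0 += fx
        for i in range(n):
            c[i] += fx * x[i]
    return c0 / total, [ci / total for ci in c]

# Majority on 3 variables: sign of x1 + x2 + x3.
maj3 = lambda x: 1 if sum(x) > 0 else -1
c0, c = chow_parameters(maj3, 3)
# By symmetry c0 = 0 and each degree-1 coefficient equals 0.5.
```

A classical fact (due to Chow) is that these n+1 numbers determine a linear threshold function exactly among all Boolean functions; the result stated above quantifies how accurately they must be estimated to pin f down to within error \epsilon.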