### DIMACS - RUTGERS EXPERIMENTAL MATHEMATICS SEMINAR

Sponsored by the Rutgers University Department of Mathematics and the

Center for Discrete Mathematics and Theoretical Computer Science (DIMACS)

**Co-organizers:**
**Doron Zeilberger**, Rutgers University, zeilberg {at} math [dot] rutgers [dot] edu
**Nathan Fox**, Rutgers University, fox {at} math [dot] rutgers [dot] edu

Title: **Matrix Volume and Its Applications**

Speaker: **Adi Ben-Israel**, Rutgers University

Date: Thursday, March 2, 2017 5:00pm

Location: Hill Center, Room 705, Rutgers University, Busch Campus, Piscataway, NJ

Abstract:

The volume vol(A) of an m×n matrix A of rank r is: (a) the product of the r singular values of A, or (b) the square root of the sum of squares of all r×r subdeterminants of A, or (c) the volume of the image under A of a unit cube in the range of Aᵀ. Definition (b) is applicable to non-numerical matrices, in particular to rectangular Jacobians. Some representative applications will be discussed.
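As a quick sanity check of the abstract's claim that definitions (a) and (b) agree, the following sketch (not from the talk; function names are illustrative) computes vol(A) both ways for a small rank-2 matrix:

```python
# Illustrative sketch: verify that two definitions of the matrix volume agree.
import itertools
import numpy as np

def vol_singular_values(A, tol=1e-10):
    """Definition (a): product of the r nonzero singular values of A."""
    s = np.linalg.svd(A, compute_uv=False)
    return float(np.prod(s[s > tol]))

def vol_subdeterminants(A, tol=1e-10):
    """Definition (b): sqrt of the sum of squares of all r x r subdeterminants."""
    r = np.linalg.matrix_rank(A, tol=tol)
    m, n = A.shape
    total = 0.0
    for rows in itertools.combinations(range(m), r):
        for cols in itertools.combinations(range(n), r):
            total += np.linalg.det(A[np.ix_(rows, cols)]) ** 2
    return float(np.sqrt(total))

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])  # a 2x3 matrix of rank 2
print(vol_singular_values(A))   # both print sqrt(54) ≈ 7.3485
print(vol_subdeterminants(A))
```

For this A, the three 2×2 subdeterminants are -3, -6, and -3, so vol(A) = √(9 + 36 + 9) = √54, matching the product of the two singular values.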

References:

[1] The matrix volume: http://benisrael.net/VOLUME.pdf

[2] Application to change-of-variables in integration: http://benisrael.net/INTEGRAL-AMS.pdf

[3] Application to probability: http://benisrael.net/MADISON-AMS.pdf

[4] Low-rank approximation of matrices: A. Deshpande, L. Rademacher et al., Matrix Approximation and Projective Clustering via Volume Sampling, Theory of Computing 2 (2006), 225-247

See: http://www.math.rutgers.edu/~nhf12/expmath/