Title: Improvements in the world of fast matrix algorithms

Speaker: **Warren D. Smith**, DIMACS, Rutgers University, and Temple
University Mathematics Department

Date: Friday, November 1, 2002, 1:00 pm

Location: DIMACS Seminar Room, CoRE Building, Room 431A, Rutgers University.

Abstract:

I'll talk about two projects concerning "fast" matrix algorithms. The first is a computer-aided search for new basic formulas (exemplified by Strassen's 7-mul formula for multiplying 2x2 matrices). The search found 20 new records, along with some remarkable new empirical phenomena, which I partially understand. New notions of "symmetric" basic formulas, and of "canonical forms" for them, are introduced.
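For reference, Strassen's 7-multiplication formula mentioned above can be written out concretely. The sketch below (function name and variable names are mine) multiplies two 2x2 matrices with 7 multiplications instead of the usual 8; the same formula applied recursively to matrix blocks gives the $O(N^{2.81})$ algorithm.

```python
def strassen_2x2(A, B):
    """Strassen's 7-multiplication formula for 2x2 matrices.

    The entries may be scalars or, in the recursive version,
    matrix blocks; only 7 products m1..m7 are formed.
    """
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # Recombine the 7 products into the 4 entries of C = A @ B.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]
```

One level of recursion replaces 8 block multiplications by 7, which is where the exponent $\log_2 7 \approx 2.81$ comes from.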

The second is a project to, essentially, go through Golub & Van Loan's book "Matrix Computations," redoing every algorithm there so that it becomes "fast" without sacrificing its numerical stability or practicality. I've succeeded in most cases. In particular, I'll discuss how to perform the following matrix operations

(I) numerically backwards-stably,

(II) with the same asymptotic speed as matrix multiplication (i.e. subcubic),

(III) with small "constant factors."

(A) Gaussian elimination with partial pivoting (i.e. PLU factorization)

(B) Orthogonalization (i.e. QR factorization) of a rectangular matrix

(C) Reduction of a symmetric matrix to tridiagonal form by orthogonal similarity transformations

We also show how to implement Gaussian elimination via "rook pivoting" with only $O(N^{2.5})$ expected pivoting overhead. Foster had shown that rook pivoting yields unconditional backwards stability, just like "complete pivoting."

It seems likely that the $O(N^{2.81})$-time versions of many of our algorithms will be of practical interest.