## DIMACS TR: 2000-38

## Polynomial Time Array Dataflow Analysis

### Authors: Robert Seater and David Wonnacott

**ABSTRACT**

Array dataflow analysis is a valuable tool for supercomputer
compilers. However, the worst-case time complexities for array
dataflow analysis techniques are either not well understood or
alarmingly high. For example, the constraint-based analysis used in
the Omega Test uses a subset of the 2^2^2^O(n) language of Presburger
Arithmetic for analysis of affine dependences, and its use of
uninterpreted function symbols for non-affine terms introduces
additional sources of complexity. Even traditional data dependence
analysis in this affine domain is equivalent to integer programming,
and is thus NP-complete. These worst-case complexities have raised
questions about the wisdom of using array dataflow analysis in a
production compiler, despite empirical data that show that various
tests run quickly in practice.

In this paper, we demonstrate that a polynomial-time algorithm can
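To see why affine dependence testing amounts to integer programming, consider a loop not taken from the paper, used here purely as an illustration: a flow dependence exists exactly when a system of linear constraints over the loop indices has an integer solution. A minimal sketch, brute-forcing that check for one tiny loop:

```python
# Illustrative example (not from the paper): testing for a loop-carried
# flow dependence in the loop
#
#     for i = 1 to n:  a[2*i] = a[2*i - 3] + 1
#
# reduces to asking whether integer solutions exist to
#     2*iw == 2*ir - 3,   1 <= iw < ir <= n
# i.e., whether some iteration iw writes the element a later
# iteration ir reads. Solving such systems in general is integer
# programming; here we simply enumerate the bounded index space.

def has_loop_carried_flow_dep(n):
    return any(2 * iw == 2 * ir - 3
               for iw in range(1, n + 1)
               for ir in range(iw + 1, n + 1))

# 2*iw is always even while 2*ir - 3 is always odd, so the system has
# no integer solution: the loop's iterations can run in parallel.
print(has_loop_carried_flow_dep(100))  # prints False
```

A real dependence test would decide this symbolically (e.g., via the GCD test or the constraint manipulation the abstract describes) rather than by enumeration; the point is only that the question being decided is integer feasibility of affine constraints.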
produce accurate information about the presence of loop-carried array
dataflow. We first identify a subdomain of Presburger Arithmetic that
can be manipulated (by the Omega Library) in polynomial time; we then
describe a modification to prevent exponential blowup of the Omega
Library's algorithm for manipulating function symbols. Restricting
the Omega Test to these polynomial cases can, in principle, reduce the
accuracy of the dataflow information produced by the Omega Test. We
therefore present an investigation of the effects of our restrictions
on the analysis of loop-carried array dataflow dependences in
benchmark codes. The restriction to polynomial-time algorithms blocks
parallelization of only three unimportant loops in the approximately
18000 lines of benchmark code we studied.

Paper Available at:
ftp://dimacs.rutgers.edu/pub/dimacs/TechnicalReports/TechReports/2000/2000-38.ps.gz
