Motivations

The theory of programming languages is one of the most technically sophisticated theoretical areas in computing and, for all its abstractness, it has been reasonably successful in having its insights applied in practice. Most successes in programming language theory have come from studying the input/output behavior of programs, procedures, etc. Successes in studying other aspects of program behavior (e.g., time and space complexity) have been rather limited, especially in regard to more advanced programming constructs.

For example, while ML is a wonderful language in which to state algorithms, there is no adequate theoretical support for the complexity analysis of ML programs and modules. This is in part because ML includes features, such as higher-type procedures, for which there is only a rudimentary theory of computational complexity.

There has been scattered work in the past few years on extending the theories of programming languages and computational complexity to deal with such problems. However, there has been little synergy between the various pockets of researchers conducting this work. The groups come from diverse communities, speak distinct technical languages, and have quite different perspectives on the core phenomena.
For another example, program specifications can be very detailed and formal when it comes to laying out requirements for module interfaces and the like, but when stating performance specifications (e.g., time complexity requirements) one has to resort to loose, informal narratives. This is weak engineering, but it is the result of weak scientific support -- there is currently no precise way to talk about the computational complexity of a module or an ``object.''
Thus, the idea behind the workshop was to bring together researchers in computational complexity and in programming languages to start a discussion between the various parties, explain each other's concerns to one another, define problem areas, and possibly start some collaborations.
The Workshop

The quality of the talks, both invited and contributed, turned out to be very high. Topics covered in the presentations included:

- amortized analysis of lazy data structures for persistent data (Okasaki)
- cost models for functional languages (Blelloch, Mairson, K. Rose)
- applications of intensional semantics to computational complexity (Abramsky, Dancanet, Hains, Moggi)
- complexity of higher-order functionals (Kapron, Royer)
- applications of type theory (Hofmann, Leivant, Otto)
- realistic computational models for programming language applications (Jones, E. Rose)

While no synergistic miracles occurred, there were apparent themes that cut across talks from different areas. For example:

- the difficulties in finding realistic cost models for functional languages such as the lambda-calculus, PCF, Haskell, etc.
- the subtleties involved in adapting standard programming language tools (e.g., compositional reasoning) to complexity-theoretic problems. An example, given by Neil Jones: consider
\ A : int_array . insertion_sort(heap_sort(A))
where ``\'' stands for lambda, and insertion_sort and heap_sort are faithful implementations of insertion sort and heap sort, respectively, each taking an integer array as argument and returning one as result. Naive compositional reasoning would indicate that this new algorithm runs in Theta(n^2) worst-case time (because heap sort is O(n log n) and insertion sort is O(n^2)). In fact, it is an O(n log n) algorithm, because insertion sort runs in linear time on sorted inputs.
- the subtleties involved in adapting standard complexity-theoretic tools to programming language problems (e.g., what should feasibility mean for higher-type computation, and how does one carry out a complexity analysis of a lazy algorithm?)
- the interesting possibility of using something like semantic games as a means of talking about the complexity of higher-type computations.
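Jones's example above can be illustrated concretely. The following sketch (in Python, standing in for the ML-style pseudocode; the comparison counter and names are illustrative, not from any workshop talk) counts the comparisons insertion sort performs: roughly quadratically many on a random array, but exactly n - 1 on an already-sorted one -- which is why composing it after heap sort stays O(n log n).

```python
import random

def insertion_sort(a):
    """Return a sorted copy of a, plus the number of comparisons made."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        j = i
        # Sift a[i] leftward; stops after a single comparison
        # if the element is already in place.
        while j > 0:
            comps += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return a, comps

random.seed(0)
n = 1000
data = [random.randrange(10 * n) for _ in range(n)]

_, comps_random = insertion_sort(data)          # roughly n^2 / 4 comparisons
_, comps_sorted = insertion_sort(sorted(data))  # sorted(data) plays heap_sort's role
print(comps_random, comps_sorted)               # comps_sorted is exactly n - 1 = 999
```

The point is that the cost of insertion_sort depends on a property (sortedness) of its input that a purely compositional, worst-case analysis throws away.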