DANSAS 2015, Odense, August 20
|09:00 - 09:30||Coffee and breakfast rolls|
|09:30 - 09:40||Welcome|
|09:40 - 10:25||How I tried to commercialize software analysis (invited talk)|
|10:30 - 11:00||Fruit Break|
|11:00 - 11:25||Confluence Modulo Equivalence for Constraint Handling Rules|
|11:35 - 12:00||Towards static type inference for the optionally typed language Dart|
|12:05 - 13:35||Lunch|
|13:35 - 14:00||Resource analysis of C programs by analyzing their Horn clause representation|
|14:10 - 14:35||Probabilistic Output Analysis for discrete resource programs|
|14:45 - 15:15||Coffee Break|
|15:15 - 15:40||Determinacy in static analysis for jQuery|
|15:45 - 16:00||Wrap-up and evaluation|
Confluence Modulo Equivalence for Constraint Handling Rules
Confluence is an important property for systems with nondeterminism, such as parallel systems or languages with nondeterministic choice mechanisms. Constraint Handling Rules (CHR) is such a language, consisting of guarded rewrite rules over constraint stores. Confluence means that the final result of a computation is independent of the specific execution order, which is important for correctness and allows optimization by searching for a good execution order. This work generalizes previous work on confluence for CHR by introducing confluence modulo equivalence: alternative final states need not be identical, but only equivalent with respect to a programmer-defined equivalence relation.
This notion applies to programs with redundant data representations (e.g., sets-as-lists) and to dynamic programming algorithms with pruning (e.g., Viterbi), which are not covered by previous confluence notions for CHR or similar systems.
The approach is based on lifting a semantics for CHR to a meta-level that, through powerful parametrization, allows one to reason compactly about many concrete proof cases in a single abstract case; in this way it resembles abstract interpretation. As part of the work we introduce a new operational semantics for CHR which, in addition to supporting the equivalence notion, can also handle extralogical and "problematic" predicates (Prolog-style var, nonvar and is, not previously considered with respect to confluence) and, perhaps surprisingly, is considerably simpler than previous CHR semantics.
Joint work with Maja H. Kirkeby
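The sets-as-lists situation can be illustrated with a small sketch (a toy in Python, not the paper's CHR semantics; all names are ours): different rule-application orders build syntactically different list stores, yet all final states are equal modulo the programmer-defined "same elements" equivalence.

```python
# Toy illustration of confluence modulo equivalence for a set
# represented as a list: nondeterministic insertion orders yield
# different lists, but equivalent sets.
from itertools import permutations

def insert(store, x):
    """CHR-style rule sketch: add x to the store unless already present."""
    return store if x in store else store + [x]

def run(items, order):
    """One nondeterministic execution: apply insertions in a given order."""
    store = []
    for i in order:
        store = insert(store, items[i])
    return store

def equivalent(s1, s2):
    """Programmer-defined equivalence: same set of elements."""
    return set(s1) == set(s2)

items = ["a", "b", "a", "c"]
finals = [run(items, p) for p in permutations(range(len(items)))]
# All final stores are pairwise equivalent, though not identical as lists.
assert all(equivalent(finals[0], s) for s in finals)
assert any(s != finals[0] for s in finals)
```

A classical (identity-based) confluence check would reject this program, since the final lists differ; modulo the equivalence relation, it is confluent.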
Towards static type inference for the optionally typed language Dart
Dart is a dynamically typed programming language that nevertheless allows static types to be added where appropriate. This way, the programmer can choose where in her code to benefit from the safety guarantees and documentation/tool support of the static typing discipline, and where to exploit the flexibility of dynamic typing. Furthermore, Dart's optional typing approach is in line with the idea of gradual evolution from dynamic scripts to fully fledged static programs. The latter would be further promoted by applying type inference to derive safe static types for the dynamically typed parts of the code wherever possible. In this talk, we present our work in progress on static type inference for the Dart programming language. While type inference is in general a well-understood problem, Dart's language design imposes various challenges for implementing such a technique. As an example, Dart's type system is neither sound nor complete; in consequence, type annotations present in the code cannot always be trusted for safely inferring static types. We will thus especially elaborate on the problems we have encountered and on different approaches to addressing them.
Joint work with Anders Møller and Thomas Heinze
Resource analysis of C programs by analyzing their Horn clause representation
In this talk, we present an approach to automatic resource (energy, time, etc.) analysis of C programs. Our approach combines ideas from abstract interpretation, ranking function synthesis and complexity analysis in a unified framework. Instead of acting on a source program, the analysis uses an intermediate language known as Constrained Horn clauses (CHCs) and computes a bound in terms of the program's input parameters. Our approach is based on the method described by Sinn et al. where a program is abstracted to a lossy vector addition system with states (VASS). A VASS can be conveniently described by a set of CHCs where the constraints have a particular shape. Our analysis derives a VASS from the input Horn clauses using abstract interpretation and ranking function synthesis. Then it computes a lexicographic ranking function which proves termination of the VASS. Finally, a bound is computed directly from the VASS ranking functions. Resource consumption is obtained by multiplying the bound by a suitable resource measure.
Joint work with John P. Gallagher
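The final step, turning a ranking function into a step bound, can be sketched on a tiny example (a hand-made Python toy; the VASS, the ranking function and the names are ours, not the authors' tool):

```python
# A two-counter program abstracted as a lossy VASS: the outer loop
# decrements x and resets y to a constant; the inner loop decrements y.
# The lexicographic ranking function (x, y) decreases on every
# transition, proving termination, and the bound x0 * (3 + 1) counts
# at most one inner-phase (3 steps) plus one outer step per x value.

def run_vass(x, y):
    """Concrete semantics: count the steps actually taken."""
    steps = 0
    while x > 0:
        x -= 1
        y = 3              # reset, bounded by the constant 3
        while y > 0:
            y -= 1
            steps += 1     # inner transition: (x, y) > (x, y-1)
        steps += 1         # outer transition: (x, _) > (x-1, 3)
    return steps

def bound(x0):
    """Symbolic bound derived from the ranking function, in the input x0."""
    return x0 * (3 + 1)

# The concrete step count never exceeds the derived bound.
for x0 in range(6):
    assert run_vass(x0, 0) <= bound(x0)
```

Multiplying `bound(x0)` by a per-step resource measure (e.g., energy per instruction) then gives the resource bound, as in the abstract above.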
Probabilistic Output Analysis for discrete resource programs
Maja H Kirkeby
Probabilistic output analysis is a static analysis of programs that, given a probability distribution of the input, computes a probability distribution of the (program) output. We present such a transformation-based analysis, intended for analyzing discrete resource usage (e.g., time complexity, number of operations) of simply typed first-order functional programs. We specify the functional language that describes the input programs and a meta-language for representing probability distributions as programs. We define a set of input programs for which we can derive accurate output probability distributions as closed-form expressions and describe the essential transformation steps. Finally, we present results from our implementation and discuss different over-approximation techniques that may be used to obtain output distributions in closed form.
Joint work with Mads Rosendahl
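The core idea, pushing an input distribution forward through a resource measure, can be sketched as follows (our own toy example in Python, not the paper's meta-language; all names are assumptions):

```python
# Push a discrete input distribution forward through a resource measure
# to obtain the output distribution of a program's step count.
from collections import defaultdict
from fractions import Fraction

def steps(n):
    """Resource measure: additions performed by a first-order function
    that sums 0..n-1 (one addition per loop iteration)."""
    return n

def output_distribution(input_dist, resource):
    """Map each input's probability mass to the resource value it yields."""
    out = defaultdict(Fraction)
    for n, p in input_dist.items():
        out[resource(n)] += p
    return dict(out)

# Uniform input distribution over n in {1, ..., 4}.
input_dist = {n: Fraction(1, 4) for n in range(1, 5)}
out = output_distribution(input_dist, steps)
assert sum(out.values()) == 1
assert out[3] == Fraction(1, 4)
```

For the program classes treated in the talk, this pushforward is computed symbolically as a closed-form expression rather than by enumeration, with over-approximation where an exact closed form is out of reach.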
Determinacy in static analysis for jQuery
The techniques are implemented in the TAJS analysis tool and evaluated on a collection of small programs that use jQuery. Our results show that the proposed analysis techniques boost both precision and performance, specifically for inferring type information and call graphs.
Joint work with Anders Møller