Research


Theory and Tools of Compiler Construction

Initially, the main focus was on formally proving the correctness of efficient parsing algorithms, in particular on the elimination of chain reductions from practical LR parsing automata (SLR(1), LALR(1), Pager's variant). The results of this research are presented in my (German) Ph.D. thesis (1982), entitled Theorie kettenfreier LR-Zerteiler (Theory of Chain-Free LR Parsers).

An extension of the well-known Earley algorithm to a wider class of grammars, providing both complement and intersection operations, was described in a paper published in Theoretical Computer Science in 1991.

The (again, German) textbook Syntaxbasierte Programmierwerkzeuge (Syntax-Based Programming Tools), published in 1995 by Teubner Verlag Stuttgart, describes the theory and practice of compiler construction tools, with special emphasis on applying these tools outside their primary field of programming language translation.

Other publications are about the visual compiler-compilers SIC and Jaccie (see Projects), which have been developed since 1989.

[ Publications ]


Programming Methodology

Based on the fold-and-unfold technique, which was developed by Burstall and Darlington for deriving algorithms from formal specifications, systematic experiments were undertaken in the fields of transitive-closure computation and of special approaches to sorting.
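To illustrate the style of derivation, here is a minimal Haskell sketch of a classic fold/unfold development in the Burstall-Darlington manner; it is a textbook example, not taken from the publications mentioned above. A quadratic specification of list reversal is improved to a linear algorithm by introducing an auxiliary ("eureka") definition, unfolding it against the specification, simplifying, and folding the result back.

    -- Specification: obviously correct, but quadratic because of
    -- the repeated use of (++).
    revSpec :: [a] -> [a]
    revSpec []       = []
    revSpec (x : xs) = revSpec xs ++ [x]

    -- Eureka definition:   revAcc xs ys = revSpec xs ++ ys
    -- Unfolding revSpec on the right-hand side and simplifying
    -- with the associativity of (++) yields a recursion that can
    -- be folded back into revAcc itself:
    revAcc :: [a] -> [a] -> [a]
    revAcc []       ys = ys                  -- unfold: [] ++ ys = ys
    revAcc (x : xs) ys = revAcc xs (x : ys)  -- unfold, rearrange, fold

    -- The derived linear algorithm instantiates the eureka
    -- definition with ys = [].
    revFast :: [a] -> [a]
    revFast xs = revAcc xs []

Each transformation step is a simple equational rewrite, so the derived revFast is correct by construction relative to revSpec.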

The nondeterministic control abstractions described in my Habilitation thesis (2001) offer a novel approach to avoiding the overspecification of execution order in formal program development. Using control abstractions, families of related algorithms can be reduced to their very essence; moreover, a single proof suffices to verify all members of such a family.
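The following hypothetical Haskell sketch (my own illustration, not taken from the thesis) conveys the general idea: a sorting scheme that repeatedly swaps some adjacent out-of-order pair, where the choice of which pair is deliberately left open. Each strategy for resolving that choice yields one member of an algorithm family, while a single argument (every swap preserves the multiset of elements and strictly decreases the number of inversions) establishes correctness and termination for all of them at once.

    -- A strategy resolves the nondeterminism by picking one of
    -- the currently enabled steps.
    type Choice = [Int] -> Int

    -- Indices of adjacent out-of-order pairs (the enabled steps).
    enabled :: Ord a => [a] -> [Int]
    enabled xs = [ i | (i, (x, y)) <- zip [0 ..] (zip xs (tail xs)), x > y ]

    -- Swap the elements at positions i and i+1.
    swapAt :: Int -> [a] -> [a]
    swapAt i xs = case splitAt i xs of
      (pre, x : y : post) -> pre ++ y : x : post
      _                   -> xs

    -- The control abstraction: while some step is enabled,
    -- perform one of them; which one is left to the strategy.
    sortBySwaps :: Ord a => Choice -> [a] -> [a]
    sortBySwaps choose xs = case enabled xs of
      []    -> xs
      steps -> sortBySwaps choose (swapAt (choose steps) xs)

    -- Two members of the family, differing only in the strategy.
    leftmost, rightmost :: Ord a => [a] -> [a]
    leftmost  = sortBySwaps head  -- resembles bubble sort
    rightmost = sortBySwaps last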

Other publications in this area are about applying these formal program development techniques to the object-oriented paradigm. A series of more recent publications reports on experiences with using the Java framework SalesPoint (see Projects) for teaching software engineering to beginners. SalesPoint is a joint development, started in 1997, of our Institute and the Institute for Software and Multimedia at the Technical University in Dresden.

[ Publications ]


Document Processing and Long Term Archiving

A number of publications are about the consistency of heterogeneous archives of digital documents, where for practical reasons inconsistencies are tolerated temporarily. Tools are provided that find such inconsistencies and generate low-cost repair suggestions.
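As a hypothetical illustration (the data model and the repair heuristic here are deliberately simplistic stand-ins, not the techniques of the publications), a consistency check of this kind can be sketched in a few lines of Haskell: every reference must name an existing document, and for a dangling reference an existing name that matches up to letter case is offered as a cheap repair.

    import Data.Char (toLower)
    import qualified Data.Map as Map

    -- An archive maps each document name to the names it references.
    type Archive = Map.Map String [String]

    -- Report every dangling reference, together with a repair
    -- suggestion: an existing name that matches up to letter case.
    inconsistencies :: Archive -> [(String, String, Maybe String)]
    inconsistencies archive =
      [ (doc, ref, suggest ref)
      | (doc, refs) <- Map.toList archive
      , ref         <- refs
      , not (Map.member ref archive) ]
      where
        byLowerCase = Map.fromList
          [ (map toLower name, name) | name <- Map.keys archive ]
        suggest ref = Map.lookup (map toLower ref) byLowerCase

For example, in an archive where a document references "intro" but only "Intro" exists, the check reports the dangling reference together with the suggestion "Intro".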

A monograph published by dpunkt Verlag in 2003 and other publications discuss different approaches to the problem of archiving digital documents for a very long time (say, more than 100 years). For the long-term preservation of digital documents, the main obstacle is the ephemerality of document formats, which in turn is caused by the rapid succession of innovation cycles in computing. The slides of a (German) survey lecture on long-term archiving (Elektronische Langzeitarchivierung (pdf)) are available for download.

[ Publications ]