Contents of this page: Administrative matters | Technical presentations. (Last updated 2nd November 2001.)
We are appreciative of the careful planning that caused the one day of good weather to coincide with our study of Dutch marine lifeforms - the more so since, in order to perform the aforesaid study, the group had to brave climatic conditions on an up-and-duin bike ride.
There will be no ordinary meeting attached to the Working Conference; it was decided that the one-day meeting at Le Bischenberg had not been a success. Therefore the Working Conference will not be numbered in the usual sequence.
The local organizers will be Robert Dewar and Annie Liu. Three possible venues were discussed:
The WG2.1 chair will consider the trends represented in the group, and nominate one author for each.
No summary received yet.
Transforming recursion into iteration eliminates the use of stack frames during program execution, and has been studied extensively. This talk describes a powerful and systematic method, based on incrementalization, for transforming general recursion into iteration: identify an input increment, derive an incremental version under the input increment, and form an iterative computation using the incremental version. Exploiting incrementalization yields iterative computation in a uniform way, and also allows additional optimizations to be explored cleanly and applied systematically; in most cases the resulting iterative programs use constant additional space (an asymptotic reduction in space usage) and run much faster. I will summarize major optimizations, complexity improvements, and performance measurements. One surprise we found is that the classical tail-recursive factorial function that uses an accumulating parameter can be slower than the recursive version, even though it eliminates the stack overhead.
This is joint work with Scott Stoller. The slides are available as a working document.
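As a concrete illustration of the two factorial versions the summary compares, here is a minimal Haskell sketch (the function names are mine, not from the talk):

```haskell
-- Plain recursive factorial: each call pushes a stack frame.
factRec :: Integer -> Integer
factRec 0 = 1
factRec n = n * factRec (n - 1)

-- The classical tail-recursive version with an accumulating parameter:
-- no stack growth, but the multiplications happen in a different order,
-- which can change the cost of the large-integer arithmetic involved.
factAcc :: Integer -> Integer
factAcc n = go n 1
  where
    go 0 acc = acc
    go k acc = go (k - 1) (acc * k)
```

Both compute the same function; the talk's point is that eliminating the stack does not by itself guarantee a speedup.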
No summary received yet.
Consider a large, possibly heterogeneous collection of `nodes' having some (typically modest) computing power, and some of which are equipped with sensors or actuators. Large here means up to 100,000 nodes. Nodes can communicate `locally'. Both nodes and communication can fail. The problem is: How can we program such a collection to realize real-time fault-tolerant and adaptive distributed applications? The new Consona project at Kestrel Institute tries to develop model-based methods and tools for the goal-oriented design and synthesis of such applications and services.
The slides are available.
Abstract interpretation is a technique for safely approximating the behaviour of computer programs. It is used widely in compiler optimisation and, increasingly, as a basis for model checking. Theorems for free is a technique for deriving properties of parametrically polymorphic functions from their types. This talk is about combining abstract interpretation with theorems-for-free. We introduce the use of Galois connections in abstract interpretation, and we show how given base Galois connections can be extended to Galois connections at arbitrary type. It is then shown that theorems-for-free establishes the safety of the abstract interpretations.
This is joint work with Kevin Backhouse. The slides are available, as is a paper.
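As a toy illustration of the kind of safety property at stake (my example, not from the talk): abstracting sets of integers by their parity gives a Galois connection into a three-point lattice, and the abstract addition below safely approximates concrete addition.

```haskell
-- Abstract domain: Even and Odd, with TopP above both (parity unknown).
data Parity = Even | Odd | TopP deriving (Show, Eq)

-- The abstraction half of the Galois connection, on finite non-empty sets.
alpha :: [Int] -> Parity
alpha xs
  | all even xs = Even
  | all odd  xs = Odd
  | otherwise   = TopP

-- Abstract addition: a safe approximation of pointwise concrete addition.
addP :: Parity -> Parity -> Parity
addP Even Even = Even
addP Odd  Odd  = Even
addP Even Odd  = Odd
addP Odd  Even = Odd
addP _    _    = TopP
```

Safety here means that abstracting the concrete sums never yields more precise information than the abstract operation, e.g. `alpha [x + y | x <- [1,3], y <- [5]]` and `addP (alpha [1,3]) (alpha [5])` agree on `Even`.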
No summary received yet.
The subject of this thesis is the construction of programming languages suitable for the implementation of program transformation systems. First class rules and generic traversals are especially useful in such languages. Stratego, a language specifically intended for program transformations, supports these features, but is untyped and impure. In this thesis we develop a pure non-strict functional language called RhoStratego, incorporating features from Stratego. First class rules are obtained through the equivalent of Stratego's left-biased choice operator. This approach is not only useful for strategic programming, but is also more powerful than existing proposals to extend pattern matching, such as views and pattern guards. Stratego's generic traversal primitives are implemented through a more fundamental mechanism, the application pattern match, whereby constructed values can be deconstructed in a generic and typeable fashion. We present the syntax and semantics of the language, as well as the semantics of a strict variant.
Furthermore, we have developed a type system for RhoStratego, which consists of the Hindley-Milner type system extended with rank-2 polymorphism and typing rules to support generic traversals. The type system is powerful enough to allow, and ensure the safety of, type unifying and type preserving generic transformations. We have implemented a type checker that infers all types, except rank-2 types for which annotations must be given.
The slides are available, as is the thesis on which the talk was based.
No summary received yet, but the slides are available.
[HYPERBOLE ON...] Object-oriented programming has failed to live up to its promises of reuse and compositionality: implementation inheritance breaks encapsulation. Component-oriented programming, based on composition and delegation rather than inheritance and overriding, is the way to go. [...HYPERBOLE OFF.] Various groups researching software architecture, coordination languages, and distributed systems have proposed the separation of computation and coordination as a paradigm for component-oriented programming. Computation is achieved by old-fashioned sequential components; coordination of these components is achieved by the superposition of connectors (first-class entities in their own right) onto the components. Maibaum and Fiadeiro, among others, have shown how to make formal sense of the resulting configurations using pushouts or colimits.
We explain the ideas and motivation, and show how superposition captures exactly what visual application development environments do with technologies like JavaBeans. This is very definitely work in progress.
Danvy has shown how to implement a variant of C's printf function in a statically typed language. We present a new solution to this problem based on the notion of a functor. Our solution enjoys nice algebraic properties and is more direct than the original one. Along the way, we encounter singleton types, type-indexed values, type-indexed types, multiple-parameter type classes, and functional dependencies.
The slides and a paper are available as working documents.
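The flavour of the problem can be seen in the following continuation-based sketch in the style of Danvy's original solution (this is my illustration, not the functorial solution of the talk): format directives are functions, glued together with ordinary function composition, and the type of printf is computed from the format itself.

```haskell
-- Each directive extends the output string and shapes the result type.
lit :: String -> (String -> a) -> String -> a
lit s k out = k (out ++ s)

str :: (String -> a) -> String -> String -> a
str k out x = k (out ++ x)

int :: (String -> a) -> String -> Int -> a
int k out n = k (out ++ show n)

-- Run a format: the number and types of arguments follow from the format.
printf :: ((String -> String) -> String -> a) -> a
printf fmt = fmt id ""
```

For example, `printf (str . lit " is " . int) "Pat" 7` has type `String` because the format `str . lit " is " . int` demands one string and one integer argument.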
No summary received yet, but the paper was published in the Haskell Workshop, 2001.
In earlier work, a recursion scheme for Greedy-like algorithms was based on a general notion of a `part-of' relation in monoids. It turns out that this can be generalized in two ways:
- The approach works for general partial orders and offers a simpler treatment of matroids.
- We can abstract from relations to elements of Kleene algebras, thus making the treatment even more schematic.
Matroids require a fairly strong closure assumption. Therefore they have been generalized to greedoids; these, too, can be dealt with in a simpler fashion using Kleene algebra.
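For readers unfamiliar with the algebraic setting, here is a minimal Kleene algebra signature in Haskell, together with the two-element instance (a sketch of mine; the talk works with richer instances such as relations and regular languages):

```haskell
-- Choice, sequential composition, iteration, and their units.
class KleeneAlgebra a where
  zero, one    :: a
  (<+>), (<.>) :: a -> a -> a
  star         :: a -> a

-- The two-element Kleene algebra: truth values.
instance KleeneAlgebra Bool where
  zero   = False
  one    = True
  (<+>)  = (||)
  (<.>)  = (&&)
  star _ = True  -- x* always includes the "empty step", hence True
```

The same signature is satisfied by relations (union, composition, reflexive-transitive closure), which is what makes the schematic treatment of greedy algorithms possible.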
At WG2.1 Meeting 55, I presented necessary and sufficient conditions for when a total function can be written as a fold or as an unfold. I gave a complicated set-theoretic proof of the result. Since then, we have generalized the result to partial functions as well. I presented ugly set-theoretic proofs of both results, and an elegant relational proof of the result for total functions.
I didn't know how to adapt the elegant relational proof to the result for partial functions. Roland Backhouse showed how to simplify what I had, but we still can't complete the proof.
This is joint work with Graham Hutton and Thorsten Altenkirch.
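To fix notation (standard list examples of mine, not the talk's general setting), "written as a fold" and "written as an unfold" mean the following:

```haskell
import Data.List (unfoldr)

-- total is written as a fold: it consumes a list one constructor at a time.
total :: [Int] -> Int
total = foldr (+) 0

-- countdown is written as an unfold: it produces a list one step at a time.
countdown :: Int -> [Int]
countdown = unfoldr step
  where
    step 0 = Nothing
    step n = Just (n, n - 1)
```

The theorem characterizes exactly which (total, and now partial) functions admit such representations.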
A permutation phrase is a sequence of elements (possibly of different types) in which each element occurs exactly once and the order is irrelevant. Some of the permutable elements may be optional. We show a way to extend a parser combinator library with support for parsing such free-order constructs. A user of the library can easily write parsers for permutation phrases and does not need to care about checking and reordering the recognised elements. Possible applications include the generation of parsers for attributes of XML tags and Haskell's record syntax.
This is joint work with Arthur Baars and Doaitse Swierstra. An implementation is available on the web. The slides and a paper are also available.
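A self-contained miniature of the idea, using permutation trees in the style of the paper (simplified names; the published library's interface may differ):

```haskell
{-# LANGUAGE ExistentialQuantification #-}
import Control.Applicative

-- A basic list-of-successes parser.
newtype P a = P { runP :: String -> [(a, String)] }

instance Functor P where
  fmap f (P p) = P (\s -> [(f x, r) | (x, r) <- p s])

instance Applicative P where
  pure x = P (\s -> [(x, s)])
  P pf <*> P px = P (\s -> [(f x, r') | (f, r) <- pf s, (x, r') <- px r])

instance Alternative P where
  empty = P (const [])
  P p <|> P q = P (\s -> p s ++ q s)

char :: Char -> P Char
char c = P f
  where f (x : xs) | x == c = [(c, xs)]
        f _ = []

-- Permutation trees: each branch fixes which element is parsed first;
-- the semantic function is reassociated (via flip) so the result does
-- not depend on the order in which elements appear in the input.
data Perms a = Perms (Maybe a) [Branch a]
data Branch a = forall b. Branch (Perms (b -> a)) (P b)

mapPerms :: (a -> b) -> Perms a -> Perms b
mapPerms f (Perms def bs) = Perms (fmap f def) (map g bs)
  where g (Branch t p) = Branch (mapPerms (f .) t) p

(<$$>) :: (a -> b) -> P a -> Perms b
f <$$> p = Perms (Just f) [] <||> p

(<||>) :: Perms (a -> b) -> P a -> Perms b
t@(Perms _ bs) <||> p = Perms Nothing (Branch t p : map ins bs)
  where ins (Branch t' p') = Branch (mapPerms flip t' <||> p) p'

permute :: Perms a -> P a
permute (Perms def bs) = foldr (<|>) (maybe empty pure def) (map branch bs)
  where branch (Branch t p) = (\x f -> f x) <$> p <*> permute t
```

For example, `runP (permute ((,,) <$$> char 'a' <||> char 'b' <||> char 'c')) "bca"` recognises the three characters in any order and always delivers them in the declared order.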
Pursuing a suggestion by Ernie Cohen, I presented a brief generalization of Richard Bird's board-tiling presentation, suitable for tiling arbitrary boards with tiles of arbitrary shapes, stated in the relational calculus. I believe it went something like this: let E be the empty-board relation, P the relation between legal board positions (the left board is the right board with more tiles on it), and F the full-board relation. Obviously, the composition F P^n E must hold (for some n) for each legal tiling. Then, using Richard's formulation, namely that we seek Q = SP where S is a subrelation of the identity, we now want to establish that F Q^n E holds for some value of n. Such a sequence of Qs is referred to as a safe partial tiling. No solution based on this specification was proposed, but it should admit alternative tiling strategies to the one Richard used, even when specialized to dominoes and the chess board.
No summary received yet.
This was a presentation on typical industry procedures for generating safety critical certified software.
No summary received yet.
I presented an introduction to omega algebra (an extension of Kleene algebra to omega-regular languages) and its use in the transformation of concurrent programs.
This presentation is concerned with typed generic higher-order functions of the following kind:
- The functions can be applied to terms of any type,
- they admit generic traversal into subterms, and
- they also admit type-safe dynamic update.
These properties are sufficient to define generic function combinators, especially combinators representing traversal schemes. We discuss two models of such generic functions within Haskell. One model is based on universally quantified parameters of datatype constructors. The other one makes hidden use of a universal representation type in a completely safe manner. In both cases, we rely on a very simple term interface.
By contrast, the required form of genericity cannot be achieved with top-level quantified polytypic type parameters as present in PolyP or Generic Haskell. Type cases, required for generic function update (see the third property above), are also not supported in those languages.
We present a combinator library for generic functions. We employ our style of generic programming for program transformation, namely for the encoding of program refactorings.
A paper draft, the combinator library, and generative support for the term interface are readily available at the Strafunski project web page. This is joint work with Joost Visser.
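To make the three properties concrete, here is a tiny sketch of mine (not the Strafunski API) of a type-preserving traversal scheme over a very simple term interface, with type-safe update via a type case:

```haskell
{-# LANGUAGE RankNTypes #-}
import Data.Typeable

-- A very simple term interface: apply a generic function to the
-- immediate subterms of a value.
class Typeable a => Term a where
  gmapT :: (forall b. Term b => b -> b) -> a -> a

-- A traversal scheme: apply f everywhere, bottom-up.
everywhere :: Term a => (forall b. Term b => b -> b) -> a -> a
everywhere f x = f (gmapT (everywhere f) x)

-- Type-safe dynamic update: extend the identity with one monomorphic case.
mkT :: (Typeable a, Typeable b) => (b -> b) -> a -> a
mkT f x = case cast f of
            Just g  -> g x
            Nothing -> x

data Exp = Num Int | Add Exp Exp deriving (Show, Eq)

instance Term Int where
  gmapT _ n = n
instance Term Exp where
  gmapT f (Num n)   = Num (f n)
  gmapT f (Add l r) = Add (f l) (f r)

-- A refactoring-style rewrite, applicable at any node of any term:
simp :: Exp -> Exp
simp (Add (Num 0) e) = e
simp e = e
```

Here `everywhere (mkT simp)` can be applied to a term of any `Term` type; it traverses into subterms and updates only those of type `Exp`.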
The realm of this presentation is ``basic concepts for the design of programming languages''. We consider it as the major challenge of future research on language design to combine the benefits of different successful paradigms -- e.g. functional programming, object-oriented programming, logic programming -- into a common development framework. Ideally, such a framework should also acknowledge modern developments in Software Engineering such as patterns, components, and software architecture. And all this should be accomplished without creating monstrous languages like Algol68, PL/1 or Ada. Evidently, such a wish list is easy to formulate but hard to realize. But we believe that it provides a valuable direction for our future research efforts.
In this talk we present some facets of our work towards this goal. To this end, we try to elaborate the most fundamental design issues in the development of languages. In doing so, we concentrate on three topics: Typing, modularization with scoping, and the integration of state into a functional view. (The talk concentrates on the first one of these issues.) It turns out that a few basic principles indeed suffice to describe a whole plethora of well-known programming features. This gives hope that an orthogonal language design is possible, which achieves a considerable portion of the above-mentioned goals.
No summary received yet. This presentation concerns work by Juan José Cabezas.
It is well known that the square matrices whose entries come from a Kleene algebra themselves form a Kleene algebra. Varying-size matrices can also be handled, giving heterogeneous Kleene algebras. In addition, it is known that such matrices can be used to represent automata. We recall the operations that can be applied to such matrices, including residuation and domain. We also present examples showing how automata are represented, and how modal formulae can be interpreted over a matrix by means of algebraic expressions involving the given matrix.
The main insight of the talk is that matrices of types (i.e., elements below 1) are relations, to which the standard relational operators can be applied. Two applications of this fact are presented:
- Simulations and bisimulations between automata can be described purely by using matrices, whereas the standard description involves automata on the one hand and relations on the other, that is, heterogeneous objects that are not easily mixed in calculations.
- Pairs of (relational) matrices that form projections can be used to form the product of automata.
The talk concludes by describing the problem of controller synthesis, where all the concepts presented come together. It is hoped that these concepts will help in finding elegant solutions to the problem.
The slides are available as a working document.
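As a small illustration of automata as matrices (a sketch of mine over the Boolean Kleene algebra; the talk works with general Kleene algebras): the transition relation of an automaton is a square Boolean matrix, matrix product is relational composition, and the Kleene star computes reachability.

```haskell
import Data.List (transpose)

-- Square Boolean matrices: the Kleene algebra of relations on a finite set.
type Mat = [[Bool]]

-- Matrix product = relational composition.
mmul :: Mat -> Mat -> Mat
mmul a b = [[or (zipWith (&&) row col) | col <- transpose b] | row <- a]

-- Matrix sum = relational union.
madd :: Mat -> Mat -> Mat
madd = zipWith (zipWith (||))

identity :: Int -> Mat
identity n = [[i == j | j <- [1 .. n]] | i <- [1 .. n]]

-- The Kleene star: reflexive-transitive closure, computed as a fixpoint.
starM :: Mat -> Mat
starM a = go (identity (length a))
  where go m = let m' = madd m (mmul m a)
               in if m' == m then m else go m'
```

For an automaton with transitions 0 -> 1 -> 2, `starM` reports that state 2 is reachable from state 0 but not conversely.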
No summary received yet.
No summary received yet.