ULTRA group: Useful Logics, Types, Rewriting, and Applications
UKII UK Institute of Informatics, UK |
In the past twenty years, a very rich body of results has emerged in type theory and term rewriting systems. This is unsurprising since these topics are at the heart of the theory and implementation of programming languages. Type theory enables increasing expressiveness in programming languages and in theorem proving, whilst retaining strong theoretical and applied properties. Understanding termination and reduction strategies in term rewriting systems provides insights into increasingly efficient implementations with safe theoretical foundations. Furthermore, the two notions of type theory and term rewriting are very general and cover various topics including termination, calculi of substitutions, subtyping, reduction, unification, interaction and process calculi, the logic of programs and theorem proving.
Unfortunately, collaboration between type theory and term rewriting systems is not as strong as it should be despite their common goals and great importance. The International School on Type Theory and Term Rewriting aims at bridging this gap and at bringing together leaders and active researchers in the various branches of type theory and term rewriting. The school is beneficial to all active researchers and research students in the area. Talks will range from overviews of recent results, to presentations of open problems, to discussions of future research in the area. The list of invited speakers includes: Jos Baeten, Steffen van Bakel, Henk Barendregt, Gilles Barthe, N.G. de Bruijn, Roberto Di Cosmo, Michael Fourman, Herman Geuvers, John Glauert, Thérèse Hardin, Fairouz Kamareddine, Richard Kennaway, Assaf Kfoury, Zurab Khasidashvili, Jan Willem Klop, Pierre Lescanne, Jean-Jacques Lévy, Giuseppe Longo, Ursula Martin, Paul-André Melliès, Christine Paulin-Mohring, Adolfo Piperno, Gordon Plotkin, Alejandro Ríos, Simona Ronchi Della Rocca, Judith Underwood, Roel de Vrijer, Joe Wells.
Thursday 12 September
---------------------
Chair: Fairouz Kamareddine
 9:00- 9:45  N.G. de Bruijn
 9:45-10:00  Discussion
10:00-10:45  Henk Barendregt
10:45-11:00  Discussion
11:00-11:30  COFFEE BREAK

Chair: Roy Dyckhoff
11:30-12:15  Fairouz Kamareddine
12:15-12:30  Discussion
12:30-14:00  LUNCH

Chair: Randy Pollack
14:00-14:30  Gilles Barthe
14:30-15:00  Alejandro Rios
15:00-15:30  Herman Geuvers
15:30-16:00  COFFEE BREAK

Chair: Steffen van Bakel
16:00-16:45  Jos Baeten
16:45-17:00  Discussion

Friday 13 September
-------------------
Chair: Jan-Willem Klop
 9:00- 9:45  Therese Hardin
 9:45-10:00  Discussion
10:00-10:45  Michael Fourman
10:45-11:00  Discussion
11:00-11:30  COFFEE BREAK

Chair: Zhaohui Luo
11:30-12:15  Joe Wells
12:15-12:30  Discussion
12:30-14:00  LUNCH

Chair: Franco Barbanera
14:00-14:45  Jean-Jacques Levy
14:45-15:00  Discussion
15:00-15:30  Paul-Andre Mellies
15:30-16:00  COFFEE BREAK

Chair: Pierre Lescanne
16:00-16:45  Gordon Plotkin
16:45-17:00  Discussion

Saturday 14 September
---------------------
Chair: Maribel Fernandez
 9:00- 9:45  Pierre Lescanne
 9:45-10:00  Discussion
10:00-10:30  Steffen van Bakel
10:30-11:00  Roel de Vrijer
11:00-11:30  COFFEE BREAK

Chair: Paul Jackson
11:30-12:15  Christine Paulin
12:15-12:30  Discussion
12:30-13:00  Judith Underwood
13:00-14:00  LUNCH

Chair: Kristoffer Rose
14:00-14:45  Jan-Willem Klop
14:45-15:00  Discussion
15:00-15:30  Roberto Di Cosmo
15:30-16:00  COFFEE BREAK

Chair: Henk Barendregt
16:00-16:45  Assaf Kfoury
16:45-17:00  Discussion

Sunday 15 September
-------------------
Chair: Adriana Compagnoni
 9:00- 9:45  Giuseppe Longo
 9:45-10:00  Discussion
10:00-10:45  Adolfo Piperno
10:45-11:00  Discussion
11:00-11:30  COFFEE BREAK

Chair: Olivier Danvy
11:30-12:15  Ursula Martin
12:15-12:30  Discussion
12:30-13:00  John Glauert
13:00-14:00  LUNCH

Chair: Ursula Martin
14:00-14:45  Richard Kennaway
14:45-15:00  Discussion
15:00-15:30  Zurab Khasidashvili
15:30-16:00  COFFEE BREAK

Chair: Philippe Audebaud
16:00-16:45  S. Ronchi Della Rocca
16:45-17:00  Discussion
Sequential data algebra
Jos Baeten
Eindhoven University of Technology
---------------------------------
(joint work with Jan Bergstra (Univ. of Amsterdam and Utrecht) and Alex
Sellink (Univ. of Amsterdam))
We propose a style of data type specification based on four valued logic.
Four valued logic has constants for true, false, meaningless and divergent;
this makes it possible to deal with error values and non-terminating computations.
The word sequential indicates that all operators come with a definite order of
evaluation, and thus a normalization strategy. The aim is to avoid endless
duplication and reinvention in the practice of algebraic specification.
As an example, we study the specification of floating-point numbers.
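To make the four-valued idea concrete, here is a minimal Haskell sketch (our own
illustration, not the specification from the talk; the module, type and function
names are ours) of a four-valued truth type with a sequential conjunction whose
left argument is always inspected first:

    module FourValued where

    -- The four truth values: true, false, meaningless, divergent.
    data Four = T | F | M | D
      deriving (Eq, Show)

    -- A sequential ("left-first") conjunction: the left argument is always
    -- inspected first, so the order of evaluation is fixed, and a meaningless
    -- or divergent left argument propagates to the result.
    andSeq :: Four -> Four -> Four
    andSeq T y = y     -- true: the result is decided by the right argument
    andSeq F _ = F     -- false: decided without inspecting the right argument
    andSeq M _ = M     -- meaningless propagates
    andSeq D _ = D     -- divergent propagates

Fixing such an evaluation order for every operator is what yields the
normalization strategy mentioned above.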
----------------------------------------------------------------------------
Normalisation and approximation results for term rewriting systems
(with abstraction and \beta-rule) typeable using intersection types.
Steffen van Bakel
Imperial College, London
-------------------------------------------------------------------
Research performed in cooperation with Maribel Fernández, of École Normale
Supérieure, Paris, and Franco Barbanera, University of Turin, Italy.
The talk will start with a short discussion of intersection type assignment
for Lambda Calculus, together with its major properties:
- a filter model,
- a strong normalization result,
- a head-normalization result,
- an approximation theorem.
We are interested in defining a filter-like semantics for other computational
models, and our starting point was to have a look at first order term rewriting
systems. We start with the definition of a notion of type assignment for
first order rewrite systems, extended with a notion of application, for
which we proved a subject reduction result.
We will present the restrictions put on recursive definitions in order to
obtain the strong normalization and the head-normalization results. We then
will define a notion of approximants for terms in our rewrite system; the same
restrictions as before on recursion are also sufficient for the approximation
result. We then look at a calculus that combines term rewriting and
beta-reduction, for which similar results are obtainable.
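For readers unfamiliar with intersection types, a standard example (ours, not
taken from the talk) illustrates the extra power they give over simple types:
the self-application $\lambda x.\,x\,x$, which has no simple type, is typeable by
letting the two occurrences of $x$ use different components of an intersection,
  $\lambda x.\,x\,x \;:\; ((\sigma \to \tau) \cap \sigma) \to \tau$,
and it is this finer bookkeeping of how arguments are used that drives results
such as the strong normalization theorem mentioned above.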
-------------------------------------------------------------------------
Efficient Computations in Formal Proofs
Henk Barendregt
Catholic University of Nijmegen
-------------------------------------------
Formal proofs in mathematics and computer science are being
studied because these objects can be verified by a very simple computer
program. An important open problem is whether
these formal proofs can be generated with an effort not much greater
than writing a mathematical paper in, say, \LaTeX.
Modern systems for proof-development make the formalization of
reasoning relatively easy. Formalizing computations such that the
results can be used in formal proofs is not immediate.
In this talk it is shown how to obtain formal proofs of
statements like $\mathsf{Prime}(\num{61})$ in the context of Peano
arithmetic or $x^2-1=(x-1)(x+1)$ in the context of rings. It is hoped
that the method will help bridge the gap between systems of computer
algebra and systems of proof-development.
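For instance, the ring example above amounts to certifying the short computation
  $(x-1)(x+1) \;=\; x^2 + x - x - 1 \;=\; x^2 - 1$,
the aim, as the title suggests, being to obtain such equational facts inside the
proof checker by computation at reasonable cost.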
-----------------------------------------------------------------------------
Classical Pure Type Systems
Gilles Barthe
CWI, Amsterdam
----------------------------
Since Griffin discovered that natural deduction systems for classical
logics correspond to typed lambda-calculi with control operators via the
Curry-Howard isomorphism, many such systems have appeared.
We introduce classical pure type systems (CPTSs), a framework
enabling a general study and systematic classification of such
calculi; akin to Barendregt's lambda-cube, we also define a
classical lambda-cube, whose most complex system is a Classical
Calculus of Constructions.
We briefly sketch the meta-theory of CPTSs: confluence, subject reduction,
strong normalisation. We also discuss a CPS-translation from the
classical lambda-cube to Barendregt's lambda-cube.
----------------------------------------------------------------------------
Features of Automath
N.G. de Bruijn.
Eindhoven University of Technology
----------------------------------
Various proof verification systems, at least those
which combine typed lambda calculus with the idea of
proof objects, have much in common with old Automath. But
Automath has some features that were not taken over by
others and nevertheless have technical and conceptual
advantages. In particular this refers to the book
structure of PAL, the lambda-free fragment of Automath.
Instead of replacing that structure by lambda formalism,
Automath maintained it, thus providing a double mechanism
for functionality. One of the effects of keeping the PAL
structure is the natural way to internalize definitions, a
thing that has to be given up if the PAL mechanism is
replaced by typed lambda calculus. It can be reclaimed if
ordinary typed lambda calculus is replaced by the only
slightly different modification called delta-lambda.
----------------------------------------------------------------------------
A brief history of rewriting with extensionality
Roberto Di Cosmo
Ecole Normale Superieure, Paris
---------------------------------------------------
Over the past years, much work has been done on typed lambda
calculi with extensional axioms turned into expansive rules
rather than into contractive ones. We will survey the rewriting
techniques that have been developed and used for these calculi,
which are a typical example of fruitful interaction between
lambda calculus and rewriting, and we will also survey the most
recent results in the field.
------------------------------------------------------------------------
Indeterminate Structures
Michael Fourman
The University of Edinburgh
----------------------------
We use the notion of a generic object, from topos theory, to model
indeterminate types.
This allows us to give a semantics for SML functors. We describe the
construction of indeterminate
structures, used to model functor parameters, and discuss the features of a
module system based on this semantics.
----------------------------------------------------------------------------
From models of logic to models of typed lambda calculus
Herman Geuvers
Tech. University Eindhoven, NL
---------------------------------------------------------
Type theories with type dependency, like the Calculus of
Constructions, can be used for formalizing mathematical reasoning,
using the formulas-as-types principle, which embeds a logic
into the type theory. Hence, when implemented, such a type theory can
be used as an interactive proof checker. Then it also becomes apparent
that pure logic is a rather `static' thing: functions are usually seen
as relations and hence do not compute. Also, formalising things in
pure logic involves a lot of coding. These drawbacks can be remedied
in type theory by allowing additional constructs (like, for
example, lambda-abstraction, or a scheme for inductive types). Type
theory also allows one to formalize principles that cannot be formalized
in ordinary logic, for example the idea that propositions and sets
are really the same kind of things (types), living in the same universe.
A question that arises here is what all these extra constructs and new
principles mean for the logic (e.g. are they conservative extensions?)
We look at this question by studying how a model of the logic can be
extended to a model of the type theory. In particular, we look at extending
(classical) models of second order logic to models of second order
dependent type theory.
------------------------------------------------------------------------------
Neededness and Relative Normalization
John Glauert
University of East Anglia
--------------------------------------
(This work is performed in the context of Expression Reduction Systems, a
form of higher-order rewriting, and is joint work with Z. Khasidashvili.)
Practical applications of rewriting systems often require reduction to
head-normal forms or other classes of "normal" form. We generalize the
theory of normalization via neededness to normalization relative to
desirable sets of terms, S, having natural properties of stability which
guarantee normalization, whenever possible, by contraction of S-needed
redexes. We also discuss minimal and optimal needed reduction and show that
they need not coincide.
---------------------------------------------------------------------------
What about freeness?
Therese Hardin
INRIA, Paris
-----------------------
In calculi with binders, variable names are well-known to be a source
of problems and De Bruijn notation, which replaces names by numbers
recording bindings, is a recognized way to avoid these
difficulties. It is easy to replace bound variables by numbers. Free
variables can also be replaced by numbers, by implicitly closing the
term, but they may also be kept as names. We compare these two points
of view and explain how to code these free variables as metavariables
in a calculus of explicit substitutions. Then, we show how this coding
may be fruitfully used for unification and compilation.
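As a standard illustration of the notation (our example, not the talk's), with
de Bruijn indices counting enclosing binders outwards from each occurrence,
  $\lambda x.\lambda y.\,x\,(\lambda z.\,z\,y)$   becomes   $\lambda\lambda.\,2\,(\lambda.\,1\,2)$,
while a term with a free variable, such as $\lambda x.\,x\,y$, can either keep the
name, giving $\lambda.\,1\,y$, or be implicitly closed by an extra binder for $y$,
giving $\lambda.\,1\,2$.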
--------------------------------------------------------------------------
Type Theory and Term Rewriting at Glasgow: Past and Future
Fairouz Kamareddine
University of Glasgow
---------------------
This talk describes some of the problems attacked by the type theory and term
rewriting group at Glasgow in the last five years and outlines some open
problems and future goals. The discussion will be directed towards implicitly
versus explicitly typed lambda calculi and the study of their theory and
implementation. I shall concentrate on extending those calculi with explicit
definitions, explicit substitutions, new notions of typing and reductions
and the logical systems based on these calculi. I will also be looking for
future generalisations of this work and its advantages. For example, the
various new notions of reduction can be generalised even further, and this
generalisation can provide a decidable characterisation which is the nearest
approximation to the undecidable notion of reductional equivalence between
terms. It is very interesting indeed that almost all the new notions of
reduction found in the last twenty years by various researchers
(such as Bloo, Kamareddine and Nederpelt; Kamareddine and Nederpelt;
Kfoury and Wells; Klop; Moggi; Nederpelt; Plotkin; Regnier; and others)
remain within this notion of reductional equivalence.
I shall outline the future I foresee for the type theory and
term rewriting group at Glasgow both within Glasgow and at the various
centres with whom we collaborate.
---------------------------------------------------------------------------
Transfinite rewriting
Richard Kennaway
University of East Anglia
--------------------------
We survey the basic concepts and properties of transfinite rewriting for
orthogonal term rewrite systems, lambda calculus, higher-order rewrite
systems, and abstract rewrite systems.
---------------------------------------------------------------------------
A Linearization of the Lambda-Calculus and Applications
Assaf Kfoury
Boston University
----------------------------------------------------------
If every lambda-abstraction in a lambda-term M binds at most one variable
occurrence (M is thus a ``linear'' term), then M is beta-strongly normalizing.
We extend the syntax of the standard lambda-calculus "Lambda" to adapt this
idea to the general case when lambda-abstractions are allowed to bind more
than one variable occurrence.
In the extended calculus, denoted "Lambda/\", a subterm Q of a term M can be
applied to several subterms R1,...,Rk in parallel, which we write as
(Q. R1/\.../\Rk). The appropriate notion of beta-reduction "beta/\" for
this extended calculus is such that, if Q is the lambda-abstraction
(\lambda x.P) with l >= 0 bound occurrences of x, the reduction can be carried
out provided k = \max(l,1). Every M in Lambda/\ is thus beta/\-SN. The
contraction of M in Lambda/\ is a standard term |M| in Lambda, which is
defined provided that, for every subterm of M of the form (Q. R1/\.../\Rk), each
of R1,...,Rk contracts to the same standard term |R1| = ... = |Rk|. We say
that M in Lambda/\ is "fully linear" if the beta/\-normal form N
of M is such that |N| is defined and is a standard term in beta-normal form.
One key lemma states that for every standard term M in Lambda:
M is beta-SN iff there is a fully linear M' in Lambda/\ such that M = |M'|.
We draw several consequences from this result, in particular a new simple
proof for the (known) fact that a (standard) term M is beta-SN iff M can be
assigned a so-called ``intersection'' type (``top'' type disallowed).
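A small worked instance (our own illustration, under the reading that the i-th
bound occurrence of x receives Ri): the term ((\lambda x. x x). R1/\R2)
beta/\-reduces to (R1 R2), and when R1 and R2 both contract to the same standard
term R, its contraction |((\lambda x. x x). R1/\R2)| is the standard term
((\lambda x. x x) R).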
-----------------------------------------------------------------------------
Comparing stable models of deterministic computation
Zurab Khasidashvili
University of East Anglia
-------------------------------------------------------
(This work has been performed in cooperation with John Glauert.)
Event Structures are well established models of concurrent computation.
Here we study Event Structure semantics for Orthogonal Rewrite Systems,
such as the lambda-calculus and its sharing graph model, and of functional
programming languages based on such systems. Conflict-free (or Deterministic)
Prime Event Structures, DPESs, which are simply event sets partially ordered
by a causal dependency relation, are sufficient for this purpose.
We study orthogonal systems in the form of Stable Deterministic Residual
Structures (SDRSs), which are Abstract Reduction Systems with an axiomatized
residual relation. The basic theory of orthogonal systems, such as
normalization, can already be developed in the framework of SDRSs. In order
to interpret SDRSs as DPESs, an equivalence relation on transitions in
SDRSs is required that would relate transitions performing (part of) the
`same' event. We introduce such structures, Deterministic Family Structures
(DFSs), by axiomatizing a family relation on transitions in SDRSs, by analogy
with the family relation in the lambda calculus and other orthogonal
rewrite systems. DFSs can easily be interpreted as DPESs (by taking
redex-families as events and taking the contribution relation on them as
the causal dependency relation on events).
However, DPESs are linear in nature (no event can erase or duplicate
another -- any event can occur at most once in the course of a computation),
and therefore they cannot give an adequate account of erasure of redexes/
transitions in SDRSs. To remedy this situation, we extend DPESs in two ways:
by axiomatizing an erasure relation on events, and by axiomatizing a permutation
equivalence on event configurations. We then define translations between the
two extended event models, non-duplicating SDRSs, and non-duplicating DFSs,
and show that the translations commute, implying isomorphism of the above
set-theoretic and transition models of deterministic computation.
--------------------------------------------------------------------------
Decreasing diagrams
Jan Willem Klop
CWI and Vrije Universiteit, Amsterdam
--------------------------------------
Joint work with Marc Bezem and Vincent van Oostrom.
We discuss confluence criteria for abstract rewriting, notably the method of
decreasing diagrams due to De Bruijn and van Oostrom. This method generalizes
various lemmas in abstract rewriting such as Newman's Lemma, Huet's strong
confluence lemma and several others. We also pay attention to infinite
reduction diagrams.
----------------------------------------------------------------------------
EXPLICIT SUBSTITUTIONS AND EXPLICIT ALPHA CONVERSION
Pierre Lescanne
Inria, Nancy
---------------------------------------------------------
Calculi of explicit substitution internalise the main process of
lambda-calculus, namely substitution. Thus substitution is better
understood and its complexity is better mastered. Actually calculi of
explicit substitution can be divided into two classes, namely those
which use de Bruijn's indices and those which use names. It is well
known that de Bruijn's indices make terms less readable and therefore
names are often prefered by people. But when one uses names, one is
faced to the problem of alpha conversion which is is somewhat hidden
in de Bruijn's indices and one may like to make it explicit either. We
propose a calculus which uses a convention for attributing names based
on their binding depth and introduced as ``levels'' by de Bruijn in
1972. Thanks to levels, alpha conversion can be made explicit in our
calculus. We discuss some of its properties and applications.
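To illustrate the difference (our example, with one common numbering
convention): an index counts enclosing binders outwards from the occurrence,
whereas a level records the position of the binder counting from the outermost
one, so $\lambda x.\lambda y.\,x\,y$ reads $\lambda\lambda.\,2\,1$ with indices
and $\lambda\lambda.\,1\,2$ with levels.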
----------------------------------------------------------------------------
Dependency calculus for the lambda calculus.
Jean-Jacques Levy
Inria, Paris
--------------------------------------------
Vesta is a modeling system with a functional language for
describing software packages. Incremental calculation of dependencies
may be viewed as a simple lambda calculus problem, with an extra
origin calculation. Joint work with Abadi and Lampson.
------------------------------------------------------------------------------
SUBTYPES, GENERICITY AND NORMALIZATION
Giuseppe Longo
Ecole Normale Superieure, Paris
------------------------------------------
A recent logical theory of subtypes is introduced. Its motivation
goes back to the introduction, in a functional frame, of a typical
Object Oriented feature, inheritance, reinterpreted in a specific
and functional way. The presentation in terms of sequent calculi
allows one to formulate and prove a coherence theorem and a
"transitivity elimination" result. The latter may be handled as a
cut-elimination (normalization) result, in the sense of logic. The
implicit challenges will be briefly discussed. The analysis of
cut-elimination and of "injectivity" of subtyping relates to an
apparently unrelated result on parametricity of higher order
lambda-calculi, which in turn gives some insight into the proof of
normalization.
---------------------------------------------------------------------------
Termination, geometry and invariants
Ursula Martin
University of St Andrews
--------------------------------------
Many different techniques have been developed for proving the termination
of programs or functions. Some are purely ad-hoc arguments developed solely
to solve an immediate problem. However a typical termination proof involves
finding a quantity which decreases at each step of a computation, or in
other words finding a well-founded ordering which is respected by the process
we are considering. An extraordinary diversity of well-founded division
orderings on structures such as vectors, strings, partitions or terms has
been devised to prove termination, particularly of rewrite systems.
Such orderings can also be used to constrain the search space in the search
for solutions or normal forms in a wide variety of algorithms, such as
Knuth-Bendix completion (for computation in algebras or groups), Gr\"{o}bner
basis algorithms (for computation in polynomial rings), or resolution. It turns out
that the choice of ordering may have a surprising effect on efficiency.
Recent work has shown that in some circumstances a geometrical classifying
space for the orderings may be found, which allows us to associate
invariants to the orderings and the associated termination problems. We
discuss theorems, examples and questions.
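A classic textbook instance of this recipe (ours, not from the talk): the
Ackermann function
  $A(0,n) = n+1$,  $A(m+1,0) = A(m,1)$,  $A(m+1,n+1) = A(m,\,A(m+1,n))$
terminates because every recursive call is made at a pair of arguments that is
strictly smaller in the lexicographic ordering on pairs of natural numbers, a
well-founded ordering, even though the second argument may grow enormously
along the way.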
------------------------------------------------------------------------------
The lambda sigma calculus is omega deterministic
Paul-Andre Mellies
LFCS, University of Edinburgh
------------------------------------
In the lambda calculus, the normalisation of (fair) call-by-need strategies
is a consequence of the standardisation theorem.
The proof is extended here to non-commutative (= non-orthogonal)
rewriting systems. First we characterise algebraically the subclass of
``external'' computations inside reduction systems, a subclass which
contains all normalising computations. Then we reduce the normalisation
property to the finiteness of the number of permutation classes (see
Jean-Jacques Levy's PhD thesis) from a term to its (possibly several) normal
forms, which is what we call omega-determinism. Finally we show that the
lambda sigma calculus enjoys
the property: as a consequence all (fair) call-by-need strategies
in the lambda sigma calculus are normalising.
-------------------------------------------------------------------------------
Inductive Types in Higher-Order Logic and Type Theory
Christine Paulin-Mohring
LIP, URA CNRS 1398, ENS Lyon, France
------------------------------------------------------
Inductive types and definitions are an important tool when formalising
mathematical concepts. Impredicative systems such as higher-order logic
or impredicative type theories offer the possibility of encoding these objects.
We present a comparison of these encodings with respect to the
class of inductive types that are representable and the
properties that are provable on them. We shall present more precisely the
inductive types as they are represented in the Coq system.
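A standard example of such an encoding (given here for illustration, not
necessarily one used in the talk) is the impredicative type of Church numerals,
  $\mathsf{nat} := \forall X.\; X \to (X \to X) \to X$,
  $0 := \lambda X\,z\,s.\;z$,   $S := \lambda n\,X\,z\,s.\;s\,(n\,X\,z\,s)$,
which supports the constructors and recursion over them, but for which the
induction principle is not derivable in the pure Calculus of Constructions;
differences of exactly this kind are what a comparison of representable types
and provable properties must address.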
-----------------------------------------------------------------------------
Normalization by developments and subtyping
Adolfo Piperno
University of Rome
---------------------------------------------
Normalization proofs for typed lambda-calculi have received
a considerable interest in the last few years.
In particular, investigations have been made to clarify
the relationship between weak and strong normalization,
to refine the notion of development and to give
syntactical tools for studying properties of reduction.
Some of these issues will be reviewed and discussed.
A general method for proving normalization
in typed lambda calculi will be presented, based on the
use of eta-expansions, superdevelopments and subtyping.
----------------------------------------------------------------------------
Compositionality and Universal Type Theory
Gordon Plotkin
University of Edinburgh
---------------------------------------
The principle of compositionality or referential transparency is central to
denotational semantics. Taking this seriously requires a theory of
language. For example, viewing language as a free multi-sorted algebra
enables one to express compositionality homomorphically, as advocated long
ago by the ADJ group. It also provides a uniform notion of translation
between signatures that allows one to pass from the semantics of one
language to that of another.
However such a theory of language is inadequate, and for several reasons.
We consider two of these: the lack of a treatment of binding operators, and
the fact that sorts (= types) can themselves have a complex linguistic
structure. Making a (partial) identification of type theories with
programming languages, we advocate a programme to find a treatment of type
theories analogous to the universal algebraic treatment of algebraic
theories. With this, the language used to express (typically)
domain-theoretic semantics is itself brought into the picture, and one can
see compositional denotational semantics itself as translation, in a
uniform sense.
-----------------------------------------------------------------------------
Two styles of explicit substitutions: a survey
Alejandro Rios
University of Glasgow
----------------------------------------------
(This work is carried out in collaboration with Fairouz Kamareddine and is
supported by EPSRC grant GR/K25014: "Theoretical and implementational
advantages of a new lambda notation".)
Most formulations of the lambda-calculus use implicit rather than explicit
substitutions. The last decade however has seen an explosion of calculi
which deal with substitutions explicitly.
This talk starts by giving a brief introduction to two different styles of
explicit substitutions: the already classical lambda-sigma style and the
recently developed lambda-s style. We discuss substitution calculi
with de Bruijn indices and with variable names, try to establish a bridge
between the two, and compare substitution calculi in both styles from the point
of view of how efficiently they simulate beta-reduction.
Furthermore, for each style, we outline the remaining open problems and
weigh the advantages and disadvantages it offers.
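To fix ideas, in the lambda-sigma style (with de Bruijn indices) a beta-step is
decomposed into a rule that creates an explicit closure and rules that propagate
it, for instance
  $(\lambda a)\,b \to a[b \cdot \mathsf{id}]$   and   $1[a \cdot s] \to a$,
whereas the lambda-s style stays closer to the usual meta-level definition of
substitution; counting how many such small steps are needed to simulate one
beta-step is the efficiency comparison mentioned above.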
----------------------------------------------------------------------------
The operational content of intersection types
Simona Ronchi Della Rocca
University of Turin
---------------------------------------------
Various aspects of functional computation can be modelled
through (various kinds of) the lambda calculus: for example,
call-by-name computation, call-by-value computation, lazy computation,
and relevant computation. An operational semantics can be defined
for each of these paradigmatic languages, starting from
the definition of a set of "values" and a notion of convergence.
A term converges if and only if it reduces (according to a
given reduction strategy) to a value. A notion of operational
equivalence between terms arises naturally, based on Leibniz equality:
two terms M and N are operationally equivalent if and only if
every context behaves in the same way (with respect
to convergence) when filled by either M or N.
Intersection type assignment systems give a tool for reasoning
about operational equivalence of terms. We call a type assignment
system correct with respect to an operational equivalence
if and only if the fact that two terms can be typed with the
same types implies that they are operationally equivalent.
If the converse implication also holds, the system is called
complete. For each of the operational semantics induced by
the computations listed above, a correct type assignment system
can be defined.
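Spelled out, the operational equivalence just described is contextual
equivalence: writing $M\Downarrow$ for "M converges",
  $M \cong N$   iff   for every context $C$,  $C[M]\Downarrow \iff C[N]\Downarrow$.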
----------------------------------------------------------------------------
Classical logic and nonlocal control: the case against confluence
Judith Underwood
University of Edinburgh
-----------------------------------------------------------------
Nonlocal control operators were introduced to functional languages to
allow programmers to guide the evaluation process more efficiently.
They can also be used to implement error handlers and concurrency
mechanisms such as coroutines. However, they represent a significant
addition to the principles of lambda-calculus which underlie many
ideas in functional programming. Furthermore, there appears to be a
deep connection between classical logic and typed lambda-calculus plus
nonlocal control, similar to the connection between constructive logic
and typed lambda-calculus.
An important consideration in studying languages
with nonlocal control operators is that they may be nonconfluent.
This property can be justified both by the connection with classical
logic and by applications to programming. Because of this,
many proof techniques for properties such as normalization
cannot be used directly. It is hoped that this talk will motivate
work on such proof methods.
----------------------------------------------------------------------------
Completing partial combinatory algebras
Roel de Vrijer
Free University of Amsterdam
----------------------------------------
This is joint work with I. Bethke and J.W. Klop.
In this talk an overview will be given of the results on completability
of partial combinatory algebras.
A short review will be given of constructions by J.W. Klop and I. Bethke
of partial combinatory algebras that cannot be completed.
On the other hand, we will prove that partial combinatory algebras having
the additional property of 'unique head normal forms' can always be completed.
An example is the 'Kleene brackets' PCA with domain N and application defined
by n.m = {n}m.
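For reference (in the usual formulation, not necessarily the exact one used in
the talk), a partial combinatory algebra is a set with a partial binary
application and elements $k$ and $s$ such that
  $k\,x\,y = x$,   $s\,x\,y$ is defined,   $s\,x\,y\,z \simeq (x\,z)\,(y\,z)$,
where $\simeq$ is Kleene equality (both sides undefined, or both defined and
equal); completing such an algebra means embedding it in a total one.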
-------------------------------------------------------------------------------
Generalized Reduction, Let-Floating, and CPS
Joe Wells
Boston University
--------------------------------------------
Generalized reduction recognizes implicit beta redexes in a lambda term
and reduces them directly, ignoring intervening abstractions and
applications. Let-floating lifts or lowers beta redexes to other
equivalent locations. A CPS transformation changes terms into a form
which is executable on a stack-based implementation and which is
indifferent to call-by-value vs. call-by-name evaluation.
We evaluate and survey different restrictions on let-floating. We discuss
how generalized reduction is subsumed by a particular subset of
let-floating. We show how a larger subset of let-floating is subsumed by
beta reduction on CPS transformed terms. We discuss the application of
these techniques in such areas as type inference, strong normalization,
and program equivalence.
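As an illustration of generalized reduction (our example): in the term
$(\lambda x.\lambda y.\,M)\,N\,P$, the abstraction $\lambda y$ and the argument
$P$ form an implicit redex separated by the intervening application to $N$;
generalized reduction contracts it directly,
  $(\lambda x.\lambda y.\,M)\,N\,P \;\to\; (\lambda x.\,M[y := P])\,N$,
assuming the usual variable convention so that $x$ does not occur free in $P$.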
----------------------------------------------------------------------
Registration fee is £ 150 (£ 75 for full time students).
Accommodation is available at the following rates per person per night:
Staff: £ 20.50 (bed and breakfast) or £ 11.50 (bed only).
Students (bed only): £ 8.50 (UK students) and £ 10.00 (overseas students).
If you are interested in attending, send the registration fee (with evidence of full time studentship if applicable), together with the fee for the accommodation for the relevant number of nights and type of accommodation to: Fairouz Kamareddine, Attention International School on Type Theory and Term Rewriting, Glasgow University, Computing Science, 17 Lilybank Gardens, Glasgow. Note that cheques must be made payable to Glasgow University and labelled: "International School on Type Theory and Term Rewriting".
Fairouz Kamareddine
Maintained by Fairouz Kamareddine