Abstracts of the lectures at the Winter Workshop in Logics, Types and Rewriting
Cedar Room, Heriot-Watt University, Edinburgh
1-3 February 2000
http://www.cedar-forest.org/forest/events/inaugural-workshop2000/ina-abstracts2.html
ULTRA group: Useful Logics, Types, Rewriting, and Applications
UKII: UK Institute of Informatics, UK
The abstracts of the lectures are as follows:
Henk Barendregt Catholic University of Nijmegen
Proof-assistants for mathematics
Ample attention has been given to the question of what formal systems a
proof-assistant should support, and also to the meta-theory of those
systems. Somewhat neglected is the question of how proof-assistants can be
made user-friendly for developing mathematics. Some requirements will be
presented.
Stefano Berardi University of Turin
A notion of computation for Analysis
We investigate which notion of computation would be suitable
to validate a constructive interpretation of classical second-order choice.
We propose a notion of computation extending the Coquand-Ong game model of
simply typed lambda calculus, by allowing possibly transfinite computations
when functional arguments are discontinuous.
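As a standard illustration (mine, not from the abstract) of the kind of
discontinuity at stake: the functional
\[
\Phi(f) = \begin{cases} 1 & \text{if } f(n) = 0 \text{ for all } n \in \mathbb{N} \\ 0 & \text{otherwise} \end{cases}
\]
is discontinuous, since its value depends on infinitely many values of
$f$; no finite play in a game-semantical dialogue can settle it, which is
why transfinite plays enter the picture.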
N.G. de Bruijn Eindhoven University of Technology
The Design of Automath
Around 1967 the Automath system was designed in a
relatively short time, in a world that was not yet ready for the
idea of automatic checking. A number of decisions had to be taken
in order to get to a simple flexible system, able to cater for
various kinds of foundation, and good enough to handle mathematics
on a very large scale. One of the novelties was that the system
was meant to check not just proofs, but complete theories, or rather
complete books. As a consequence, definitions became essential
parts of the books, completely similar to proved statements;
another consequence was that there was no longer a difference in
treating logic and mathematics. Everything fitted nicely in the
setting of a general form of type theory.
An important decision was to stay out of artificial intelligence,
apart from the fundamental task of automated type checking. Another
decision was to resist the temptation to go after time-saving
devices (at the price of complicating the system). In particular it
was decided that induction and recursion were not to be implemented
in the system, since they could easily be treated as subjects in
the books.
Alan Bundy University of Edinburgh
What is a Proof?
To those brought up in a logic-based tradition there seems to be
a simple and clear definition of proof. But this is largely a
20th century invention; earlier proofs had a different
nature. Even in recent decades there has been fierce debate about
the nature of proofs -- a debate fueled by efforts to automate
the proof process. We will present and contrast some different
notions of proof and the rival arguments between them.
Gilles Dowek INRIA
Axioms vs. Rewrite rules: from completeness to cut elimination
Orienting axioms as rewrite rules is a traditional way to cut off search
space in automated theorem proving. We try to show in this talk that such
an orientation is not just a way to implement a selective use of axioms, but
that it may cut off search space more dramatically. In particular, the
completeness of proof search may be lost in some cases, and new
proof-theoretical tools are needed when it is not.
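As a simple illustration (standard, not specific to the talk), an
equational axiom can be oriented left to right as a rewrite rule:
\[
\forall x.\; x + 0 = x \quad\leadsto\quad x + 0 \to x ,
\]
so that during proof search the equation is only ever used to simplify
terms, never to expand them. The point of the talk is that such an
orientation can exclude not only redundant proofs but, in some cases,
all proofs of a provable statement, so that completeness has to be
re-established, e.g. via cut elimination for the resulting system.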
Martin Hofmann
University of Edinburgh
A type system for bounded space and functional in-place update
We show how linear typing can be used to obtain functional programs
which modify heap-allocated data structures in place.
We present this both as a ``design pattern'' for writing C-code in a
functional style and as a compilation process from linearly typed
first-order functional programs into malloc()-free C code.
The main technical result is the correctness of this compilation.
The crucial innovation over previous linear typing schemes consists of
the introduction of a resource type <> which controls the
number of constructor symbols such as cons in recursive definitions
and ensures linear space while restricting expressive power
surprisingly little.
While the space efficiency brought about by the new typing scheme and
the compilation into C can also be realised with state-of-the-art
optimising compilers for functional languages such as Ocaml, the
present method provides guaranteed bounds on heap space, which will be
of use for applications such as languages for embedded systems or
proof-carrying code.
We discuss various extensions, in particular an extension with FIFO
queues admitting constant time catenation and enqueuing, and an
extension of the type system to fully-fledged intuitionistic linear
logic.
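As a flavour of the ``functional style in C'' mentioned above, here is a
minimal sketch (my illustration, not code from the talk) of in-place list
reversal: written as a tail-recursive function, it reuses each cons cell
instead of allocating, which is exactly the behaviour the linear typing
discipline guarantees to be safe:

    #include <stdio.h>

    /* a cons cell; in the linear system each cell corresponds to one
       unit of the resource type <> */
    typedef struct cell { int head; struct cell *tail; } list;

    /* rev_app(l, acc) = reverse(l) appended to acc; no malloc() is
       needed because each cell of l is updated in place, which is safe
       only because linear typing rules out aliasing of l */
    list *rev_app(list *l, list *acc) {
        if (l == NULL) return acc;
        list *rest = l->tail;
        l->tail = acc;            /* in-place update of the cell */
        return rev_app(rest, l);
    }

    int main(void) {
        list c = {3, NULL}, b = {2, &c}, a = {1, &b};
        for (list *p = rev_app(&a, NULL); p != NULL; p = p->tail)
            printf("%d ", p->head);   /* prints: 3 2 1 */
        return 0;
    }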
Helene Kirchner University of Nancy
Algebraic Specifications, Higher-Order Types and Set-Theoretic Models
In most algebraic specification frameworks, the type system is
restricted to sorts, subsorts, and first-order function types. This
is in marked contrast to the so-called model-oriented frameworks,
which provide higher-order types, interpreted set-theoretically as
Cartesian products, function spaces, and power-sets.
This talk presents a simple framework for enriching algebraic
specifications with higher-order types and set-theoretic models, while
keeping to the tractable logic of Horn clauses. It may be regarded as
the basis for a Horn-clause approximation to the Z framework, and has
the advantage of being amenable to prototyping and automated
reasoning. Standard set-theoretic models are considered, and
conditions are given for the existence of initial reducts of such
models. Algebraic specifications for various set-theoretic concepts
are considered.
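As a tiny, hypothetical illustration (the concrete syntax here is
invented for exposition, not taken from the talk), such a specification
might declare a power-set type and constrain its membership relation by
Horn clauses:
\[
\begin{array}{l}
\mathit{insert} : \mathit{Nat} \times \mathbb{P}(\mathit{Nat}) \to \mathbb{P}(\mathit{Nat})\\
x \in \mathit{insert}(x, s)\\
x \in s \;\Rightarrow\; x \in \mathit{insert}(y, s)
\end{array}
\]
with $\mathbb{P}(\mathit{Nat})$ interpreted set-theoretically as the
power-set of the natural numbers.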
Jan-Willem Klop
Free University of Amsterdam
Proof systems for cyclic term graphs
Cyclic term graphs are important in functional
programming and also have other applications, such as
recursive types. We discuss and compare two complete
proof systems, one being the usual 'folklore'
system based on a fixed point rule, and one based on a
coinductive proof rule due to Brandt and Henglein.
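For comparison (a standard formulation, sketched by way of illustration
rather than quoted from the talk), for recursive types $\mu\alpha.\tau$
the fixed-point style rule derives an equality from the unique-fixed-point
property,
\[
\frac{\sigma = \tau[\sigma/\alpha] \qquad \tau \ \text{contractive in}\ \alpha}{\sigma = \mu\alpha.\tau}
\]
while the Brandt-Henglein style is coinductive: the goal equation may be
assumed while proving the equation between the unfoldings, provided the
assumption is used only guardedly (after passing through a type
constructor):
\[
\frac{\Gamma,\ \mu\alpha.\tau = \mu\beta.\sigma \ \vdash\ \tau[\mu\alpha.\tau/\alpha] = \sigma[\mu\beta.\sigma/\beta]}{\Gamma \ \vdash\ \mu\alpha.\tau = \mu\beta.\sigma}
\]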
Gordon Plotkin University of Edinburgh
Another Meta-Language for Programming with Bound Names Modulo Renaming
Pitts and Gabbay have recently described a metaprogramming language for
languages with binders.
(See: http://www.cl.cam.ac.uk/users/amp12/papers/index.html#metlpb).
It is based on their Fraenkel-Mostowski (FM) set
model. Inspired by this work, we present a very similar language based
instead on the Fiore-Plotkin-Turi presheaf model (over the category of finite
cardinals).
The main difference is in the approach to the type theory. For Gabbay and
Pitts the judgments are that a value has a certain type and that it does not
depend on certain atoms. We instead take a positive approach: the
judgment is that the atoms depended on are included in a certain set
(cf. analogous statements about free variables in languages with binders).
This positive judgment is then part of the structural logic of the
type theory and is internalised by the abstraction functor, yielding a
straightforward semantics (just as commas in contexts are internalised by
products).
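Concretely (my gloss, not notation from the talk), the positive judgment
can be pictured as
\[
a_1,\dots,a_n \ \vdash\ t : \tau ,
\]
read ``$t$ has type $\tau$ and depends on at most the atoms
$a_1,\dots,a_n$'', exactly parallel to the way a context
$x_1{:}\sigma_1,\dots,x_m{:}\sigma_m \vdash e : \tau$ bounds the free
variables of $e$.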
It may be that the two approaches are equivalent, apart from a certain
linearity restriction inherent in the FM model. We have at any rate shown
that one is not restricted to the FM model for the semantics of
variable-binding metalanguages, and it remains unclear which model is to
be preferred.
Simona Ronchi Della Rocca
University of Turin
Intuitionistic Logics and the Intersection Type Assignment System
The intersection type assignment system for $\lambda$-calculus is an
extension of Curry's type assignment system, where types are built
using, besides the arrow constructor for functional types, an {\em
intersection} constructor, which allows one to assign more than one
type to the same variable. The resulting system is very powerful: it has
interesting properties (it types all and only the solvable terms) and,
once equipped with a suitable notion of subtyping, it can supply
useful tools for program analysis. Here we study the logical
counterpart of the intersection type assignment system, in order to
look at its properties from a purely logical point of view. The
related logics can be obtained from the intuitionistic logics with
implication and conjunction, by allowing the parallel derivation of a
finite number of formulas, and by restricting the introduction of the
conjunction just to formulas which have been proved in parallel. We
define a kind of {\em Curry-Howard isomorphism} between terms of
$\lambda$-calculus and such a logic: in the spirit of Girard's proof
nets, we associate to every term a {\em proof schema}, i.e., a tree
whose nodes are labelled by names of the logical rules. Then a term
can be typed if and only if its associated proof schema can be
transformed into a true proof by:
- fixing the degree of parallelism of every rule in it;
- modifying its structure by introducing some conjunction introduction
  rules;
- decorating the resulting proof schema with logical formulas.
The principal type property of the type assignment system, which was
previously proved through a difficult expansion-unification property of
types, now becomes an easy consequence of some very elementary properties
of the logical system. Moreover, the logical system supplies useful tools
for designing both a type inference semi-algorithm and its decidable
restrictions.
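For readers unfamiliar with intersection types, the characteristic rules
(standard in the literature, not specific to this talk) are
\[
\frac{\Gamma \vdash M : \sigma \qquad \Gamma \vdash M : \tau}{\Gamma \vdash M : \sigma \cap \tau}\ (\cap I)
\qquad
\frac{\Gamma \vdash M : \sigma \cap \tau}{\Gamma \vdash M : \sigma}\ (\cap E_1)
\qquad
\frac{\Gamma \vdash M : \sigma \cap \tau}{\Gamma \vdash M : \tau}\ (\cap E_2)
\]
They allow, for instance, the self-application $\lambda x.\,x\,x$, which
is untypable in Curry's system, to receive the type
$((\sigma \to \tau) \cap \sigma) \to \tau$.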
Jonathan Seldin
University of Lethbridge
On Lists and Other Abstract Data Types in the Calculus of Constructions
The representation of the inductively defined abstract data type for
lists was left incomplete in \cite[\S 9]{seldin:PTCC}. Here that
representation is completed, and it is proved that all extra
axioms needed are consistent. Among the innovations of this paper
is a definition of \textsf{cdr}, which was left for
future work in \cite[\S 9]{seldin:PTCC}. The results are then
extended to other abstract data types, those of
\cite{berardi:EDTP}. The method used to define \textsf{cdr} for
lists is extended to obtain the definition of an inverse for each
argument of each constructor of an abstract data type. These
inverses are used to prove the injective property for the
constructors. Also, Dedekind's method of defining the natural
numbers is used to define a predicate associated with each abstract
data type, and the use of this predicate makes it unnecessary to
postulate the induction principle. The only axioms left to be
proved are those asserting the disjointness of the co-domains of
different constructors, and it is shown that those axioms can be
proved consistent.
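For orientation, here is a sketch of the standard impredicative encoding
of lists and of the well-known pairing trick (as used for predecessor on
Church numerals); it is given as my illustration of the idea, not as the
paper's actual definitions:
\[
\begin{array}{rcl}
\mathsf{List}\,A &:=& \Pi X{:}\ast.\; X \to (A \to X \to X) \to X\\
\mathsf{nil} &:=& \lambda X.\,\lambda n.\,\lambda c.\; n\\
\mathsf{cons} &:=& \lambda a.\,\lambda l.\,\lambda X.\,\lambda n.\,\lambda c.\; c\,a\,(l\,X\,n\,c)\\
\mathsf{cdr}\ l &:=& \mathsf{fst}\,\bigl(l\;(\mathsf{List}\,A \times \mathsf{List}\,A)\;\langle \mathsf{nil},\mathsf{nil}\rangle\;(\lambda a.\,\lambda p.\,\langle \mathsf{snd}\,p,\ \mathsf{cons}\,a\,(\mathsf{snd}\,p)\rangle)\bigr)
\end{array}
\]
so that $\mathsf{cdr}\,(\mathsf{cons}\,a\,l)$ computes (a copy of) $l$.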
\begin{thebibliography}{}
\bibitem[Berardi 1993]{berardi:EDTP}
Stefano Berardi. (1993)
\newblock Encoding of data types in pure construction calculus: a semantic
justification.
\newblock In G\'{e}rard Huet and Gordon Plotkin, editors, {\em Logical
Environments}, pages 30--60. Cambridge University Press.
\bibitem[Pfenning and Paulin-Mohring 1989]{pfenning/paulin-mohring:IDTC}
Frank Pfenning and Christine Paulin-Mohring. (1989)
\newblock Inductively defined types in the calculus of constructions.
\newblock In M.~Main, A.~Melton, M.~Mislove, and D.~Schmidt, editors, {\em
Mathematical Foundations of Programming Semantics: 5th International
Conference, Tulane University, New Orleans, Louisiana, USA, March 29--April
1, 1989, Proceedings}, volume 442 of {\em Lecture Notes in Computer Science},
pages 209--228. Springer-Verlag.
\bibitem[Seldin 1997]{seldin:PTCC}
Jonathan~P. Seldin. (1997)
\newblock On the proof theory of {C}oquand's calculus of constructions.
\newblock {\em Annals of Pure and Applied Logic}, 83:23--101.
\end{thebibliography}
Joe Wells Heriot-Watt University
A Paradigm Shift in Program Analysis and
Transformation via Intersection and Union Types
Implementations of modern higher-order typed programming languages (e.g., Haskell, Java, ML) increasingly rely on advanced
type systems to ensure that desirable properties (e.g., safety, security, freedom from deadlock) are preserved or enforced by
compilation and also to guide representation and optimization decisions. For programming flexibility (e.g., code reuse) and more
precise program analysis, modern type systems must support some form of type polymorphism, and generally do so with
universal and/or existential quantifiers in types. The most well-known example is Milner's type system, used at the core of
Haskell and ML.
While there is a rich theory of universal/existential quantifiers, there are two important difficulties in practice. (1) Systems
with these quantifiers generally do not have a strong principality property, i.e., there is no most general analysis for each
program fragment. This makes it difficult to perform type analysis compositionally (i.e., where program fragments are analyzed
independently) and sometimes even makes type analysis impossible. (2) These quantifiers hide information, making it hard to
extend type analysis to flow analysis and to perform type/flow-directed program transformations.
To address these difficulties, I have designed type systems which support type polymorphism with intersection and union
types instead of universal/existential types. In collaboration with my colleagues in the Church Project, I have developed
analysis and transformation algorithms using these systems. Overcoming difficulty (1), we have used intersection type
systems with strong principality properties to develop new type analysis algorithms which are compositional and also accept
many safe programs rejected as ill-typed by earlier algorithms. Overcoming difficulty (2), we have shown how to use a system
with intersection/union types to express polyvariant flow analyses and to guide a new style of program transformation and
optimization: type/flow-directed compilation. This new technology is the foundation for an experimental compiler we are
developing. I will discuss how our new approach to compilation will contribute to efficient, reliable, and modular compilation.
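To make difficulty (2) concrete (a standard example, not drawn from the talk): with quantifiers the identity
function receives the type $\forall \alpha.\, \alpha \to \alpha$, which hides how the function is actually used, whereas an
intersection type records exactly the uses that occur, e.g.
\[
\lambda x.\,x \;:\; (\mathsf{int} \to \mathsf{int}) \cap (\mathsf{bool} \to \mathsf{bool})
\]
when the function is applied once to an integer and once to a boolean. This extra precision is what allows the
type analysis to double as a flow analysis and to guide type/flow-directed transformations.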
Maintained by
Fairouz Kamareddine