
The simply typed lambda calculus (\lambda^\to), a form of type theory, is a typed interpretation of the lambda calculus with only one type constructor (\to) that builds function types. It is the canonical and simplest example of a typed lambda calculus. The simply typed lambda calculus was originally introduced by Alonzo Church in 1940 as an attempt to avoid paradoxical use of the untyped lambda calculus.

The term ''simple type'' is also used to refer to extensions of the simply typed lambda calculus such as products, coproducts or natural numbers (System T) or even full recursion (like PCF). In contrast, systems which introduce polymorphic types (like System F) or dependent types (like the Logical Framework) are not considered ''simply typed''. The simple types, except for full recursion, are still considered ''simple'' because the Church encodings of such structures can be done using only \to and suitable type variables, while polymorphism and dependency cannot.


Syntax

In this article, the symbols \sigma and \tau are used to range over types. Informally, the ''function type'' \sigma \to \tau refers to the type of functions that, given an input of type \sigma, produce an output of type \tau. By convention, \to associates to the right: \sigma\to\tau\to\rho is read as \sigma\to(\tau\to\rho).

To define the types, a set of ''base types'', B, must first be defined. These are sometimes called ''atomic types'' or ''type constants''. With this fixed, the syntax of types is:
:\tau ::= \tau \to \tau \mid T \quad \mathrm{where} \quad T \in B.
For example, B = \{a, b\} generates an infinite set of types starting with a,b,a \to a,a \to b,b\to b,b\to a, a \to (a \to a),\ldots,(b\to a) \to (a\to b), \ldots

A set of ''term constants'' is also fixed for the base types. For example, it might be assumed that there is a base type \mathtt{nat}, and that its term constants are the natural numbers. In the original presentation, Church used only two base types: o for "the type of propositions" and \iota for "the type of individuals". The type o has no term constants, whereas \iota has one term constant. Frequently the calculus with only one base type, usually o, is considered.

The syntax of the simply typed lambda calculus is essentially that of the lambda calculus itself. The term x\mathbin{:}\tau denotes that the variable x is of type \tau. The term syntax, in BNF, is then:
:e ::= x \mid \lambda x\mathbin{:}\tau.e \mid e \, e \mid c
where c is a term constant. That is, the term forms are ''variable reference'', ''abstraction'', ''application'', and ''constant''. A variable reference x is ''bound'' if it is inside of an abstraction binding x. A term is ''closed'' if it contains no unbound variables.

In comparison, the syntax of untyped lambda calculus has no such typing or term constants:
:e ::= x \mid \lambda x.e \mid e \, e
whereas in the typed lambda calculus every ''abstraction'' (i.e. function) must specify the type of its argument.
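As an illustration (not part of the original presentation), this grammar can be transcribed into a small algebraic datatype. The Haskell sketch below uses string-named base types; all constructor and function names are choices of the sketch.

 -- Types: τ ::= T | τ → τ, with base types T drawn from some set B.
 data Ty
   = Base String         -- a base type T ∈ B
   | Arr Ty Ty           -- the function type σ → τ
   deriving (Eq, Show)
 
 -- Terms: e ::= x | λx:τ. e | e e | c, where c is a term constant.
 data Term
   = Var String          -- variable reference
   | Lam String Ty Term  -- abstraction, annotated with the argument type
   | App Term Term       -- application
   | Con String Ty       -- term constant of a given base type
   deriving (Eq, Show)
 
 -- Example: the identity function on a base type a, written λx:a. x.
 identityA :: Term
 identityA = Lam "x" (Base "a") (Var "x")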


Typing rules

To define the set of well-typed lambda terms of a given type, we will define a typing relation between terms and types. First, we introduce ''typing contexts'' or ''typing environments'' \Gamma,\Delta,\dots, which are sets of typing assumptions. A ''typing assumption'' has the form x\mathbin{:}\sigma, meaning x has type \sigma. The ''typing relation'' \Gamma\vdash e\mathbin{:}\sigma indicates that e is a term of type \sigma in context \Gamma. In this case e is said to be ''well-typed'' (having type \sigma). Instances of the typing relation are called ''typing judgements''. The validity of a typing judgement is shown by providing a ''typing derivation'', constructed using typing rules (wherein the premises above the line allow us to derive the conclusion below the line). Simply typed lambda calculus uses these rules:
:\frac{x\mathbin{:}\sigma\in\Gamma}{\Gamma\vdash x\mathbin{:}\sigma}\quad(1) \qquad \frac{c \text{ is a constant of type } T}{\Gamma\vdash c\mathbin{:}T}\quad(2)
:\frac{\Gamma,x\mathbin{:}\sigma\vdash e\mathbin{:}\tau}{\Gamma\vdash (\lambda x\mathbin{:}\sigma.~e)\mathbin{:}(\sigma\to\tau)}\quad(3) \qquad \frac{\Gamma\vdash e_1\mathbin{:}\sigma\to\tau \quad \Gamma\vdash e_2\mathbin{:}\sigma}{\Gamma\vdash e_1~e_2\mathbin{:}\tau}\quad(4)
In words,
# If x has type \sigma in the context, we know that x has type \sigma.
# Term constants have the appropriate base types.
# If, in a certain context with x having type \sigma, e has type \tau, then, in the same context without x, \lambda x\mathbin{:}\sigma.~e has type \sigma \to \tau.
# If, in a certain context, e_1 has type \sigma \to \tau, and e_2 has type \sigma, then e_1~e_2 has type \tau.
Examples of closed terms, ''i.e.'' terms typable in the empty context, are:
*For every type \tau, a term \lambda x\mathbin{:}\tau.x\mathbin{:}\tau\to\tau (identity function/I-combinator),
*For types \sigma,\tau, a term \lambda x\mathbin{:}\sigma.\lambda y\mathbin{:}\tau.x\mathbin{:}\sigma \to \tau \to \sigma (the K-combinator), and
*For types \tau,\tau',\tau'', a term \lambda x\mathbin{:}\tau\to\tau'\to\tau''.\lambda y\mathbin{:}\tau\to\tau'.\lambda z\mathbin{:}\tau.x z (y z) : (\tau\to\tau'\to\tau'')\to(\tau\to\tau')\to\tau\to\tau'' (the S-combinator).
These are the typed lambda calculus representations of the basic combinators of combinatory logic.

Each type \tau is assigned an order, a number o(\tau). For base types, o(T)=0; for function types, o(\sigma\to\tau)=\max(o(\sigma)+1,o(\tau)). That is, the order of a type measures the depth of its most deeply left-nested arrow. Hence:
: o(\iota \to \iota \to \iota) = 1
: o((\iota \to \iota) \to \iota) = 2
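These rules can be turned directly into a checker. The following Haskell sketch is illustrative only: the datatypes repeat the ones from the sketch in the Syntax section so that the fragment stands alone, and all names are choices of this sketch rather than of the calculus itself.

 -- Types and terms as in the sketch in the Syntax section.
 data Ty = Base String | Arr Ty Ty deriving (Eq, Show)
 data Term = Var String | Lam String Ty Term | App Term Term | Con String Ty
   deriving (Eq, Show)
 
 -- A typing context Γ: a finite set of assumptions x:σ.
 type Ctx = [(String, Ty)]
 
 -- typeOf ctx e returns τ such that Γ ⊢ e : τ, or Nothing if no rule applies.
 typeOf :: Ctx -> Term -> Maybe Ty
 typeOf ctx (Var x)     = lookup x ctx                       -- rule (1)
 typeOf _   (Con _ t)   = Just t                             -- rule (2): constants carry their base type
 typeOf ctx (Lam x s e) = Arr s <$> typeOf ((x, s) : ctx) e  -- rule (3)
 typeOf ctx (App e1 e2) = do                                 -- rule (4)
   Arr s t <- typeOf ctx e1        -- the function part must have an arrow type σ → τ
   s'      <- typeOf ctx e2        -- the argument must have exactly the type σ
   if s == s' then Just t else Nothing
 
 -- Example: the K-combinator λx:a. λy:b. x is assigned the type a → b → a:
 --   typeOf [] (Lam "x" (Base "a") (Lam "y" (Base "b") (Var "x")))
 --     == Just (Arr (Base "a") (Arr (Base "b") (Base "a")))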


Semantics


Intrinsic vs. extrinsic interpretations

Broadly speaking, there are two different ways of assigning meaning to the simply typed lambda calculus, as to typed languages more generally, variously called intrinsic vs. extrinsic, ontological vs. semantical, or Church-style vs. Curry-style. An intrinsic semantics only assigns meaning to well-typed terms, or more precisely, assigns meaning directly to typing derivations. This has the effect that terms differing only by type annotations can nonetheless be assigned different meanings. For example, the identity term \lambda x\mathbin{:}\mathtt{int}.~x on integers and the identity term \lambda x\mathbin{:}\mathtt{bool}.~x on booleans may mean different things. (The classic intended interpretations are the identity function on integers and the identity function on boolean values.) In contrast, an extrinsic semantics assigns meaning to terms regardless of typing, as they would be interpreted in an untyped language. In this view, \lambda x\mathbin{:}\mathtt{int}.~x and \lambda x\mathbin{:}\mathtt{bool}.~x mean the same thing (''i.e.'', the same thing as \lambda x.~x).

The distinction between intrinsic and extrinsic semantics is sometimes associated with the presence or absence of annotations on lambda abstractions, but strictly speaking this usage is imprecise. It is possible to define an extrinsic semantics on annotated terms simply by ignoring the types (''i.e.'', through type erasure), just as it is possible to give an intrinsic semantics on unannotated terms when the types can be deduced from context (''i.e.'', through type inference). The essential difference between the intrinsic and extrinsic approaches is just whether the typing rules are viewed as defining the language, or as a formalism for verifying properties of a more primitive underlying language. Most of the different semantic interpretations discussed below can be seen through either an intrinsic or extrinsic perspective.
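The type-erasure reading can be made concrete by a function that simply forgets annotations. The Haskell sketch below is an illustration under the same assumed datatypes as before; the untyped term datatype and all names are inventions of this sketch.

 -- Annotated terms, as in the Syntax section.
 data Ty = Base String | Arr Ty Ty deriving (Eq, Show)
 data Term = Var String | Lam String Ty Term | App Term Term | Con String Ty
 
 -- Untyped terms: e ::= x | λx.e | e e | c
 data UTerm = UVar String | ULam String UTerm | UApp UTerm UTerm | UCon String
 
 -- Type erasure: drop every annotation. An extrinsic semantics can then
 -- interpret the erased term exactly as in the untyped lambda calculus.
 erase :: Term -> UTerm
 erase (Var x)     = UVar x
 erase (Lam x _ e) = ULam x (erase e)    -- the annotation on x is forgotten
 erase (App e1 e2) = UApp (erase e1) (erase e2)
 erase (Con c _)   = UCon c

Under erase, the annotated identity terms on integers and on booleans both map to the same untyped term \lambda x.~x, matching the extrinsic reading described above.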


Equational theory

The simply typed lambda calculus has the same equational theory of βη-equivalence as untyped lambda calculus, but subject to type restrictions. The equation for beta reduction
:(\lambda x\mathbin{:}\sigma.~t)\,u =_{\beta} t[x:=u]
holds in context \Gamma whenever \Gamma,x\mathbin{:}\sigma \vdash t\mathbin{:}\tau and \Gamma\vdash u\mathbin{:}\sigma, while the equation for eta reduction
:\lambda x\mathbin{:}\sigma.~t\,x =_{\eta} t
holds whenever \Gamma\vdash t\mathbin{:}\sigma \to \tau and x does not appear free in t.
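For instance, assuming \Gamma\vdash u\mathbin{:}\sigma, \Gamma\vdash v\mathbin{:}\tau, and that y does not occur free in u, the typed K-combinator from the examples above reduces by two applications of the β-rule:
:(\lambda x\mathbin{:}\sigma.~\lambda y\mathbin{:}\tau.~x)\,u\,v \;=_{\beta}\; (\lambda y\mathbin{:}\tau.~u)\,v \;=_{\beta}\; u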


Operational semantics

Likewise, the operational semantics of simply typed lambda calculus can be fixed as for the untyped lambda calculus, using call by name, call by value, or other evaluation strategies. As for any typed language, type safety is a fundamental property of all of these evaluation strategies. Additionally, the strong normalization property described below implies that any evaluation strategy will terminate on all simply typed terms.
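One concrete way to fix such a semantics is an environment-based call-by-value evaluator. The Haskell sketch below is one possible rendering, not a canonical definition; it assumes the term datatype from the Syntax section and is intended for closed, well-typed terms (on which strong normalization guarantees termination).

 -- Types and terms as in the sketch in the Syntax section.
 data Ty = Base String | Arr Ty Ty deriving (Eq, Show)
 data Term = Var String | Lam String Ty Term | App Term Term | Con String Ty
 
 -- Call-by-value values: constants, or abstractions closed over their environment.
 data Value = VCon String | VClosure Env String Term
 type Env   = [(String, Value)]
 
 eval :: Env -> Term -> Value
 eval env (Var x)     = maybe (error "unbound variable") id (lookup x env)
 eval _   (Con c _)   = VCon c
 eval env (Lam x _ e) = VClosure env x e        -- an abstraction is already a value
 eval env (App e1 e2) =
   case eval env e1 of                          -- evaluate the function,
     VClosure cenv x body ->
       eval ((x, eval env e2) : cenv) body      -- then the argument (call by value)
     VCon _ -> error "cannot apply a constant"
 
 -- Example: eval [] (App (Lam "x" (Base "a") (Var "x")) (Con "c" (Base "a")))
 --   yields VCon "c".

A call-by-name variant would instead delay evaluation of the argument, for example by storing unevaluated terms in the environment.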


Categorical semantics

The simply typed lambda calculus (with \beta\eta-equivalence) is the internal language of Cartesian closed categories (CCCs), as was first observed by Joachim Lambek. Given any specific CCC, the basic types of the corresponding lambda calculus are just the objects, and the terms are the morphisms. Conversely, every simply typed lambda calculus gives a CCC whose objects are the types, and whose morphisms are equivalence classes of terms.

To make the correspondence clear, a type constructor for the Cartesian product is typically added to the above. To preserve the categoricity of the Cartesian product, one adds typing rules for ''pairing'', ''projection'', and a ''unit term'' (written out as inference rules below). Given two terms s\mathbin{:}\sigma and t\mathbin{:}\tau, the term (s,t) has type \sigma\times\tau. Likewise, if one has a term u\mathbin{:}\tau_1\times\tau_2, then there are terms \pi_1(u)\mathbin{:}\tau_1 and \pi_2(u)\mathbin{:}\tau_2, where the \pi_i correspond to the projections of the Cartesian product. The ''unit term'', of type 1, is written as () and vocalized as 'nil'; it corresponds to the final object. The equational theory is extended likewise, so that one has
:\pi_1(s\mathbin{:}\sigma,t\mathbin{:}\tau) = s\mathbin{:}\sigma
:\pi_2(s\mathbin{:}\sigma,t\mathbin{:}\tau) = t\mathbin{:}\tau
:(\pi_1(u\mathbin{:}\sigma\times\tau) , \pi_2(u\mathbin{:}\sigma\times\tau)) = u\mathbin{:}\sigma\times\tau
:t\mathbin{:}1 = ()
This last is read as "''if t has type 1, then it reduces to nil''".

The above can then be turned into a category by taking the types as the objects. The morphisms \sigma\to\tau are equivalence classes of pairs (x\mathbin{:}\sigma, t\mathbin{:}\tau) where ''x'' is a variable (of type \sigma) and ''t'' is a term (of type \tau) having no free variables in it, except for (optionally) ''x''. Closure is obtained from currying and application, as usual. More precisely, there exist functors between the category of Cartesian closed categories and the category of simply typed lambda theories.

It is common to extend this case to closed symmetric monoidal categories by using a linear type system. The reason for this is that the CCC is a special case of the closed symmetric monoidal category, which is typically taken to be the category of sets. This is fine for laying the foundations of set theory, but the more general topos seems to provide a superior foundation.
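For reference, the ''pairing'', ''projection'', and ''unit'' constructs described above can be written as inference rules in the style of rules (1)–(4); this is one standard formulation:
:\frac{\Gamma\vdash s\mathbin{:}\sigma \quad \Gamma\vdash t\mathbin{:}\tau}{\Gamma\vdash (s,t)\mathbin{:}\sigma\times\tau} \qquad \frac{\Gamma\vdash u\mathbin{:}\tau_1\times\tau_2}{\Gamma\vdash \pi_i(u)\mathbin{:}\tau_i}\;(i=1,2) \qquad \frac{}{\Gamma\vdash ()\mathbin{:}1}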


Proof-theoretic semantics

The simply typed lambda calculus is closely related to the implicational fragment of propositional intuitionistic logic, i.e., minimal logic, via the Curry–Howard isomorphism: terms correspond precisely to proofs in natural deduction, and inhabited types are exactly the tautologies of minimal logic.
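For instance, the typing derivation of the K-combinator from the examples above, built from rules (1) and (3), reads under the correspondence as a natural-deduction proof of the minimal-logic tautology \sigma\to(\tau\to\sigma), each use of rule (3) being an implication introduction:
:\frac{\dfrac{x\mathbin{:}\sigma,\ y\mathbin{:}\tau \;\vdash\; x\mathbin{:}\sigma}{x\mathbin{:}\sigma \;\vdash\; \lambda y\mathbin{:}\tau.~x \;\mathbin{:}\; \tau\to\sigma}}{\vdash\; \lambda x\mathbin{:}\sigma.~\lambda y\mathbin{:}\tau.~x \;\mathbin{:}\; \sigma\to(\tau\to\sigma)}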


Alternative syntaxes

The presentation given above is not the only way of defining the syntax of the simply typed lambda calculus. One alternative is to remove type annotations entirely (so that the syntax is identical to the untyped lambda calculus), while ensuring that terms are well-typed via Hindley–Milner type inference. The inference algorithm is terminating, sound, and complete: whenever a term is typable, the algorithm computes its type. More precisely, it computes the term's principal type, since often an unannotated term (such as \lambda x.~x) may have more than one type (\mathtt{int} \to \mathtt{int}, \mathtt{bool} \to \mathtt{bool}, etc., which are all instances of the principal type \alpha \to \alpha).

Another alternative presentation of simply typed lambda calculus is based on bidirectional type checking, which requires more type annotations than Hindley–Milner inference but is easier to describe. The type system is divided into two judgments, representing both ''checking'' and ''synthesis'', written \Gamma \vdash e \Leftarrow \tau and \Gamma \vdash e \Rightarrow \tau respectively. Operationally, the three components \Gamma, e, and \tau are all ''inputs'' to the checking judgment \Gamma \vdash e \Leftarrow \tau, whereas the synthesis judgment \Gamma \vdash e \Rightarrow \tau only takes \Gamma and e as inputs, producing the type \tau as output. These judgments are derived via the following rules:
:\frac{x\mathbin{:}\sigma\in\Gamma}{\Gamma\vdash x \Rightarrow \sigma}\quad(1') \qquad \frac{c \text{ is a constant of type } T}{\Gamma\vdash c \Rightarrow T}\quad(2') \qquad \frac{\Gamma,x\mathbin{:}\sigma\vdash e \Leftarrow \tau}{\Gamma\vdash \lambda x.~e \Leftarrow \sigma\to\tau}\quad(3') \qquad \frac{\Gamma\vdash e_1 \Rightarrow \sigma\to\tau \quad \Gamma\vdash e_2 \Leftarrow \sigma}{\Gamma\vdash e_1~e_2 \Rightarrow \tau}\quad(4')
:\frac{\Gamma\vdash e \Rightarrow \tau}{\Gamma\vdash e \Leftarrow \tau}\quad(5') \qquad \frac{\Gamma\vdash e \Leftarrow \tau}{\Gamma\vdash (e\mathbin{:}\tau) \Rightarrow \tau}\quad(6')
Observe that rules (1')–(4') are nearly identical to rules (1)–(4) above, except for the careful choice of checking or synthesis judgments. These choices can be explained like so:
# If x\mathbin{:}\sigma is in the context, we can synthesize type \sigma for x.
# The types of term constants are fixed and can be synthesized.
# To check that \lambda x.~e has type \sigma \to \tau in some context, we extend the context with x\mathbin{:}\sigma and check that e has type \tau.
# If e_1 synthesizes type \sigma \to \tau (in some context), and e_2 checks against type \sigma (in the same context), then e_1~e_2 synthesizes type \tau.
Observe that the rules for synthesis are read top-to-bottom, whereas the rules for checking are read bottom-to-top. Note in particular that we do not need any annotation on the lambda abstraction in rule (3'), because the type of the bound variable can be deduced from the type at which we check the function. Finally, we explain rules (5') and (6') as follows:
  1. To check that e has type \tau, it suffices to synthesize type \tau.
  2. If e checks against type \tau, then the explicitly annotated term (e\mathbin{:}\tau) synthesizes \tau.
Because of these last two rules coercing between synthesis and checking, it is easy to see that any well-typed but unannotated term can be checked in the bidirectional system, so long as we insert "enough" type annotations. And in fact, annotations are needed only at β-redexes.
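The checking and synthesis judgments can also be rendered as two mutually recursive functions. The Haskell sketch below is illustrative: lambdas are unannotated, an explicit annotation form (e : τ) is included, and all names are choices of this sketch.

 data Ty = Base String | Arr Ty Ty deriving (Eq, Show)
 
 -- Terms for the bidirectional presentation: unannotated lambdas,
 -- plus an explicit annotation form (e : τ).
 data Term = Var String | Lam String Term | App Term Term
           | Con String Ty | Ann Term Ty
 
 type Ctx = [(String, Ty)]
 
 -- Synthesis: Γ ⊢ e ⇒ τ (the type is an output).
 synth :: Ctx -> Term -> Maybe Ty
 synth ctx (Var x)     = lookup x ctx            -- variables synthesize their context type
 synth _   (Con _ t)   = Just t                  -- constants synthesize their base type
 synth ctx (App e1 e2) = do                      -- applications synthesize
   Arr s t <- synth ctx e1
   check ctx e2 s
   Just t
 synth ctx (Ann e t)   = check ctx e t >> Just t -- (e : τ) synthesizes τ if e checks against τ
 synth _   (Lam _ _)   = Nothing                 -- unannotated lambdas do not synthesize
 
 -- Checking: Γ ⊢ e ⇐ τ (the type is an input).
 check :: Ctx -> Term -> Ty -> Maybe ()
 check ctx (Lam x e) (Arr s t) = check ((x, s) : ctx) e t  -- lambdas check against arrow types
 check _   (Lam _ _) _         = Nothing
 check ctx e t = do                              -- otherwise synthesize and compare
   t' <- synth ctx e
   if t == t' then Just () else Nothing
 
 -- A β-redex needs an annotation, as noted above:
 --   synth [] (App (Ann (Lam "x" (Var "x")) (Arr (Base "a") (Base "a")))
 --                 (Con "c" (Base "a")))
 --     == Just (Base "a")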


General observations

Given the standard semantics, the simply typed lambda calculus is strongly normalizing: that is, well-typed terms always reduce to a value, i.e., a \lambda abstraction. This is because recursion is not allowed by the typing rules: it is impossible to find types for fixed-point combinators and the looping term \Omega = (\lambda x.~x~x) (\lambda x.~x~x). (For x~x to be well-typed, x would need a type \sigma equal to \sigma \to \tau, which no simple type satisfies.) Recursion can be added to the language by either having a special operator \mathtt{fix}_\alpha of type (\alpha \to \alpha) \to \alpha or adding general recursive types, though both eliminate strong normalization.

Since it is strongly normalizing, it is decidable whether or not a simply typed lambda calculus program halts: in fact, it ''always'' halts. We can therefore conclude that the language is ''not'' Turing complete.


Important results

* Tait showed in 1967 that \beta-reduction is strongly normalizing. As a corollary, \beta\eta-equivalence is decidable. Statman showed in 1979 that the normalisation problem is not elementary recursive, a proof which was later simplified by Mairson. The problem is known to be in the set \mathcal{E}^4 of the Grzegorczyk hierarchy. A purely semantic normalisation proof (see normalisation by evaluation) was given by Berger and Schwichtenberg in 1991.
* The unification problem for \beta\eta-equivalence is undecidable. Huet showed in 1973 that 3rd order unification is undecidable, and this was improved upon by Baxter in 1978 and then by Goldfarb in 1981, who showed that 2nd order unification is already undecidable. A proof that higher order matching (unification where only one term contains existential variables) is decidable was announced by Colin Stirling in 2006, and a full proof was published in 2009.
* We can encode natural numbers by terms of the type (o\to o)\to(o \to o) (Church numerals; the first few are written out below). Schwichtenberg showed in 1975 that in \lambda^\to exactly the extended polynomials are representable as functions over Church numerals; these are roughly the polynomials closed under a conditional operator.
* A ''full model'' of \lambda^\to is given by interpreting base types as sets and function types by the set-theoretic function space. Friedman showed in 1975 that this interpretation is complete for \beta\eta-equivalence, if the base types are interpreted by infinite sets. Statman showed in 1983 that \beta\eta-equivalence is the maximal equivalence which is ''typically ambiguous'', i.e. closed under type substitutions (''Statman's Typical Ambiguity Theorem''). A corollary of this is that the ''finite model property'' holds, i.e. finite sets are sufficient to distinguish terms which are not identified by \beta\eta-equivalence.
* Plotkin introduced logical relations in 1973 to characterize the elements of a model which are definable by lambda terms. In 1993 Jung and Tiuryn showed that a general form of logical relation (Kripke logical relations with varying arity) exactly characterizes lambda definability. Plotkin and Statman conjectured that it is decidable whether a given element of a model generated from finite sets is definable by a lambda term (''Plotkin–Statman conjecture''). The conjecture was shown to be false by Loader in 2001.
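For example, the first few Church numerals at the type (o\to o)\to(o\to o) are the standard terms
:\overline{0} = \lambda f\mathbin{:}o\to o.~\lambda x\mathbin{:}o.~x, \qquad \overline{1} = \lambda f\mathbin{:}o\to o.~\lambda x\mathbin{:}o.~f\,x, \qquad \overline{2} = \lambda f\mathbin{:}o\to o.~\lambda x\mathbin{:}o.~f\,(f\,x),
and in general \overline{n} applies f to x exactly n times.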


Notes


References

* H. Barendregt, ''Lambda Calculi with Types'' (ftp://ftp.cs.ru.nl/pub/CompMath.Found/HBK.ps), Handbook of Logic in Computer Science, Volume II, Oxford University Press, 1993.


External links

* {{SEP, type-theory-church, Church's Type Theory}}