added StlcPropOld

wadler 2017-06-23 11:09:29 +01:00
parent c1c997589d
commit 66132f1dbb
3 changed files with 769 additions and 3 deletions


@@ -6,7 +6,7 @@ permalink : /StlcOld
<div class="foldable">
\begin{code}
open import Maps using (Id; id; _≟_; PartialMap; module PartialMap)
open import MapsOld using (Id; id; _≟_; PartialMap; module PartialMap)
open import Data.Empty using (⊥; ⊥-elim)
open import Data.Maybe using (Maybe; just; nothing)
open import Data.Nat using (ℕ; suc; zero; _+_)


@@ -13,8 +13,8 @@ open import Data.Product using (∃; ∃₂; _,_; ,_)
open import Data.Sum using (_⊎_; inj₁; inj₂)
open import Relation.Nullary using (¬_; Dec; yes; no)
open import Relation.Binary.PropositionalEquality using (_≡_; _≢_; refl)
open import Maps
open import Stlc
open import MapsOld
open import StlcOld
\end{code}
</div>

src/StlcPropOld.lagda (new file, 766 lines)

@@ -0,0 +1,766 @@
---
title : "StlcProp: Properties of STLC"
layout : page
permalink : /StlcProp
---
<div class="foldable">
\begin{code}
open import Function using (_∘_)
open import Data.Empty using (⊥; ⊥-elim)
open import Data.Maybe using (Maybe; just; nothing)
open import Data.Product using (∃; ∃₂; _,_; ,_)
open import Data.Sum using (_⊎_; inj₁; inj₂)
open import Relation.Nullary using (¬_; Dec; yes; no)
open import Relation.Binary.PropositionalEquality using (_≡_; _≢_; refl)
open import MapsOld
open import StlcOld
\end{code}
</div>
In this chapter, we develop the fundamental theory of the Simply
Typed Lambda Calculus---in particular, the type safety
theorem.
## Canonical Forms
As we saw for the simple calculus in the [Stlc]({{ "Stlc" | relative_url }})
chapter, the first step in establishing basic properties of reduction and types
is to identify the possible _canonical forms_ (i.e., well-typed closed values)
belonging to each type. For $$bool$$, these are the boolean values $$true$$ and
$$false$$. For arrow types, the canonical forms are lambda-abstractions.
\begin{code}
CanonicalForms : Term → Type → Set
CanonicalForms t bool    = t ≡ true ⊎ t ≡ false
CanonicalForms t (A ⇒ B) = ∃₂ λ x t′ → t ≡ abs x A t′

canonicalForms : ∀ {t A} → ∅ ⊢ t ∶ A → Value t → CanonicalForms t A
canonicalForms (abs t) abs  = _ , _ , refl
canonicalForms true  true   = inj₁ refl
canonicalForms false false  = inj₂ refl
\end{code}
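For example, unfolding the definition at type $$bool$$, the value $$true$$ is
a canonical form of type $$bool$$ by the left injection (a small check, not
part of the original development):
\begin{code}
canonical-true : CanonicalForms true bool
canonical-true = inj₁ refl
\end{code}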
## Progress
As before, the _progress_ theorem tells us that closed, well-typed
terms are not stuck: either a well-typed term is a value, or it
can take a reduction step. The proof is a relatively
straightforward extension of the progress proof we saw in the
[Stlc]({{ "Stlc" | relative_url }}) chapter. We'll give the proof in English
first, then the formal version.
\begin{code}
progress : ∀ {t A} → ∅ ⊢ t ∶ A → Value t ⊎ ∃ λ t′ → t ==> t′
\end{code}
_Proof_: By induction on the derivation of $$\vdash t : A$$.
- The last rule of the derivation cannot be `var`,
since a variable is never well typed in an empty context.
- The `true`, `false`, and `abs` cases are trivial, since in
each of these cases we can see by inspecting the rule that $$t$$
is a value.
- If the last rule of the derivation is `app`, then $$t$$ has the
form $$t_1\;t_2$$ for some $$t_1$$ and $$t_2$$, where we know that
$$t_1$$ and $$t_2$$ are also well typed in the empty context; in particular,
there exists a type $$B$$ such that $$\vdash t_1 : B\to A$$ and
$$\vdash t_2 : B$$. By the induction hypothesis, either $$t_1$$ is a
value or it can take a reduction step.
- If $$t_1$$ is a value, then consider $$t_2$$, which by the other
induction hypothesis must also either be a value or take a step.
- Suppose $$t_2$$ is a value. Since $$t_1$$ is a value with an
arrow type, it must be a lambda abstraction; hence $$t_1\;t_2$$
can take a step by `red`.
- Otherwise, $$t_2$$ can take a step, and hence so can $$t_1\;t_2$$
by `app2`.
- If $$t_1$$ can take a step, then so can $$t_1 t_2$$ by `app1`.
- If the last rule of the derivation is `if`, then $$t = \text{if }t_1
\text{ then }t_2\text{ else }t_3$$, where $$t_1$$ has type $$bool$$. By
the IH, $$t_1$$ either is a value or takes a step.
- If $$t_1$$ is a value, then since it has type $$bool$$ it must be
either $$true$$ or $$false$$. If it is $$true$$, then $$t$$ steps
to $$t_2$$; otherwise it steps to $$t_3$$.
- Otherwise, $$t_1$$ takes a step, and therefore so does $$t$$ (by `if`).
\begin{code}
progress (var x ())
progress true = inj₁ true
progress false = inj₁ false
progress (abs tA) = inj₁ abs
progress (app t₁A⇒B t₂B)
    with progress t₁A⇒B
... | inj₂ (_ , t₁⇒t₁′) = inj₂ (_ , app1 t₁⇒t₁′)
... | inj₁ v₁
    with progress t₂B
... | inj₂ (_ , t₂⇒t₂′) = inj₂ (_ , app2 v₁ t₂⇒t₂′)
... | inj₁ v₂
    with canonicalForms t₁A⇒B v₁
... | (_ , _ , refl) = inj₂ (_ , red v₂)
progress (if t₁bool then t₂A else t₃A)
    with progress t₁bool
... | inj₂ (_ , t₁⇒t₁′) = inj₂ (_ , if t₁⇒t₁′)
... | inj₁ v₁
    with canonicalForms t₁bool v₁
... | inj₁ refl = inj₂ (_ , iftrue)
... | inj₂ refl = inj₂ (_ , iffalse)
\end{code}
#### Exercise: 3 stars, optional (progress_from_term_ind)
Show that progress can also be proved by induction on terms
instead of induction on typing derivations.
\begin{code}
postulate
  progress′ : ∀ {t A} → ∅ ⊢ t ∶ A → Value t ⊎ ∃ λ t′ → t ==> t′
\end{code}
## Preservation
The other half of the type soundness property is the preservation
of types during reduction. For this, we need to develop some
technical machinery for reasoning about variables and
substitution. Working from top to bottom (from the high-level
property we are actually interested in to the lowest-level
technical lemmas that are needed by various cases of the more
interesting proofs), the story goes like this:
- The _preservation theorem_ is proved by induction on a typing
derivation, pretty much as we did in the [Stlc]({{ "Stlc" | relative_url }})
chapter. The one case that is significantly different is the one for the
$$red$$ rule, whose definition uses the substitution operation. To see that
this step preserves typing, we need to know that the substitution itself
does. So we prove a...
- _substitution lemma_, stating that substituting a (closed)
term $$s$$ for a variable $$x$$ in a term $$t$$ preserves the type
of $$t$$. The proof goes by induction on the form of $$t$$ and
requires looking at all the different cases in the definition
of substitution. This time, the tricky cases are the ones for
variables and for function abstractions. In both cases, we
discover that we need to take a term $$s$$ that has been shown
to be well-typed in some context $$\Gamma$$ and consider the same
term $$s$$ in a slightly different context $$\Gamma'$$. For this
we prove a...
- _context invariance_ lemma, showing that typing is preserved
under "inessential changes" to the context $$\Gamma$$---in
particular, changes that do not affect any of the free
variables of the term. And finally, for this, we need a
careful definition of...
- the _free variables_ of a term---i.e., those variables
mentioned in a term and not in the scope of an enclosing
function abstraction binding a variable of the same name.
To make Agda happy, we need to formalize the story in the opposite
order...
### Free Occurrences
A variable $$x$$ _appears free in_ a term _t_ if $$t$$ contains some
occurrence of $$x$$ that is not under an abstraction labeled $$x$$.
For example:
- $$y$$ appears free, but $$x$$ does not, in $$\lambda x : A\to B. x\;y$$
- both $$x$$ and $$y$$ appear free in $$(\lambda x:A\to B. x\;y) x$$
- no variables appear free in $$\lambda x:A\to B. \lambda y:A. x\;y$$
Formally:
\begin{code}
data _FreeIn_ (x : Id) : Term → Set where
var : x FreeIn var x
abs : ∀ {y A t} → y ≢ x → x FreeIn t → x FreeIn abs y A t
app1 : ∀ {t₁ t₂} → x FreeIn t₁ → x FreeIn app t₁ t₂
app2 : ∀ {t₁ t₂} → x FreeIn t₂ → x FreeIn app t₁ t₂
if1 : ∀ {t₁ t₂ t₃} → x FreeIn t₁ → x FreeIn (if t₁ then t₂ else t₃)
if2 : ∀ {t₁ t₂ t₃} → x FreeIn t₂ → x FreeIn (if t₁ then t₂ else t₃)
if3 : ∀ {t₁ t₂ t₃} → x FreeIn t₃ → x FreeIn (if t₁ then t₂ else t₃)
\end{code}
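As a sanity check (not part of the original file), the first example above can
be formalised directly: for any two distinct variables $$x$$ and $$y$$, the
variable $$y$$ appears free in $$\lambda x : bool\to bool.\ x\;y$$.
\begin{code}
y-free-example : ∀ {x y} → x ≢ y
               → y FreeIn abs x (bool ⇒ bool) (app (var x) (var y))
y-free-example x≢y = abs x≢y (app2 var)
\end{code}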
A term in which no variables appear free is said to be _closed_.
\begin{code}
Closed : Term → Set
Closed t = ∀ {x} → ¬ (x FreeIn t)
\end{code}
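For instance (again a check that is not part of the original file), the
identity function on booleans is closed: its only variable occurrence is bound
by the enclosing abstraction.
\begin{code}
closed-id : ∀ {x} → Closed (abs x bool (var x))
closed-id (abs x≢x var) = x≢x refl
\end{code}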
#### Exercise: 1 star (free-in)
If the definition of `_FreeIn_` is not crystal clear to
you, it is a good idea to take a piece of paper and write out the
rules in informal inference-rule notation. (Although it is a
rather low-level, technical definition, understanding it is
crucial to understanding substitution and its properties, which
are really the crux of the lambda-calculus.)
### Substitution
To prove that substitution preserves typing, we first need a
technical lemma connecting free variables and typing contexts: If
a variable $$x$$ appears free in a term $$t$$, and if we know $$t$$ is
well typed in context $$\Gamma$$, then it must be the case that
$$\Gamma$$ assigns a type to $$x$$.
\begin{code}
freeInCtxt : ∀ {x t A Γ} → x FreeIn t → Γ ⊢ t ∶ A → ∃ λ B → Γ x ≡ just B
\end{code}
_Proof_: We show, by induction on the proof that $$x$$ appears
free in $$t$$, that, for all contexts $$\Gamma$$, if $$t$$ is well
typed under $$\Gamma$$, then $$\Gamma$$ assigns some type to $$x$$.
- If the last rule used was `var`, then $$t = x$$, and from
the assumption that $$t$$ is well typed under $$\Gamma$$ we have
immediately that $$\Gamma$$ assigns a type to $$x$$.
- If the last rule used was `app1`, then $$t = t_1\;t_2$$ and $$x$$
appears free in $$t_1$$. Since $$t$$ is well typed under $$\Gamma$$,
we can see from the typing rules that $$t_1$$ must also be, and
the IH then tells us that $$\Gamma$$ assigns $$x$$ a type.
- Almost all the other cases are similar: $$x$$ appears free in a
subterm of $$t$$, and since $$t$$ is well typed under $$\Gamma$$, we
know the subterm of $$t$$ in which $$x$$ appears is well typed
under $$\Gamma$$ as well, and the IH gives us exactly the
conclusion we want.
- The only remaining case is `abs`. In this case $$t =
\lambda y:A.t'$$, and $$x$$ appears free in $$t'$$; we also know that
$$x$$ is different from $$y$$. The difference from the previous
cases is that whereas $$t$$ is well typed under $$\Gamma$$, its
body $$t'$$ is well typed under $$(\Gamma, y:A)$$, so the IH
allows us to conclude that $$x$$ is assigned some type by the
extended context $$(\Gamma, y:A)$$. To conclude that $$\Gamma$$
assigns a type to $$x$$, we appeal to the decidable equality for names
`_≟_`, noting that $$x$$ and $$y$$ are different variables.
\begin{code}
freeInCtxt var (var _ xA) = (_ , xA)
freeInCtxt (app1 x∈t₁) (app t₁A _ ) = freeInCtxt x∈t₁ t₁A
freeInCtxt (app2 x∈t₂) (app _ t₂A) = freeInCtxt x∈t₂ t₂A
freeInCtxt (if1 x∈t₁) (if t₁A then _ else _ ) = freeInCtxt x∈t₁ t₁A
freeInCtxt (if2 x∈t₂) (if _ then t₂A else _ ) = freeInCtxt x∈t₂ t₂A
freeInCtxt (if3 x∈t₃) (if _ then _ else t₃A) = freeInCtxt x∈t₃ t₃A
freeInCtxt {x} (abs {y} y≠x x∈t) (abs tB)
    with freeInCtxt x∈t tB
... | xA
    with y ≟ x
... | yes y=x = ⊥-elim (y≠x y=x)
... | no  _   = xA
\end{code}
Next, we'll need the fact that any term $$t$$ which is well typed in
the empty context is closed (it has no free variables).
#### Exercise: 2 stars, optional (∅⊢-closed)
\begin{code}
postulate
  ∅⊢-closed : ∀ {t A} → ∅ ⊢ t ∶ A → Closed t
\end{code}
<div class="hidden">
\begin{code}
∅⊢-closed′ : ∀ {t A} → ∅ ⊢ t ∶ A → Closed t
∅⊢-closed′ (var x ())
∅⊢-closed′ (app t₁A⇒B t₂A) (app1 x∈t₁) = ∅⊢-closed′ t₁A⇒B x∈t₁
∅⊢-closed′ (app t₁A⇒B t₂A) (app2 x∈t₂) = ∅⊢-closed′ t₂A x∈t₂
∅⊢-closed′ true ()
∅⊢-closed′ false ()
∅⊢-closed′ (if t₁bool then t₂A else t₃A) (if1 x∈t₁) = ∅⊢-closed′ t₁bool x∈t₁
∅⊢-closed′ (if t₁bool then t₂A else t₃A) (if2 x∈t₂) = ∅⊢-closed′ t₂A x∈t₂
∅⊢-closed′ (if t₁bool then t₂A else t₃A) (if3 x∈t₃) = ∅⊢-closed′ t₃A x∈t₃
∅⊢-closed′ (abs {x = x} tA) {y} (abs x≠y y∈t) with freeInCtxt y∈t tA
∅⊢-closed′ (abs {x = x} tA) {y} (abs x≠y y∈t) | A , yA with x ≟ y
∅⊢-closed′ (abs {x = x} tA) {y} (abs x≠y y∈t) | A , yA | yes x=y = x≠y x=y
∅⊢-closed′ (abs {x = x} tA) {y} (abs x≠y y∈t) | A , () | no _
\end{code}
</div>
Sometimes, when we have a proof $$\Gamma\vdash t : A$$, we will need to
replace $$\Gamma$$ by a different context $$\Gamma'$$. When is it safe
to do this? Intuitively, it must at least be the case that
$$\Gamma'$$ assigns the same types as $$\Gamma$$ to all the variables
that appear free in $$t$$. In fact, this is the only condition that
is needed.
\begin{code}
replaceCtxt : ∀ {Γ Γ′ t A}
            → (∀ {x} → x FreeIn t → Γ x ≡ Γ′ x)
            → Γ  ⊢ t ∶ A
            → Γ′ ⊢ t ∶ A
\end{code}
_Proof_: By induction on the derivation of
$$\Gamma \vdash t : T$$.
- If the last rule in the derivation was `var`, then $$t = x$$
and $$\Gamma x = T$$. By assumption, $$\Gamma' x = T$$ as well, and
hence $$\Gamma' \vdash t : T$$ by `var`.
- If the last rule was `abs`, then $$t = \lambda y:A. t'$$, with
$$T = A\to B$$ and $$\Gamma, y : A \vdash t' : B$$. The
induction hypothesis is that, for any context $$\Gamma''$$, if
$$\Gamma, y:A$$ and $$\Gamma''$$ assign the same types to all the
free variables in $$t'$$, then $$t'$$ has type $$B$$ under
$$\Gamma''$$. Let $$\Gamma'$$ be a context which agrees with
$$\Gamma$$ on the free variables in $$t$$; we must show
$$\Gamma' \vdash \lambda y:A. t' : A\to B$$.
By $$abs$$, it suffices to show that $$\Gamma', y:A \vdash t' : B$$.
By the IH (setting $$\Gamma'' = \Gamma', y:A$$), it suffices to show
that $$\Gamma, y:A$$ and $$\Gamma', y:A$$ agree on all the variables
that appear free in $$t'$$.
Any variable occurring free in $$t'$$ must be either $$y$$ or
some other variable. $$\Gamma, y:A$$ and $$\Gamma', y:A$$
clearly agree on $$y$$. Otherwise, note that any variable other
than $$y$$ that occurs free in $$t'$$ also occurs free in
$$t = \lambda y:A. t'$$, and by assumption $$\Gamma$$ and
$$\Gamma'$$ agree on all such variables; hence so do $$\Gamma, y:A$$ and
$$\Gamma', y:A$$.
- If the last rule was `app`, then $$t = t_1\;t_2$$, with
$$\Gamma \vdash t_1:A\to T$$ and $$\Gamma \vdash t_2:A$$.
One induction hypothesis states that for all contexts $$\Gamma'$$,
if $$\Gamma'$$ agrees with $$\Gamma$$ on the free variables in $$t_1$$,
then $$t_1$$ has type $$A\to T$$ under $$\Gamma'$$; there is a similar IH
for $$t_2$$. We must show that $$t_1\;t_2$$ also has type $$T$$ under
$$\Gamma'$$, given the assumption that $$\Gamma'$$ agrees with
$$\Gamma$$ on all the free variables in $$t_1\;t_2$$. By `app`, it
suffices to show that $$t_1$$ and $$t_2$$ each have the same type
under $$\Gamma'$$ as under $$\Gamma$$. But all free variables in
$$t_1$$ are also free in $$t_1\;t_2$$, and similarly for $$t_2$$;
hence the desired result follows from the induction hypotheses.
\begin{code}
replaceCtxt f (var x xA) rewrite f var = var x xA
replaceCtxt f (app t₁A⇒B t₂A)
  = app (replaceCtxt (f ∘ app1) t₁A⇒B) (replaceCtxt (f ∘ app2) t₂A)
replaceCtxt {Γ} {Γ′} f (abs {.Γ} {x} {A} {B} {t} tB)
  = abs (replaceCtxt f′ tB)
  where
  f′ : ∀ {y} → y FreeIn t → (Γ , x ∶ A) y ≡ (Γ′ , x ∶ A) y
  f′ {y} y∈t with x ≟ y
  ... | yes _   = refl
  ... | no  x≠y = f (abs x≠y y∈t)
replaceCtxt _ true  = true
replaceCtxt _ false = false
replaceCtxt f (if t₁bool then t₂A else t₃A)
  = if replaceCtxt (f ∘ if1) t₁bool
    then replaceCtxt (f ∘ if2) t₂A
    else replaceCtxt (f ∘ if3) t₃A
\end{code}
Now we come to the conceptual heart of the proof that reduction
preserves types---namely, the observation that _substitution_
preserves types.
Formally, the so-called _Substitution Lemma_ says this: Suppose we
have a term $$t$$ with a free variable $$x$$, and suppose we've been
able to assign a type $$T$$ to $$t$$ under the assumption that $$x$$ has
some type $$U$$. Also, suppose that we have some other term $$v$$ and
that we've shown that $$v$$ has type $$U$$. Then, since $$v$$ satisfies
the assumption we made about $$x$$ when typing $$t$$, we should be
able to substitute $$v$$ for each of the occurrences of $$x$$ in $$t$$
and obtain a new term that still has type $$T$$.
_Lemma_: If $$\Gamma,x:U \vdash t : T$$ and $$\vdash v : U$$, then
$$\Gamma \vdash [x:=v]t : T$$.
\begin{code}
[:=]-preserves-⊢ : ∀ {Γ x A t v B}
                 → ∅ ⊢ v ∶ A
                 → Γ , x ∶ A ⊢ t ∶ B
                 → Γ ⊢ [ x := v ] t ∶ B
\end{code}
One technical subtlety in the statement of the lemma is that
we assign $$v$$ the type $$U$$ in the _empty_ context---in other
words, we assume $$v$$ is closed. This assumption considerably
simplifies the $$abs$$ case of the proof (compared to assuming
$$\Gamma \vdash v : U$$, which would be the other reasonable assumption
at this point) because the context invariance lemma then tells us
that $$v$$ has type $$U$$ in any context at all---we don't have to
worry about free variables in $$v$$ clashing with the variable being
introduced into the context by $$abs$$.
The substitution lemma can be viewed as a kind of "commutation"
property. Intuitively, it says that substitution and typing can
be done in either order: we can either assign types to the terms
$$t$$ and $$v$$ separately (under suitable contexts) and then combine
them using substitution, or we can substitute first and then
assign a type to $$[x:=v]t$$---the result is the same either
way.
_Proof_: We show, by induction on $$t$$, that for all $$T$$ and
$$\Gamma$$, if $$\Gamma,x:U \vdash t : T$$ and $$\vdash v : U$$, then
$$\Gamma \vdash [x:=v]t : T$$.
- If $$t$$ is a variable there are two cases to consider,
depending on whether $$t$$ is $$x$$ or some other variable.
- If $$t = x$$, then from the fact that $$\Gamma, x:U \vdash x :
T$$ we conclude that $$U = T$$. We must show that $$[x:=v]x =
v$$ has type $$T$$ under $$\Gamma$$, given the assumption that
$$v$$ has type $$U = T$$ under the empty context. This
follows from context invariance: if a closed term has type
$$T$$ in the empty context, it has that type in any context.
- If $$t$$ is some variable $$y$$ that is not equal to $$x$$, then
we need only note that $$y$$ has the same type under $$\Gamma,
x:U$$ as under $$\Gamma$$.
- If $$t$$ is an abstraction $$\lambda y:T_{11}. t_{12}$$, then the IH tells us,
for all $$\Gamma'$$ and $$T'$$, that if $$\Gamma',x:U \vdash t_{12}:T'$$
and $$\vdash v:U$$, then $$\Gamma' \vdash [x:=v]t_{12}:T'$$.
The substitution in the conclusion behaves differently
depending on whether $$x$$ and $$y$$ are the same variable.
First, suppose $$x = y$$. Then, by the definition of
substitution, $$[x:=v]t = t$$, so we just need to show $$\Gamma \vdash
t : T$$. But we know $$\Gamma,x:U \vdash t : T$$, and, since $$y$$
does not appear free in $$\lambda y:T_{11}. t_{12}$$, the context invariance
lemma yields $$\Gamma \vdash t : T$$.
Second, suppose $$x \neq y$$. We know $$\Gamma,x:U,y:T_{11} \vdash
t_{12}:T_{12}$$ by inversion of the typing relation, from which
$$\Gamma,y:T_{11},x:U \vdash t_{12}:T_{12}$$ follows by the context invariance
lemma, so the IH applies, giving us $$\Gamma,y:T_{11} \vdash
[x:=v]t_{12}:T_{12}$$. By $$abs$$, $$\Gamma \vdash \lambda y:T_{11}.
[x:=v]t_{12} : T_{11}\to T_{12}$$, and by the definition of substitution (noting
that $$x \neq y$$), $$[x:=v]t = \lambda y:T_{11}. [x:=v]t_{12}$$, so $$\Gamma
\vdash [x:=v]t : T_{11}\to T_{12}$$ as required.
- If $$t$$ is an application $$t_1 t_2$$, the result follows
straightforwardly from the definition of substitution and the
induction hypotheses.
- The remaining cases are similar to the application case.
One more technical note: This proof is a rare case where an
induction on terms, rather than typing derivations, yields a
simpler argument. The reason for this is that the assumption
$$\Gamma,x:U \vdash t : T$$ is not completely generic, in the
sense that one of the "slots" in the typing relation---namely the
context---is not just a variable, and this means that induction on
the typing derivation does not directly give us the induction hypothesis
that we want. It is possible to work around this, but the needed
generalization is a little tricky. The term $$t$$, on the other
hand, _is_ completely generic.
\begin{code}
[:=]-preserves-⊢ {Γ} {x} vA (var y y∈Γ) with x ≟ y
... | yes x=y = {!!}
... | no x≠y = {!!}
[:=]-preserves-⊢ vA (abs tB) = {!!}
[:=]-preserves-⊢ vA (app t₁A⇒B t₂A) =
  app ([:=]-preserves-⊢ vA t₁A⇒B) ([:=]-preserves-⊢ vA t₂A)
[:=]-preserves-⊢ vA true  = true
[:=]-preserves-⊢ vA false = false
[:=]-preserves-⊢ vA (if t₁bool then t₂B else t₃B) =
  if   [:=]-preserves-⊢ vA t₁bool
  then [:=]-preserves-⊢ vA t₂B
  else [:=]-preserves-⊢ vA t₃B
\end{code}
### Main Theorem
We now have the tools we need to prove preservation: if a closed
term $$t$$ has type $$T$$ and takes a step to $$t'$$, then $$t'$$
is also a closed term with type $$T$$. In other words, the small-step
reduction relation preserves types.
    Theorem preservation : forall t t' T,
      empty ⊢ t : T  →
      t ==> t'  →
      empty ⊢ t' : T.
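In this Agda development the theorem might be stated as follows; it is given
as a postulate here, a sketch that is not part of the original file, since the
proof has not yet been ported.
\begin{code}
postulate
  preservation : ∀ {t t′ A} → ∅ ⊢ t ∶ A → t ==> t′ → ∅ ⊢ t′ ∶ A
\end{code}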
_Proof_: By induction on the derivation of $$\vdash t : T$$.
- We can immediately rule out $$var$$, $$abs$$, $$true$$, and
$$false$$ as the final rules in the derivation, since in each of
these cases $$t$$ cannot take a step.
- If the last rule in the derivation was $$app$$, then $$t = t_1
t_2$$. There are three cases to consider, one for each rule that
could have been used to show that $$t_1 t_2$$ takes a step to $$t'$$.
- If $$t_1 t_2$$ takes a step by $$app1$$, with $$t_1$$ stepping to
$$t_1'$$, then by the IH $$t_1'$$ has the same type as $$t_1$$, and
hence $$t_1' t_2$$ has the same type as $$t_1 t_2$$.
- The $$app2$$ case is similar.
- If $$t_1 t_2$$ takes a step by $$red$$, then $$t_1 =
\lambda x:T_{11}.t_{12}$$ and $$t_1 t_2$$ steps to $$[x:=t_2]t_{12}$$; the
desired result now follows from the fact that substitution
preserves types.
- If the last rule in the derivation was $$if$$, then $$t = \text{if } t_1
\text{ then } t_2 \text{ else } t_3$$, and there are again three cases depending on
how $$t$$ steps.
- If $$t$$ steps to $$t_2$$ or $$t_3$$, the result is immediate, since
$$t_2$$ and $$t_3$$ have the same type as $$t$$.
- Otherwise, $$t$$ steps by $$if$$, and the desired conclusion
follows directly from the induction hypothesis.
    Proof with eauto.
      remember (@empty ty) as Gamma.
      intros t t' T HT. generalize dependent t'.
      induction HT;
        intros t' HE; subst Gamma; subst;
        try solve [inversion HE; subst; auto].
      - (* app *)
        inversion HE; subst...
        (* Most of the cases are immediate by induction,
           and eauto takes care of them *)
        + (* red *)
          apply substitution_preserves_typing with T11...
          inversion HT1...
    Qed.
#### Exercise: 2 stars, recommended (subject_expansion_stlc)
An exercise in the [Stlc]({{ "Stlc" | relative_url }}) chapter asked about the
subject expansion property for the simple language of arithmetic and boolean
expressions. Does this property hold for STLC? That is, is it always the case
that, if $$t ==> t'$$ and $$\vdash t' : T$$, then $$\vdash t : T$$? If
so, prove it. If not, give a counter-example not involving conditionals.
    (* FILL IN HERE *)
## Type Soundness
#### Exercise: 2 stars, optional (type_soundness)
Put progress and preservation together and show that a well-typed
term can _never_ reach a stuck state.
    Definition stuck (t:tm) : Prop :=
      (normal_form step) t /\ ~ value t.

    Corollary soundness : forall t t' T,
      empty ⊢ t : T →
      t ==>* t' →
      ~(stuck t').
    Proof.
      intros t t' T Hhas_type Hmulti. unfold stuck.
      intros [Hnf Hnot_val]. unfold normal_form in Hnf.
      induction Hmulti.
    (* FILL IN HERE *) Admitted.
## Uniqueness of Types
#### Exercise: 3 stars (types_unique)
Another nice property of the STLC is that types are unique: a
given term (in a given context) has at most one type.
Formalize this statement and prove it.
    (* FILL IN HERE *)
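One possible Agda rendering of the statement is the following sketch (not
part of the original file); the proof is left as the exercise.
\begin{code}
postulate
  unique-types : ∀ {Γ t A B} → Γ ⊢ t ∶ A → Γ ⊢ t ∶ B → A ≡ B
\end{code}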
## Additional Exercises
#### Exercise: 1 star (progress_preservation_statement)
Without peeking at their statements above, write down the progress
and preservation theorems for the simply typed lambda-calculus.
#### Exercise: 2 stars (stlc_variation1)
Suppose we add a new term $$zap$$ with the following reduction rule
      --------- (ST_Zap)
      t ==> zap
and the following typing rule:
      ---------------- (T_Zap)
      Gamma ⊢ zap : T
Which of the following properties of the STLC remain true in
the presence of these rules? For each property, write either
"remains true" or "becomes false." If a property becomes
false, give a counterexample.
- Determinism of $$step$$
- Progress
- Preservation
#### Exercise: 2 stars (stlc_variation2)
Suppose instead that we add a new term $$foo$$ with the following
reduction rules:
      ----------------- (ST_Foo1)
      (λ x:A. x) ==> foo

      ------------ (ST_Foo2)
      foo ==> true
Which of the following properties of the STLC remain true in
the presence of this rule? For each one, write either
"remains true" or else "becomes false." If a property becomes
false, give a counterexample.
- Determinism of $$step$$
- Progress
- Preservation
#### Exercise: 2 stars (stlc_variation3)
Suppose instead that we remove the rule $$app1$$ from the $$step$$
relation. Which of the following properties of the STLC remain
true in the absence of this rule? For each one, write either
"remains true" or else "becomes false." If a property becomes
false, give a counterexample.
- Determinism of $$step$$
- Progress
- Preservation
#### Exercise: 2 stars, optional (stlc_variation4)
Suppose instead that we add the following new rule to the
reduction relation:
      ---------------------------------- (ST_FunnyIfTrue)
      (if true then t_1 else t_2) ==> true
Which of the following properties of the STLC remain true in
the presence of this rule? For each one, write either
"remains true" or else "becomes false." If a property becomes
false, give a counterexample.
- Determinism of $$step$$
- Progress
- Preservation
#### Exercise: 2 stars, optional (stlc_variation5)
Suppose instead that we add the following new rule to the typing
relation:
      Gamma ⊢ t_1 : bool→bool→bool
      Gamma ⊢ t_2 : bool
      ------------------------------ (T_FunnyApp)
      Gamma ⊢ t_1 t_2 : bool
Which of the following properties of the STLC remain true in
the presence of this rule? For each one, write either
"remains true" or else "becomes false." If a property becomes
false, give a counterexample.
- Determinism of $$step$$
- Progress
- Preservation
#### Exercise: 2 stars, optional (stlc_variation6)
Suppose instead that we add the following new rule to the typing
relation:
      Gamma ⊢ t_1 : bool
      Gamma ⊢ t_2 : bool
      --------------------- (T_FunnyApp')
      Gamma ⊢ t_1 t_2 : bool
Which of the following properties of the STLC remain true in
the presence of this rule? For each one, write either
"remains true" or else "becomes false." If a property becomes
false, give a counterexample.
- Determinism of $$step$$
- Progress
- Preservation
#### Exercise: 2 stars, optional (stlc_variation7)
Suppose we add the following new rule to the typing relation
of the STLC:
      ------------------- (T_FunnyAbs)
      ⊢ λ x:bool. t : bool
Which of the following properties of the STLC remain true in
the presence of this rule? For each one, write either
"remains true" or else "becomes false." If a property becomes
false, give a counterexample.
- Determinism of $$step$$
- Progress
- Preservation
### Exercise: STLC with Arithmetic
To see how the STLC might function as the core of a real
programming language, let's extend it with a concrete base
type of numbers and some constants and primitive
operators.
To types, we add a base type of natural numbers (and remove
booleans, for brevity).
    Inductive ty : Type :=
      | TArrow : ty → ty → ty
      | TNat : ty.
To terms, we add natural number constants, along with
successor, predecessor, multiplication, and zero-testing.
    Inductive tm : Type :=
      | tvar : id → tm
      | tapp : tm → tm → tm
      | tabs : id → ty → tm → tm
      | tnat : nat → tm
      | tsucc : tm → tm
      | tpred : tm → tm
      | tmult : tm → tm → tm
      | tif0 : tm → tm → tm → tm.
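In this Agda development the extension might begin with datatypes along the
following lines (a sketch, not part of the original file; the names Type′ and
Term′ and their constructors are placeholders, chosen to avoid clashing with
Type and Term imported from StlcOld):
\begin{code}
-- Not part of the original file: a sketch of the extended types and terms.
open import Data.Nat using (ℕ)

data Type′ : Set where
  _⇒_ : Type′ → Type′ → Type′   -- function types
  nat : Type′                   -- base type of natural numbers

data Term′ : Set where
  var            : Id → Term′                     -- variable
  app            : Term′ → Term′ → Term′          -- application
  abs            : Id → Type′ → Term′ → Term′     -- abstraction
  lit            : ℕ → Term′                      -- natural number constant
  succ           : Term′ → Term′                  -- successor
  pred           : Term′ → Term′                  -- predecessor
  mult           : Term′ → Term′ → Term′          -- multiplication
  if0_then_else_ : Term′ → Term′ → Term′ → Term′  -- test for zero
\end{code}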
#### Exercise: 4 stars (stlc_arith)
Finish formalizing the definition and properties of the STLC extended
with arithmetic. Specifically:
- Copy the whole development of STLC that we went through above (from
the definition of values through the Type Soundness theorem), and
paste it into the file at this point.
- Extend the definitions of the $$subst$$ operation and the $$step$$
relation to include appropriate clauses for the arithmetic operators.
- Extend the proofs of all the properties (up to $$soundness$$) of
the original STLC to deal with the new syntactic forms. Make
sure Agda accepts the whole file.