diff --git a/out/Maps.md b/out/Maps.md index 869200f1..76d793b3 100644 --- a/out/Maps.md +++ b/out/Maps.md @@ -818,8 +818,6 @@ We define some identifiers for future use. -## Extensionality - ## Total Maps Our main job in this chapter will be to build a definition of @@ -841,67 +839,67 @@ that is not present in the map.
-TotalMap : Set → Set TotalMap A = Id → A-Intuitively, a total map over an element type $$A$$ _is_ just a -function that can be used to look up ids, yielding $$A$$s. +Intuitively, a total map over an element type `A` _is_ just a +function that can be used to look up ids, yielding `A`s.
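For instance (a small sketch, not part of the original text), any ordinary Agda function from `Id` to an element type already is a total map. Assuming the natural numbers and the identifier `x` declared at the start of the chapter are in scope, we can write one directly and look it up:

\begin{code}
-- Hypothetical example: a constant function of type Id → ℕ is itself a
-- TotalMap ℕ, and looking up any key simply applies the function.
exampleMap : TotalMap ℕ
exampleMap _ = 42

exampleMap-x : exampleMap x ≡ 42
exampleMap-x = refl
\end{code}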
-module TotalMap where @@ -913,932 +911,321 @@ applied to any id.- always : ∀ {A} → A → TotalMap :A ∀ {A} → A → TotalMap A always v x = vMore interesting is the update function, which (as before) takes -a map $$ρ$$, a key $$x$$, and a value $$v$$ and returns a new map that -takes $$x$$ to $$v$$ and takes every other key to whatever $$ρ$$ does. +a map `ρ`, a key `x`, and a value `v` and returns a new map that +takes `x` to `v` and takes every other key to whatever `ρ` does.- infixl 100 _,_↦_ + + _,_↦_ : ∀ {A} → TotalMap A → TotalMapId → A → A → Id →TotalMap A + (ρ →, TotalMapx ↦ A - (ρ , x ↦ v) y with x ≟ y ... | yes x=y = v ... | no x≠y = ρ yThis definition is a nice example of higher-order programming. -The update function takes a _function_ $$ρ$$ and yields a new +The update function takes a _function_ `ρ` and yields a new function that behaves like the desired map. -We define handy abbreviations for updating a map two, three, or four times. - - +For example, we can build a map taking ids to naturals, where `x` +maps to 42, `y` maps to 69, and every other key maps to 0, as follows:- _,_↦_,_↦_ : ∀ {A} → TotalMap A → Id → A → Id → A → TotalMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ = (ρ , x₁ ↦ v₁), x₂ ↦ v₂ - - _,_↦_,_↦_,_↦_ : ∀ {A} → TotalMap A → Id → A → Id → A → Id → A → TotalMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ , x₃ ↦ v₃ = ((ρ , x₁ ↦ v₁), x₂ ↦ v₂), x₃ ↦ v₃ - - _,_↦_,_↦_,_↦_,_↦_ : ∀ {A} → TotalMap A → Id → A → Id → A → Id → A → Id → A → TotalMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ , x₃ ↦ v₃ , x₄ ↦ v₄ = (((ρ , x₁ ↦ v₁), x₂ ↦ v₂), x₃ ↦ v₃), x₄ ↦ v₄ - -- -For example, we can build a map taking ids to naturals, where $$x$$ -maps to 42, $$y$$ maps to 69, and every other key maps to 0, as follows: - -- - ρ₀ : TotalMap ℕ ρ₀ = always 0 , x ↦ 42 , y ↦ 69 @@ -1850,118 +1237,118 @@ application!- test₁ : ρ₀ x ≡ 42 test₁ = refl test₂ : ρ₀ y ≡ 69 test₂ = refl test₃ : ρ₀ z ≡ 0 test₃ = refl @@ -1977,84 +1364,84 @@ The `always` map returns its default element for all keys:- postulate apply-always : ∀ {A} (v : A) (x : Id) → always v x ≡ v @@ -2063,100 +1450,100 @@ The `always` map returns its default element for all keys:#### Exercise: 2 stars, optional (update-eq) -Next, if we update a map $$ρ$$ at a key $$x$$ with a new value $$v$$ -and then look up $$x$$ in the map resulting from the update, we get -back $$v$$: +Next, if we update a map `ρ` at a key `x` with a new value `v` +and then look up `x` in the map resulting from the update, we get +back `v`:- apply-always′ : ∀ {A} (v : A) (x : Id) → always v x ≡ v apply-always′ v x = refl @@ -2164,127 +1551,127 @@ The `always` map returns its default element for all keys:- postulate update-eq : ∀ {A} (ρ : TotalMap A) (x : Id) (v : A) → (ρ , x ↦ v) x ≡ v @@ -2293,211 +1680,211 @@ back $$v$$:#### Exercise: 2 stars, optional (update-neq) -On the other hand, if we update a map $$m$$ at a key $$x$$ and -then look up a _different_ key $$y$$ in the resulting map, we get -the same result that $$m$$ would have given: +On the other hand, if we update a map `m` at a key `x` and +then look up a _different_ key `y` in the resulting map, we get +the same result that `m` would have given:- update-eq′ : ∀ {A} (ρ : TotalMap A) (x : Id) (v : A) → (ρ , x ↦ v) x ≡ v update-eq′ ρ x v with x ≟ x ... | yes x≡x = refl ... | no x≢x = ⊥-elim (x≢x refl) @@ -2505,261 +1892,261 @@ back $$v$$:- update-neq : ∀ {A} (ρ : TotalMap A) (x : Id) (v : A) (y : Id) → x ≢ y → (ρ , x ↦ v) y ≡ ρ y update-neq ρ x v y x≢y with x ≟ y ... | yes x≡y = ⊥-elim (x≢y x≡y) ... 
| no _ = refl @@ -2770,279 +2157,279 @@ show two maps equal we will need to postulate extensionality.- postulate extensionality : ∀ {A : Set} {ρ ρ′ : TotalMap A} → (∀ x → ρ x ≡ ρ′ x) → ρ ≡ ρ′#### Exercise: 2 stars, optional (update-shadow) -If we update a map $$ρ$$ at a key $$x$$ with a value $$v$$ and then -update again with the same key $$x$$ and another value $$w$$, the +If we update a map `ρ` at a key `x` with a value `v` and then +update again with the same key `x` and another value `w`, the resulting map behaves the same (gives the same result when applied to any key) as the simpler map obtained by performing just -the second update on $$ρ$$: +the second update on `ρ`:- postulate update-shadow : ∀ {A} (ρ : TotalMap A) (x : Id) (v w : A) → (ρ , x ↦ v , x ↦ w) ≡ (ρ , x ↦ w) @@ -3051,386 +2438,386 @@ the second update on $$ρ$$:#### Exercise: 2 stars (update-same) -Prove the following theorem, which states that if we update a map $$ρ$$ to -assign key $$x$$ the same value as it already has in $$ρ$$, then the -result is equal to $$ρ$$: +Prove the following theorem, which states that if we update a map `ρ` to +assign key `x` the same value as it already has in `ρ`, then the +result is equal to `ρ`:- update-shadow′ : ∀ {A} (ρ : TotalMap A) (x : Id) (v w : A) → ((ρ , x ↦ v) , x ↦ w) ≡ (ρ , x ↦ w) update-shadow′ ρ x v w = extensionality lemma where lemma : ∀ y → ((ρ , x ↦ v) , x ↦ w) y ≡ (ρ , x ↦ w) y lemma y with x ≟ y ... | yes refl = refl ... | no x≢y = update-neq ρ x v y x≢y @@ -3438,110 +2825,110 @@ the second update on $$ρ$$:- postulate update-same : ∀ {A} (ρ : TotalMap A) (x : Id) → (ρ , x ↦ ρ x) ≡ ρ @@ -3550,269 +2937,269 @@ result is equal to $$ρ$$:#### Exercise: 2 stars, optional (update-neq) -On the other hand, if we update a map $$m$$ at a key $$x$$ and -then look up a _different_ key $$y$$ in the resulting map, we get -the same result that $$m$$ would have given: +On the other hand, if we update a map `m` at a key `x` and +then look up a _different_ key `y` in the resulting map, we get +the same result that `m` would have given: \begin{code} update-neq : ∀ {A} (ρ : TotalMap A) (x : Id) (v : A) (y : Id) @@ -248,11 +229,11 @@ show two maps equal we will need to postulate extensionality. \end{code} #### Exercise: 2 stars, optional (update-shadow) -If we update a map $$ρ$$ at a key $$x$$ with a value $$v$$ and then -update again with the same key $$x$$ and another value $$w$$, the +If we update a map `ρ` at a key `x` with a value `v` and then +update again with the same key `x` and another value `w`, the resulting map behaves the same (gives the same result when applied to any key) as the simpler map obtained by performing just -the second update on $$ρ$$: +the second update on `ρ`: \begin{code} postulate @@ -274,9 +255,9 @@ the second update on $$ρ$$: #### Exercise: 2 stars (update-same) -Prove the following theorem, which states that if we update a map $$ρ$$ to -assign key $$x$$ the same value as it already has in $$ρ$$, then the -result is equal to $$ρ$$: +Prove the following theorem, which states that if we update a map `ρ` to +assign key `x` the same value as it already has in `ρ`, then the +result is equal to `ρ`: \begin{code} postulate @@ -297,7 +278,7 @@ result is equal to $$ρ$$: #### Exercise: 3 stars, recommended (update-permute) Prove one final property of the `update` function: If we update a map -$$m$$ at two distinct keys, it doesn't matter in which order we do the +`m` at two distinct keys, it doesn't matter in which order we do the updates. 
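Before the formal statement, here is a quick sanity check (a sketch, not part of the original exercise): the two update orders agree on the example map built from `always 0` above, assuming the distinct identifiers `x` and `y` and that `_≟_` computes on them as in the earlier tests.

\begin{code}
  -- Hypothetical example: updating at the distinct keys x and y in either
  -- order gives the same result when looked up at x (both sides reduce to 42).
  permute-example : (always 0 , x ↦ 42 , y ↦ 69) x ≡ (always 0 , y ↦ 69 , x ↦ 42) x
  permute-example = refl
\end{code}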
\begin{code} @@ -354,21 +335,11 @@ module PartialMap where \end{code} \begin{code} + infixl 100 _,_↦_ + _,_↦_ : ∀ {A} (ρ : PartialMap A) (x : Id) (v : A) → PartialMap A ρ , x ↦ v = TotalMap._,_↦_ ρ x (just v) \end{code} -As before, we define handy abbreviations for updating a map two, three, or four times. - -\begin{code} - _,_↦_,_↦_ : ∀ {A} → PartialMap A → Id → A → Id → A → PartialMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ = (ρ , x₁ ↦ v₁), x₂ ↦ v₂ - - _,_↦_,_↦_,_↦_ : ∀ {A} → PartialMap A → Id → A → Id → A → Id → A → PartialMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ , x₃ ↦ v₃ = ((ρ , x₁ ↦ v₁), x₂ ↦ v₂), x₃ ↦ v₃ - - _,_↦_,_↦_,_↦_,_↦_ : ∀ {A} → PartialMap A → Id → A → Id → A → Id → A → Id → A → PartialMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ , x₃ ↦ v₃ , x₄ ↦ v₄ = (((ρ , x₁ ↦ v₁), x₂ ↦ v₂), x₃ ↦ v₃), x₄ ↦ v₄ -\end{code} We now lift all of the basic lemmas about total maps to partial maps. diff --git a/src/Stlc.lagda b/src/Stlc.lagda index dc525534..4a15cae3 100644 --- a/src/Stlc.lagda +++ b/src/Stlc.lagda @@ -199,6 +199,8 @@ example₁ = ⟨ step₀ ⟩ >> ⟨ step₁ ⟩ >> ⟨ step₂ ⟩ >> ⟨ step Context : Set Context = PartialMap Type +infix 50 _⊢_∈_ + data _⊢_∈_ : Context → Term → Type → Set where Ax : ∀ {Γ x A} → Γ x ≡ just A → diff --git a/src/StlcProp.lagda b/src/StlcProp.lagda index 0980a17d..14ba4032 100644 --- a/src/StlcProp.lagda +++ b/src/StlcProp.lagda @@ -28,8 +28,8 @@ theorem. As we saw for the simple calculus in the [Stlc]({{ "Stlc" | relative_url }}) chapter, the first step in establishing basic properties of reduction and types is to identify the possible _canonical forms_ (i.e., well-typed closed values) -belonging to each type. For $$bool$$, these are the boolean values $$true$$ and -$$false$$. For arrow types, the canonical forms are lambda-abstractions. +belonging to each type. For `bool`, these are the boolean values `true` and +`false`. For arrow types, the canonical forms are lambda-abstractions. \begin{code} data canonical_for_ : Term → Type → Set where @@ -63,43 +63,43 @@ first, then the formal version. progress : ∀ {M A} → ∅ ⊢ M ∈ A → value M ⊎ ∃ λ N → M ⟹ N \end{code} -_Proof_: By induction on the derivation of $$\vdash t : A$$. +_Proof_: By induction on the derivation of `\vdash t : A`. - The last rule of the derivation cannot be `var`, since a variable is never well typed in an empty context. - The `true`, `false`, and `abs` cases are trivial, since in - each of these cases we can see by inspecting the rule that $$t$$ + each of these cases we can see by inspecting the rule that `t` is a value. - - If the last rule of the derivation is `app`, then $$t$$ has the - form $$t_1\;t_2$$ for som e$$t_1$$ and $$t_2$$, where we know that - $$t_1$$ and $$t_2$$ are also well typed in the empty context; in particular, - there exists a type $$B$$ such that $$\vdash t_1 : A\to T$$ and - $$\vdash t_2 : B$$. By the induction hypothesis, either $$t_1$$ is a + - If the last rule of the derivation is `app`, then `t` has the + form `t_1\;t_2` for som e`t_1` and `t_2`, where we know that + `t_1` and `t_2` are also well typed in the empty context; in particular, + there exists a type `B` such that `\vdash t_1 : A\to T` and + `\vdash t_2 : B`. By the induction hypothesis, either `t_1` is a value or it can take a reduction step. - - If $$t_1$$ is a value, then consider $$t_2$$, which by the other + - If `t_1` is a value, then consider `t_2`, which by the other induction hypothesis must also either be a value or take a step. - - Suppose $$t_2$$ is a value. 
Since $$t_1$$ is a value with an - arrow type, it must be a lambda abstraction; hence $$t_1\;t_2$$ + - Suppose `t_2` is a value. Since `t_1` is a value with an + arrow type, it must be a lambda abstraction; hence `t_1\;t_2` can take a step by `red`. - - Otherwise, $$t_2$$ can take a step, and hence so can $$t_1\;t_2$$ + - Otherwise, `t_2` can take a step, and hence so can `t_1\;t_2` by `app2`. - - If $$t_1$$ can take a step, then so can $$t_1 t_2$$ by `app1`. + - If `t_1` can take a step, then so can `t_1 t_2` by `app1`. - - If the last rule of the derivation is `if`, then $$t = \text{if }t_1 - \text{ then }t_2\text{ else }t_3$$, where $$t_1$$ has type $$bool$$. By - the IH, $$t_1$$ either is a value or takes a step. + - If the last rule of the derivation is `if`, then `t = \text{if }t_1 + \text{ then }t_2\text{ else }t_3`, where `t_1` has type `bool`. By + the IH, `t_1` either is a value or takes a step. - - If $$t_1$$ is a value, then since it has type $$bool$$ it must be - either $$true$$ or $$false$$. If it is $$true$$, then $$t$$ steps - to $$t_2$$; otherwise it steps to $$t_3$$. + - If `t_1` is a value, then since it has type `bool` it must be + either `true` or `false`. If it is `true`, then `t` steps + to `t_2`; otherwise it steps to `t_3`. - - Otherwise, $$t_1$$ takes a step, and therefore so does $$t$$ (by `if`). + - Otherwise, `t_1` takes a step, and therefore so does `t` (by `if`). \begin{code} progress (Ax ()) @@ -141,23 +141,23 @@ interesting proofs), the story goes like this: - The _preservation theorem_ is proved by induction on a typing derivation, pretty much as we did in the [Stlc]({{ "Stlc" | relative_url }}) chapter. The one case that is significantly different is the one for the - $$red$$ rule, whose definition uses the substitution operation. To see that + `red` rule, whose definition uses the substitution operation. To see that this step preserves typing, we need to know that the substitution itself does. So we prove a... - _substitution lemma_, stating that substituting a (closed) - term $$s$$ for a variable $$x$$ in a term $$t$$ preserves the type - of $$t$$. The proof goes by induction on the form of $$t$$ and + term `s` for a variable `x` in a term `t` preserves the type + of `t`. The proof goes by induction on the form of `t` and requires looking at all the different cases in the definition of substitition. This time, the tricky cases are the ones for variables and for function abstractions. In both cases, we - discover that we need to take a term $$s$$ that has been shown - to be well-typed in some context $$\Gamma$$ and consider the same - term $$s$$ in a slightly different context $$\Gamma'$$. For this + discover that we need to take a term `s` that has been shown + to be well-typed in some context `\Gamma` and consider the same + term `s` in a slightly different context `\Gamma'`. For this we prove a... - _context invariance_ lemma, showing that typing is preserved - under "inessential changes" to the context $$\Gamma$$---in + under "inessential changes" to the context `\Gamma`---in particular, changes that do not affect any of the free variables of the term. And finally, for this, we need a careful definition of... @@ -172,13 +172,13 @@ order... ### Free Occurrences -A variable $$x$$ _appears free in_ a term $$M$$ if $$M$$ contains some -occurrence of $$x$$ that is not under an abstraction over $$x$$. +A variable `x` _appears free in_ a term `M` if `M` contains some +occurrence of `x` that is not under an abstraction over `x`. 
For example: - - $$y$$ appears free, but $$x$$ does not, in $$λᵀ x ∈ (A ⇒ B) ⇒ x ·ᵀ y$$ - - both $$x$$ and $$y$$ appear free in $$(λᵀ x ∈ (A ⇒ B) ⇒ x ·ᵀ y) ·ᵀ x$$ - - no variables appear free in $$λᵀ x ∈ (A ⇒ B) ⇒ (λᵀ y ∈ A ⇒ x ·ᵀ y)$$ + - `y` appears free, but `x` does not, in `λᵀ x ∈ (A ⇒ B) ⇒ x ·ᵀ y` + - both `x` and `y` appear free in `(λᵀ x ∈ (A ⇒ B) ⇒ x ·ᵀ y) ·ᵀ x` + - no variables appear free in `λᵀ x ∈ (A ⇒ B) ⇒ (λᵀ y ∈ A ⇒ x ·ᵀ y)` Formally: @@ -211,42 +211,42 @@ are really the crux of the lambda-calculus.) ### Substitution To prove that substitution preserves typing, we first need a technical lemma connecting free variables and typing contexts: If -a variable $$x$$ appears free in a term $$M$$, and if we know $$M$$ is -well typed in context $$Γ$$, then it must be the case that -$$Γ$$ assigns a type to $$x$$. +a variable `x` appears free in a term `M`, and if we know `M` is +well typed in context `Γ`, then it must be the case that +`Γ` assigns a type to `x`. \begin{code} freeLemma : ∀ {x M A Γ} → x FreeIn M → Γ ⊢ M ∈ A → ∃ λ B → Γ x ≡ just B \end{code} -_Proof_: We show, by induction on the proof that $$x$$ appears - free in $$P$$, that, for all contexts $$Γ$$, if $$P$$ is well - typed under $$Γ$$, then $$Γ$$ assigns some type to $$x$$. +_Proof_: We show, by induction on the proof that `x` appears + free in `P`, that, for all contexts `Γ`, if `P` is well + typed under `Γ`, then `Γ` assigns some type to `x`. - - If the last rule used was `free-varᵀ`, then $$P = x$$, and from - the assumption that $$M$$ is well typed under $$Γ$$ we have - immediately that $$Γ$$ assigns a type to $$x$$. + - If the last rule used was `free-varᵀ`, then `P = x`, and from + the assumption that `M` is well typed under `Γ` we have + immediately that `Γ` assigns a type to `x`. - - If the last rule used was `free-·₁`, then $$P = L ·ᵀ M$$ and $$x$$ - appears free in $$L$$. Since $$L$$ is well typed under $$\Gamma$$, - we can see from the typing rules that $$L$$ must also be, and - the IH then tells us that $$Γ$$ assigns $$x$$ a type. + - If the last rule used was `free-·₁`, then `P = L ·ᵀ M` and `x` + appears free in `L`. Since `L` is well typed under `\Gamma`, + we can see from the typing rules that `L` must also be, and + the IH then tells us that `Γ` assigns `x` a type. - - Almost all the other cases are similar: $$x$$ appears free in a - subterm of $$P$$, and since $$P$$ is well typed under $$Γ$$, we - know the subterm of $$M$$ in which $$x$$ appears is well typed - under $$Γ$$ as well, and the IH gives us exactly the + - Almost all the other cases are similar: `x` appears free in a + subterm of `P`, and since `P` is well typed under `Γ`, we + know the subterm of `M` in which `x` appears is well typed + under `Γ` as well, and the IH gives us exactly the conclusion we want. - - The only remaining case is `free-λᵀ`. In this case $$P = - λᵀ y ∈ A ⇒ N$$, and $$x$$ appears free in $$N$$; we also know that - $$x$$ is different from $$y$$. The difference from the previous - cases is that whereas $$P$$ is well typed under $$\Gamma$$, its - body $$N$$ is well typed under $$(Γ , y ↦ A)$$, so the IH - allows us to conclude that $$x$$ is assigned some type by the - extended context $$(Γ , y ↦ A)$$. To conclude that $$Γ$$ - assigns a type to $$x$$, we appeal the decidable equality for names - `_≟_`, noting that $$x$$ and $$y$$ are different variables. + - The only remaining case is `free-λᵀ`. In this case `P = + λᵀ y ∈ A ⇒ N`, and `x` appears free in `N`; we also know that + `x` is different from `y`. 
The difference from the previous + cases is that whereas `P` is well typed under `\Gamma`, its + body `N` is well typed under `(Γ , y ↦ A)`, so the IH + allows us to conclude that `x` is assigned some type by the + extended context `(Γ , y ↦ A)`. To conclude that `Γ` + assigns a type to `x`, we appeal the decidable equality for names + `_≟_`, noting that `x` and `y` are different variables. \begin{code} freeLemma free-varᵀ (Ax Γx≡justA) = (_ , Γx≡justA) @@ -261,11 +261,11 @@ freeLemma (free-λᵀ {x} {y} y≢x x∈N) (⇒-I ⊢N) with freeLemma x∈N ⊢ ... | no _ = Γx=justC \end{code} -[A subtle point: if the first argument of $$free-λᵀ$$ was of type -$$x ≢ y$$ rather than of type $$y ≢ x$$, then the type of the -term $$Γx=justC$$ would not simplify properly.] +[A subtle point: if the first argument of `free-λᵀ` was of type +`x ≢ y` rather than of type `y ≢ x`, then the type of the +term `Γx=justC` would not simplify properly.] -Next, we'll need the fact that any term $$M$$ which is well typed in +Next, we'll need the fact that any term `M` which is well typed in the empty context is closed (it has no free variables). #### Exercise: 2 stars, optional (∅⊢-closed) @@ -286,11 +286,11 @@ contradiction () \end{code} -Sometimes, when we have a proof $$Γ ⊢ M ∈ A$$, we will need to -replace $$Γ$$ by a different context $$Γ′$$. When is it safe +Sometimes, when we have a proof `Γ ⊢ M ∈ A`, we will need to +replace `Γ` by a different context `Γ′`. When is it safe to do this? Intuitively, it must at least be the case that -$$Γ′$$ assigns the same types as $$Γ$$ to all the variables -that appear free in $$M$$. In fact, this is the only condition that +`Γ′` assigns the same types as `Γ` to all the variables +that appear free in `M`. In fact, this is the only condition that is needed. \begin{code} @@ -301,45 +301,45 @@ weaken : ∀ {Γ Γ′ M A} \end{code} _Proof_: By induction on the derivation of -$$Γ ⊢ M ∈ A$$. +`Γ ⊢ M ∈ A`. - - If the last rule in the derivation was `var`, then $$t = x$$ - and $$\Gamma x = T$$. By assumption, $$\Gamma' x = T$$ as well, and - hence $$\Gamma' \vdash t : T$$ by `var`. + - If the last rule in the derivation was `var`, then `t = x` + and `\Gamma x = T`. By assumption, `\Gamma' x = T` as well, and + hence `\Gamma' \vdash t : T` by `var`. - - If the last rule was `abs`, then $$t = \lambda y:A. t'$$, with - $$T = A\to B$$ and $$\Gamma, y : A \vdash t' : B$$. The - induction hypothesis is that, for any context $$\Gamma''$$, if - $$\Gamma, y:A$$ and $$\Gamma''$$ assign the same types to all the - free variables in $$t'$$, then $$t'$$ has type $$B$$ under - $$\Gamma''$$. Let $$\Gamma'$$ be a context which agrees with - $$\Gamma$$ on the free variables in $$t$$; we must show - $$\Gamma' \vdash \lambda y:A. t' : A\to B$$. + - If the last rule was `abs`, then `t = \lambda y:A. t'`, with + `T = A\to B` and `\Gamma, y : A \vdash t' : B`. The + induction hypothesis is that, for any context `\Gamma''`, if + `\Gamma, y:A` and `\Gamma''` assign the same types to all the + free variables in `t'`, then `t'` has type `B` under + `\Gamma''`. Let `\Gamma'` be a context which agrees with + `\Gamma` on the free variables in `t`; we must show + `\Gamma' \vdash \lambda y:A. t' : A\to B`. - By $$abs$$, it suffices to show that $$\Gamma', y:A \vdash t' : t'$$. - By the IH (setting $$\Gamma'' = \Gamma', y:A$$), it suffices to show - that $$\Gamma, y:A$$ and $$\Gamma', y:A$$ agree on all the variables - that appear free in $$t'$$. + By `abs`, it suffices to show that `\Gamma', y:A \vdash t' : t'`. 
+ By the IH (setting `\Gamma'' = \Gamma', y:A`), it suffices to show + that `\Gamma, y:A` and `\Gamma', y:A` agree on all the variables + that appear free in `t'`. - Any variable occurring free in $$t'$$ must be either $$y$$ or - some other variable. $$\Gamma, y:A$$ and $$\Gamma', y:A$$ - clearly agree on $$y$$. Otherwise, note that any variable other - than $$y$$ that occurs free in $$t'$$ also occurs free in - $$t = \lambda y:A. t'$$, and by assumption $$\Gamma$$ and - $$\Gamma'$$ agree on all such variables; hence so do $$\Gamma, y:A$$ and - $$\Gamma', y:A$$. + Any variable occurring free in `t'` must be either `y` or + some other variable. `\Gamma, y:A` and `\Gamma', y:A` + clearly agree on `y`. Otherwise, note that any variable other + than `y` that occurs free in `t'` also occurs free in + `t = \lambda y:A. t'`, and by assumption `\Gamma` and + `\Gamma'` agree on all such variables; hence so do `\Gamma, y:A` and + `\Gamma', y:A`. - - If the last rule was `app`, then $$t = t_1\;t_2$$, with - $$\Gamma \vdash t_1:A\to T$$ and $$\Gamma \vdash t_2:A$$. - One induction hypothesis states that for all contexts $$\Gamma'$$, - if $$\Gamma'$$ agrees with $$\Gamma$$ on the free variables in $$t_1$$, - then $$t_1$$ has type $$A\to T$$ under $$\Gamma'$$; there is a similar IH - for $$t_2$$. We must show that $$t_1\;t_2$$ also has type $$T$$ under - $$\Gamma'$$, given the assumption that $$\Gamma'$$ agrees with - $$\Gamma$$ on all the free variables in $$t_1\;t_2$$. By `app`, it - suffices to show that $$t_1$$ and $$t_2$$ each have the same type - under $$\Gamma'$$ as under $$\Gamma$$. But all free variables in - $$t_1$$ are also free in $$t_1\;t_2$$, and similarly for $$t_2$$; + - If the last rule was `app`, then `t = t_1\;t_2`, with + `\Gamma \vdash t_1:A\to T` and `\Gamma \vdash t_2:A`. + One induction hypothesis states that for all contexts `\Gamma'`, + if `\Gamma'` agrees with `\Gamma` on the free variables in `t_1`, + then `t_1` has type `A\to T` under `\Gamma'`; there is a similar IH + for `t_2`. We must show that `t_1\;t_2` also has type `T` under + `\Gamma'`, given the assumption that `\Gamma'` agrees with + `\Gamma` on all the free variables in `t_1\;t_2`. By `app`, it + suffices to show that `t_1` and `t_2` each have the same type + under `\Gamma'` as under `\Gamma`. But all free variables in + `t_1` are also free in `t_1\;t_2`, and similarly for `t_2`; hence the desired result follows from the induction hypotheses. \begin{code} @@ -383,16 +383,16 @@ preserves types---namely, the observation that _substitution_ preserves types. Formally, the so-called _Substitution Lemma_ says this: Suppose we -have a term $$N$$ with a free variable $$x$$, and suppose we've been -able to assign a type $$B$$ to $$N$$ under the assumption that $$x$$ has -some type $$A$$. Also, suppose that we have some other term $$V$$ and -that we've shown that $$V$$ has type $$A$$. Then, since $$V$$ satisfies -the assumption we made about $$x$$ when typing $$N$$, we should be -able to substitute $$V$$ for each of the occurrences of $$x$$ in $$N$$ -and obtain a new term that still has type $$B$$. +have a term `N` with a free variable `x`, and suppose we've been +able to assign a type `B` to `N` under the assumption that `x` has +some type `A`. Also, suppose that we have some other term `V` and +that we've shown that `V` has type `A`. 
Then, since `V` satisfies +the assumption we made about `x` when typing `N`, we should be +able to substitute `V` for each of the occurrences of `x` in `N` +and obtain a new term that still has type `B`. -_Lemma_: If $$Γ , x ↦ A ⊢ N ∈ B$$ and $$∅ ⊢ V ∈ A$$, then -$$Γ ⊢ (N [ x := V ]) ∈ B$$. +_Lemma_: If `Γ , x ↦ A ⊢ N ∈ B` and `∅ ⊢ V ∈ A`, then +`Γ ⊢ (N [ x := V ]) ∈ B`. \begin{code} preservation-[:=] : ∀ {Γ x A N B V} @@ -402,63 +402,63 @@ preservation-[:=] : ∀ {Γ x A N B V} \end{code} One technical subtlety in the statement of the lemma is that -we assign $$V$$ the type $$A$$ in the _empty_ context---in other -words, we assume $$V$$ is closed. This assumption considerably -simplifies the $$λᵀ$$ case of the proof (compared to assuming -$$Γ ⊢ V ∈ A$$, which would be the other reasonable assumption +we assign `V` the type `A` in the _empty_ context---in other +words, we assume `V` is closed. This assumption considerably +simplifies the `λᵀ` case of the proof (compared to assuming +`Γ ⊢ V ∈ A`, which would be the other reasonable assumption at this point) because the context invariance lemma then tells us -that $$V$$ has type $$A$$ in any context at all---we don't have to -worry about free variables in $$V$$ clashing with the variable being -introduced into the context by $$λᵀ$$. +that `V` has type `A` in any context at all---we don't have to +worry about free variables in `V` clashing with the variable being +introduced into the context by `λᵀ`. The substitution lemma can be viewed as a kind of "commutation" property. Intuitively, it says that substitution and typing can be done in either order: we can either assign types to the terms -$$N$$ and $$V$$ separately (under suitable contexts) and then combine +`N` and `V` separately (under suitable contexts) and then combine them using substitution, or we can substitute first and then -assign a type to $$N [ x := V ]$$---the result is the same either +assign a type to `N [ x := V ]`---the result is the same either way. -_Proof_: We show, by induction on $$N$$, that for all $$A$$ and -$$Γ$$, if $$Γ , x ↦ A \vdash N ∈ B$$ and $$∅ ⊢ V ∈ A$$, then -$$Γ \vdash N [ x := V ] ∈ B$$. +_Proof_: We show, by induction on `N`, that for all `A` and +`Γ`, if `Γ , x ↦ A \vdash N ∈ B` and `∅ ⊢ V ∈ A`, then +`Γ \vdash N [ x := V ] ∈ B`. - - If $$N$$ is a variable there are two cases to consider, - depending on whether $$N$$ is $$x$$ or some other variable. + - If `N` is a variable there are two cases to consider, + depending on whether `N` is `x` or some other variable. - - If $$N = varᵀ x$$, then from the fact that $$Γ , x ↦ A ⊢ N ∈ B$$ - we conclude that $$A = B$$. We must show that $$x [ x := V] = - V$$ has type $$A$$ under $$Γ$$, given the assumption that - $$V$$ has type $$A$$ under the empty context. This + - If `N = varᵀ x`, then from the fact that `Γ , x ↦ A ⊢ N ∈ B` + we conclude that `A = B`. We must show that `x [ x := V] = + V` has type `A` under `Γ`, given the assumption that + `V` has type `A` under the empty context. This follows from context invariance: if a closed term has type - $$A$$ in the empty context, it has that type in any context. + `A` in the empty context, it has that type in any context. - - If $$N$$ is some variable $$x′$$ different from $$x$$, then - we need only note that $$x′$$ has the same type under $$Γ , x ↦ A$$ - as under $$Γ$$. + - If `N` is some variable `x′` different from `x`, then + we need only note that `x′` has the same type under `Γ , x ↦ A` + as under `Γ`. 
- - If $$N$$ is an abstraction $$λᵀ x′ ∈ A′ ⇒ N′$$, then the IH tells us, - for all $$Γ′$$́ and $$B′$$, that if $$Γ′ , x ↦ A ⊢ N′ ∈ B′$$ - and $$∅ ⊢ V ∈ A$$, then $$Γ′ ⊢ N′ [ x := V ] ∈ B′$$. + - If `N` is an abstraction `λᵀ x′ ∈ A′ ⇒ N′`, then the IH tells us, + for all `Γ′`́ and `B′`, that if `Γ′ , x ↦ A ⊢ N′ ∈ B′` + and `∅ ⊢ V ∈ A`, then `Γ′ ⊢ N′ [ x := V ] ∈ B′`. The substitution in the conclusion behaves differently - depending on whether $$x$$ and $$x′$$ are the same variable. + depending on whether `x` and `x′` are the same variable. - First, suppose $$x ≡ x′$$. Then, by the definition of - substitution, $$N [ x := V] = N$$, so we just need to show $$Γ ⊢ N ∈ B$$. - But we know $$Γ , x ↦ A ⊢ N ∈ B$$ and, since $$x ≡ x′$$ - does not appear free in $$λᵀ x′ ∈ A′ ⇒ N′$$, the context invariance - lemma yields $$Γ ⊢ N ∈ B$$. + First, suppose `x ≡ x′`. Then, by the definition of + substitution, `N [ x := V] = N`, so we just need to show `Γ ⊢ N ∈ B`. + But we know `Γ , x ↦ A ⊢ N ∈ B` and, since `x ≡ x′` + does not appear free in `λᵀ x′ ∈ A′ ⇒ N′`, the context invariance + lemma yields `Γ ⊢ N ∈ B`. - Second, suppose $$x ≢ x′$$. We know $$Γ , x ↦ A , x′ ↦ A′ ⊢ N′ ∈ B′$$ + Second, suppose `x ≢ x′`. We know `Γ , x ↦ A , x′ ↦ A′ ⊢ N′ ∈ B′` by inversion of the typing relation, from which - $$Γ , x′ ↦ A′ , x ↦ A ⊢ N′ ∈ B′$$ follows by update permute, - so the IH applies, giving us $$Γ , x′ ↦ A′ ⊢ N′ [ x := V ] ∈ B′$$ - By $$⇒-I$$, we have $$Γ ⊢ λᵀ x′ ∈ A′ ⇒ (N′ [ x := V ]) ∈ A′ ⇒ B′$$ - and the definition of substitution (noting $$x ≢ x′$$) gives - $$Γ ⊢ (λᵀ x′ ∈ A′ ⇒ N′) [ x := V ] ∈ A′ ⇒ B′$$ as required. + `Γ , x′ ↦ A′ , x ↦ A ⊢ N′ ∈ B′` follows by update permute, + so the IH applies, giving us `Γ , x′ ↦ A′ ⊢ N′ [ x := V ] ∈ B′` + By `⇒-I`, we have `Γ ⊢ λᵀ x′ ∈ A′ ⇒ (N′ [ x := V ]) ∈ A′ ⇒ B′` + and the definition of substitution (noting `x ≢ x′`) gives + `Γ ⊢ (λᵀ x′ ∈ A′ ⇒ N′) [ x := V ] ∈ A′ ⇒ B′` as required. - - If $$N$$ is an application $$L′ ·ᵀ M′$$, the result follows + - If `N` is an application `L′ ·ᵀ M′`, the result follows straightforwardly from the definition of substitution and the induction hypotheses. 
@@ -483,12 +483,7 @@ just-injective refl = refl preservation-[:=] {_} {x} (Ax {_} {x′} [Γ,x↦A]x′≡B) ⊢V with x ≟ x′ ...| yes x≡x′ rewrite just-injective [Γ,x↦A]x′≡B = weaken-closed ⊢V ...| no x≢x′ = Ax [Γ,x↦A]x′≡B -{- -preservation-[:=] {Γ} {x} {A} {varᵀ x′} {B} {V} (Ax {.(Γ , x ↦ A)} {.x′} {.B} Γx′≡B) ⊢V with x ≟ x′ -...| yes x≡x′ rewrite just-injective Γx′≡B = weaken-closed ⊢V -...| no x≢x′ = Ax {Γ} {x′} {B} Γx′≡B --} -preservation-[:=] {Γ} {x} {A} {λᵀ x′ ∈ A′ ⇒ N′} {.A′ ⇒ B′} {V} (⇒-I {.(Γ , x ↦ A)} {.x′} {.N′} {.A′} {.B′} ⊢N′) ⊢V with x ≟ x′ +preservation-[:=] {Γ} {x} {A} {λᵀ x′ ∈ A′ ⇒ N′} {.A′ ⇒ B′} {V} (⇒-I ⊢N′) ⊢V with x ≟ x′ ...| yes x≡x′ rewrite x≡x′ = weaken Γ′~Γ (⇒-I ⊢N′) where Γ′~Γ : ∀ {y} → y FreeIn (λᵀ x′ ∈ A′ ⇒ N′) → (Γ , x′ ↦ A) y ≡ Γ y @@ -497,79 +492,58 @@ preservation-[:=] {Γ} {x} {A} {λᵀ x′ ∈ A′ ⇒ N′} {.A′ ⇒ B′} { ...| no _ = refl ...| no x≢x′ = ⇒-I ⊢N′V where - x′x⊢N′ : (Γ , x′ ↦ A′ , x ↦ A) ⊢ N′ ∈ B′ - x′x⊢N′ rewrite update-permute Γ x A x′ A′ x≢x′ = {!⊢N′!} + x′x⊢N′ : Γ , x′ ↦ A′ , x ↦ A ⊢ N′ ∈ B′ + x′x⊢N′ rewrite update-permute Γ x A x′ A′ x≢x′ = ⊢N′ ⊢N′V : (Γ , x′ ↦ A′) ⊢ N′ [ x := V ] ∈ B′ ⊢N′V = preservation-[:=] x′x⊢N′ ⊢V -{- -...| yes x′≡x rewrite x′≡x | update-shadow Γ x A A′ = {!!} - -- ⇒-I ⊢N′ -...| no x′≢x rewrite update-permute Γ x′ A′ x A x′≢x = {!!} - -- ⇒-I {Γ} {x′} {N′} {A′} {B′} (preservation-[:=] {(Γ , x′ ↦ A′)} {x} {A} ⊢N′ ⊢V) --} preservation-[:=] (⇒-E ⊢L ⊢M) ⊢V = ⇒-E (preservation-[:=] ⊢L ⊢V) (preservation-[:=] ⊢M ⊢V) preservation-[:=] 𝔹-I₁ ⊢V = 𝔹-I₁ preservation-[:=] 𝔹-I₂ ⊢V = 𝔹-I₂ preservation-[:=] (𝔹-E ⊢L ⊢M ⊢N) ⊢V = 𝔹-E (preservation-[:=] ⊢L ⊢V) (preservation-[:=] ⊢M ⊢V) (preservation-[:=] ⊢N ⊢V) - -{- -[:=]-preserves-⊢ {Γ} {x} v∶A (var y y∈Γ) with x ≟ y -... | yes x=y = {!!} -... | no x≠y = {!!} -[:=]-preserves-⊢ v∶A (abs t′∶B) = {!!} -[:=]-preserves-⊢ v∶A (app t₁∶A⇒B t₂∶A) = - app ([:=]-preserves-⊢ v∶A t₁∶A⇒B) ([:=]-preserves-⊢ v∶A t₂∶A) -[:=]-preserves-⊢ v∶A true = true -[:=]-preserves-⊢ v∶A false = false -[:=]-preserves-⊢ v∶A (if t₁∶bool then t₂∶B else t₃∶B) = - if [:=]-preserves-⊢ v∶A t₁∶bool - then [:=]-preserves-⊢ v∶A t₂∶B - else [:=]-preserves-⊢ v∶A t₃∶B --} \end{code} ### Main Theorem We now have the tools we need to prove preservation: if a closed -term $$M$$ has type $$A$$ and takes a step to $$N$$, then $$N$$ -is also a closed term with type $$A$$. In other words, small-step +term `M` has type `A` and takes a step to `N`, then `N` +is also a closed term with type `A`. In other words, small-step reduction preserves types. \begin{code} preservation : ∀ {M N A} → ∅ ⊢ M ∈ A → M ⟹ N → ∅ ⊢ N ∈ A \end{code} -_Proof_: By induction on the derivation of $$\vdash t : T$$. +_Proof_: By induction on the derivation of `\vdash t : T`. -- We can immediately rule out $$var$$, $$abs$$, $$T_True$$, and - $$T_False$$ as the final rules in the derivation, since in each of - these cases $$t$$ cannot take a step. +- We can immediately rule out `var`, `abs`, `T_True`, and + `T_False` as the final rules in the derivation, since in each of + these cases `t` cannot take a step. -- If the last rule in the derivation was $$app$$, then $$t = t_1 - t_2$$. There are three cases to consider, one for each rule that - could have been used to show that $$t_1 t_2$$ takes a step to $$t'$$. +- If the last rule in the derivation was `app`, then `t = t_1 + t_2`. There are three cases to consider, one for each rule that + could have been used to show that `t_1 t_2` takes a step to `t'`. 
- - If $$t_1 t_2$$ takes a step by $$Sapp1$$, with $$t_1$$ stepping to - $$t_1'$$, then by the IH $$t_1'$$ has the same type as $$t_1$$, and - hence $$t_1' t_2$$ has the same type as $$t_1 t_2$$. + - If `t_1 t_2` takes a step by `Sapp1`, with `t_1` stepping to + `t_1'`, then by the IH `t_1'` has the same type as `t_1`, and + hence `t_1' t_2` has the same type as `t_1 t_2`. - - The $$Sapp2$$ case is similar. + - The `Sapp2` case is similar. - - If $$t_1 t_2$$ takes a step by $$Sred$$, then $$t_1 = - \lambda x:t_{11}.t_{12}$$ and $$t_1 t_2$$ steps to $$$$x:=t_2$$t_{12}$$; the + - If `t_1 t_2` takes a step by `Sred`, then `t_1 = + \lambda x:t_{11}.t_{12}` and `t_1 t_2` steps to ``x:=t_2`t_{12}`; the desired result now follows from the fact that substitution preserves types. - - If the last rule in the derivation was $$if$$, then $$t = if t_1 - then t_2 else t_3$$, and there are again three cases depending on - how $$t$$ steps. + - If the last rule in the derivation was `if`, then `t = if t_1 + then t_2 else t_3`, and there are again three cases depending on + how `t` steps. - - If $$t$$ steps to $$t_2$$ or $$t_3$$, the result is immediate, since - $$t_2$$ and $$t_3$$ have the same type as $$t$$. + - If `t` steps to `t_2` or `t_3`, the result is immediate, since + `t_2` and `t_3` have the same type as `t`. - - Otherwise, $$t$$ steps by $$Sif$$, and the desired conclusion + - Otherwise, `t` steps by `Sif`, and the desired conclusion follows directly from the induction hypothesis. \begin{code} @@ -597,11 +571,11 @@ Proof with eauto. intros t t' T HT. generalize dependent t'. induction HT; intros t' HE; subst Gamma; subst; - try solve $$inversion HE; subst; auto$$. + try solve `inversion HE; subst; auto`. - (* app inversion HE; subst... (* Most of the cases are immediate by induction, - and $$eauto$$ takes care of them + and `eauto` takes care of them + (* Sred apply substitution_preserves_typing with t_{11}... inversion HT_1... @@ -611,7 +585,7 @@ Qed. An exercise in the [Stlc]({{ "Stlc" | relative_url }}) chapter asked about the subject expansion property for the simple language of arithmetic and boolean expressions. Does this property hold for STLC? That is, is it always the case -that, if $$t ==> t'$$ and $$has_type t' T$$, then $$empty \vdash t : T$$? If +that, if `t ==> t'` and `has_type t' T`, then `empty \vdash t : T`? If so, prove it. If not, give a counter-example not involving conditionals. @@ -630,7 +604,7 @@ Corollary soundness : forall t t' T, ~(stuck t'). Proof. intros t t' T Hhas_type Hmulti. unfold stuck. - intros $$Hnf Hnot_val$$. unfold normal_form in Hnf. + intros `Hnf Hnot_val`. unfold normal_form in Hnf. induction Hmulti. @@ -647,10 +621,10 @@ Formalize this statement and prove it. #### Exercise: 1 star (progress_preservation_statement) Without peeking at their statements above, write down the progress and preservation theorems for the simply typed lambda-calculus. -$$$$ +`` #### Exercise: 2 stars (stlc_variation1) -Suppose we add a new term $$zap$$ with the following reduction rule +Suppose we add a new term `zap` with the following reduction rule --------- (ST_Zap) t ==> zap @@ -665,7 +639,7 @@ the presence of these rules? For each property, write either "remains true" or "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -673,7 +647,7 @@ false, give a counterexample. 
#### Exercise: 2 stars (stlc_variation2) -Suppose instead that we add a new term $$foo$$ with the following +Suppose instead that we add a new term `foo` with the following reduction rules: ----------------- (ST_Foo1) @@ -687,20 +661,20 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress - Preservation #### Exercise: 2 stars (stlc_variation3) -Suppose instead that we remove the rule $$Sapp1$$ from the $$step$$ +Suppose instead that we remove the rule `Sapp1` from the `step` relation. Which of the following properties of the STLC remain true in the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -718,7 +692,7 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -740,7 +714,7 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -762,7 +736,7 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -782,7 +756,7 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -824,10 +798,10 @@ with arithmetic. Specifically: the definition of values through the Type Soundness theorem), and paste it into the file at this point. - - Extend the definitions of the $$subst$$ operation and the $$step$$ + - Extend the definitions of the `subst` operation and the `step` relation to include appropriate clauses for the arithmetic operators. - - Extend the proofs of all the properties (up to $$soundness$$) of + - Extend the proofs of all the properties (up to `soundness`) of the original STLC to deal with the new syntactic forms. Make sure Agda accepts the whole file.- update-same′ : ∀ {A} (ρ : TotalMap A) (x : Id) → (ρ , x ↦ ρ x) ≡ ρ update-same′ ρ x = extensionality lemma where lemma : ∀ y → (ρ , x ↦ ρ x) y ≡ ρ y lemma y with x ≟ y ... | yes refl = refl ... 
| no x≢y = refl @@ -3821,222 +3208,222 @@ result is equal to $$ρ$$: #### Exercise: 3 stars, recommended (update-permute) Prove one final property of the `update` function: If we update a map -$$m$$ at two distinct keys, it doesn't matter in which order we do the +`m` at two distinct keys, it doesn't matter in which order we do the updates.- postulate update-permute : ∀ {A} (ρ : TotalMap A) (x : Id) (v : A) (y : Id) (w : A) → x ≢ y → (ρ , x ↦ v , y ↦ w) ≡ (ρ , y ↦ w , x ↦ v) @@ -4045,674 +3432,674 @@ updates.#### Exercise: 2 stars, optional (update-eq) -Next, if we update a map $$ρ$$ at a key $$x$$ with a new value $$v$$ -and then look up $$x$$ in the map resulting from the update, we get -back $$v$$: +Next, if we update a map `ρ` at a key `x` with a new value `v` +and then look up `x` in the map resulting from the update, we get +back `v`: \begin{code} postulate @@ -227,9 +208,9 @@ back $$v$$:- update-permute′ : ∀ {A} (ρ : TotalMap A) (x : Id) (v : A) (y : Id) (w : A) → x ≢ y → (ρ , x ↦ v , y ↦ w) ≡ (ρ , y ↦ w , x ↦ v) update-permute′ {A} ρ x v y w x≢y = extensionality lemma where lemma : ∀ z → (ρ , x ↦ v , y ↦ w) z ≡ (ρ , y ↦ w , x ↦ v) z lemma z with x ≟ z | y ≟ z ... | yes refl | yes refl = ⊥-elim (x≢y refl) ... | no x≢z | yes refl = sym (update-eq′ ρ z w) ... | yes refl | no y≢z = update-eq′ ρ z v ... | no x≢z | no y≢z = trans (update-neq ρ x v z x≢z) (sym (update-neq ρ y w z y≢z)) @@ -4722,595 +4109,595 @@ And a slightly different version of the same proof.- update-permute′′ : ∀ {A} (ρ : TotalMap A) (x : Id) (v : A) (y : Id) (w : A) (z : Id) → x ≢ y → (ρ , x ↦ v , y ↦ w) z ≡ (ρ , y ↦ w , x ↦ v) z update-permute′′ {A} ρ x v y w z x≢y with x ≟ z | y ≟ z ... | yes x≡z | yes y≡z = ⊥-elim (x≢y (trans x≡z (sym y≡z))) ... | no x≢z | yes y≡z rewrite y≡z = sym (update-eq′ ρ z w) ... | yes x≡z | no y≢z rewrite x≡z = update-eq′ ρ z v ... | no x≢z | no y≢z = trans (update-neq ρ x v z x≢z) (sym (update-neq ρ y w z y≢z)) @@ -5325,52 +4712,52 @@ of type `Maybe A` and default element `nothing`.-PartialMap : Set → Set PartialMap A = TotalMap (Maybe A) @@ -5378,15 +4765,15 @@ of type `Maybe A` and default element `nothing`.-module PartialMap where @@ -5394,52 +4781,52 @@ of type `Maybe A` and default element `nothing`.- ∅ : ∀ {A} → PartialMap A ∅ = TotalMap.always nothing @@ -5447,857 +4834,253 @@ of type `Maybe A` and default element `nothing`.- infixl 100 _,_↦_ + + _,_↦_ : ∀ {A} (ρ : PartialMap A) (x : Id) (v : A) → PartialMap A ρ , x ↦ v = TotalMap._,_↦_ ρ x (just v) --As before, we define handy abbreviations for updating a map two, three, or four times. 
- -- - _,_↦_,_↦_ : ∀ {A} → PartialMap A → Id → A → Id → A → PartialMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ = (ρ , x₁ ↦ v₁), x₂ ↦ v₂ - - _,_↦_,_↦_,_↦_ : ∀ {A} → PartialMap A → Id → A → Id → A → Id → A → PartialMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ , x₃ ↦ v₃ = ((ρ , x₁ ↦ v₁), x₂ ↦ v₂), x₃ ↦ v₃ - - _,_↦_,_↦_,_↦_,_↦_ : ∀ {A} → PartialMap A → Id → A → Id → A → Id → A → Id → A → PartialMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ , x₃ ↦ v₃ , x₄ ↦ v₄ = (((ρ , x₁ ↦ v₁), x₂ ↦ v₂), x₃ ↦ v₃), x₄ ↦ v₄ -We now lift all of the basic lemmas about total maps to partial maps.- apply-∅ : ∀ {A} → (x : Id) → (∅ {A} x) ≡ nothing apply-∅ x = TotalMap.apply-always nothing x @@ -6305,165 +5088,165 @@ We now lift all of the basic lemmas about total maps to partial maps.- update-eq : ∀ {A} (ρ : PartialMap A) (x : Id) (v : A) → (ρ , x ↦ v) x ≡ just v update-eq ρ x v = TotalMap.update-eq ρ x (just v) @@ -6471,213 +5254,213 @@ We now lift all of the basic lemmas about total maps to partial maps.- update-neq : ∀ {A} (ρ : PartialMap A) (x : Id) (v : A) (y : Id) → x ≢ y → (ρ , x ↦ v) y ≡ ρ y update-neq ρ x v y x≢y = TotalMap.update-neq ρ x (just v) y x≢y @@ -6685,213 +5468,213 @@ We now lift all of the basic lemmas about total maps to partial maps.- update-shadow : ∀ {A} (ρ : PartialMap A) (x : Id) (v w : A) → (ρ , x ↦ v , x ↦ w) ≡ (ρ , x ↦ w) update-shadow ρ x v w = TotalMap.update-shadow ρ x (just v) (just w) @@ -6899,186 +5682,186 @@ We now lift all of the basic lemmas about total maps to partial maps.- update-same : ∀ {A} (ρ : PartialMap A) (x : Id) (v : A) → ρ x ≡ just v → (ρ , x ↦ v) ≡ ρ update-same ρ x v ρx≡v rewrite sym ρx≡v = TotalMap.update-same ρ x @@ -7086,289 +5869,289 @@ We now lift all of the basic lemmas about total maps to partial maps.- update-permute : ∀ {A} (ρ : PartialMap A) (x : Id) (v : A) (y : Id) (w : A) → x ≢ y → (ρ , x ↦ v , y ↦ w) ≡ (ρ , y ↦ w , x ↦ v) update-permute ρ x v y w x≢y = TotalMap.update-permute ρ x (just v) y (just w) x≢y diff --git a/out/Stlc.md b/out/Stlc.md index 0a083313..d0206ece 100644 --- a/out/Stlc.md +++ b/out/Stlc.md @@ -45,7 +45,7 @@ This chapter defines the simply-typed lambda calculus. >; PartialMap;module PartialMap)open PartialMap (∅; _,_↦_)Relation.Binary.PropositionalEquality as P using (_≡_; _≢_; _≢_; refl) -- open import Relation.Binary.Core using (Rel) -- open import Data.Product using (∃; ∄; _,_) -- open import Function using (_∘_; _$_) @@ -390,264 +398,264 @@ Syntax of types and terms. All source terms are labeled with $ᵀ$.-infixr 100 _⇒_ infixl 100 _·ᵀ_ _·ᵀ_ data Type :Type : Set where 𝔹 : Type Type _⇒_ : Type → Type → → Type → Type data Term Term : Set where varᵀ : Id →: TermId → Term λᵀ_∈_⇒_ : Id →: TypeId → →Type Term→ → Term → Term _·ᵀ_ : Term → Term → Term → Term Term trueᵀ : Term Term falseᵀ : Term Term ifᵀ_then_else_ : Term → Term → → Term → →Term → Term @@ -656,368 +664,368 @@ Syntax of types and terms. All source terms are labeled with $ᵀ$. 
Some examples.-f x y : Id -f : = Id +f = id "f" -x = id "x" yx = id "x" +y = id "y" I[𝔹] I[𝔹] I[𝔹⇒𝔹] K[𝔹][𝔹] not[𝔹] : Term Term I[𝔹] = (λᵀ x ∈ 𝔹 ⇒ (varᵀλᵀ x ∈ 𝔹 ⇒ (varᵀ x)) I[𝔹⇒𝔹] = (λᵀ f ∈ (𝔹λᵀ f ⇒∈ (𝔹) ⇒ (λᵀ x ∈ 𝔹 ⇒ ((varᵀ f) ·ᵀ⇒ (λᵀ x ∈ 𝔹 ⇒ ((varᵀ f(varᵀ) ·ᵀ (varᵀ x)))) K[𝔹][𝔹] = (λᵀ x ∈ 𝔹 ⇒ (λᵀ y ∈ 𝔹 ⇒ (varᵀ x))) ∈ 𝔹 ⇒ (λᵀ y ∈ 𝔹 ⇒ (varᵀ x))) not[𝔹] = (λᵀ x ∈ 𝔹 ⇒ (ifᵀλᵀ x ∈ 𝔹 ⇒ (ifᵀ (varᵀ x) then falseᵀthen elsefalseᵀ else trueᵀ)) @@ -1027,138 +1035,138 @@ Some examples.-data value : Term Term → Set where value-λᵀ : ∀ {x A∀ N}{x →A value (λᵀ x ∈ A ⇒ N)} → value (λᵀ x ∈ A - value-trueᵀ :⇒ value (trueᵀN) value-falseᵀvalue-trueᵀ : value (trueᵀ) + value-falseᵀ : value (falseᵀ) @@ -1168,645 +1176,645 @@ Some examples.-_[_:=_] : Term → IdTerm → TermId → → Term → Term (varᵀ x′) [ x := Vx ]:= withV ] with x ≟ x′ ≟ x′ ... | yes _ = V ... | no _ =| no _ varᵀ= varᵀ x′ (λᵀ x′ ∈ A′x′ ⇒∈ A′ ⇒ N′) [ x := Vx ]:= withV ] with x ≟ x′ ≟ x′ ... | yes _ = λᵀ x′= ∈λᵀ A′x′ ⇒∈ N′A′ ⇒ N′ ... | no _ =| no λᵀ _ x′= ∈λᵀ A′x′ ⇒∈ A′ ⇒ (N′ [ x := Vx ]):= V ]) (L′ ·ᵀ M′) [ x := Vx ] = (L′ [ x := V ] = (L′ [ Vx ]):= ·ᵀV ]) ·ᵀ (M′ [ x := Vx ]):= V ]) (trueᵀ) [ x := Vx := V ] = trueᵀ (falseᵀ) [ x := Vx := V ] = falseᵀ (ifᵀ L′ then M′then elseM′ else N′) [ x := Vx ] = ifᵀ (L′ [ x := V ] = ifᵀ (L′ [ Vx ]):= thenV ]) then (M′ [ x := Vx ]):= elseV ]) else (N′ [ x := Vx := V ]) @@ -1816,593 +1824,593 @@ Some examples.-data _⟹_ : Term → Term → Term → Set where β⇒ : ∀ {x A∀ N{x V}A →N value V} → value V → ((λᵀ x ∈ A ⇒ N) ·ᵀ⇒ VN) ⟹·ᵀ (NV) [⟹ x(N :=[ Vx ]):= V ]) γ⇒₁ : ∀ {L L'∀ {M}L →L' M} → L ⟹ L' →⟹ L' → (L ·ᵀ M)L ⟹·ᵀ (L' ·ᵀ M) ⟹ (L' ·ᵀ M) γ⇒₂ : ∀ {V M∀ M'} → - value {V M M'} → value V → + M ⟹ M' →⟹ M' → (V ·ᵀ M) ⟹ (VM) ·ᵀ⟹ (V ·ᵀ M') β𝔹₁ : ∀ {M N}∀ →{M N} → (ifᵀ trueᵀ then Mthen elseM N) ⟹else N) ⟹ M β𝔹₂ : ∀ {M N}∀ →{M N} → (ifᵀ falseᵀ then Mthen elseM N) ⟹else N) ⟹ N γ𝔹 : ∀ {L L'∀ {ML N}L' →M N} → L ⟹ L' →⟹ L' → (ifᵀ L then Mthen elseM N) ⟹else (ifᵀN) ⟹ L'(ifᵀ thenL' Mthen elseM else N) @@ -2412,367 +2420,367 @@ Some examples.-Rel : Set → Set₁ Set₁ Rel A = A → A → Set infixl 100 _>>_ _>>_ data _* {A : Set{A }: (R Set:} Rel(R A) : Rel Rel A) : Rel A where ⟨⟩ : ∀ {x :∀ A}{x →: (RA} *)→ x(R *) x x ⟨_⟩ : ∀ {x y∀ :{x A}y →: RA} x y → (R x y *)→ x(R *) x y _>>_ : ∀ {x y∀ {x y z : A} →: (RA} *)→ x(R *) x y → (R *)→ y(R *) y z → (R *)→ x(R *) x z @@ -2780,981 +2788,1404 @@ Some examples.-_⟹*_infix : Term →80 Term → Set -_⟹*_ + +_⟹*_ =: Term (_⟹_)→ Term → Set +_⟹*_ = (_⟹_) *++ +open import Relation.Binary using (Preorder) + +⟹*-Preorder : Preorder _ _ _ +⟹*-Preorder = record + { Carrier = Term + ; _≈_ = _≡_ + ; _∼_ = _⟹*_ + ; isPreorder = record + { isEquivalence = P.isEquivalence + ; reflexive = λ {refl → ⟨⟩} + ; trans = _>>_ + } + } + +open import Relation.Binary.PreorderReasoning ⟹*-Preorder + using (begin_; _∎) renaming (_≈⟨_⟩_ to _≡⟨_⟩_; _∼⟨_⟩_ to _⟹*⟨_⟩_) + ++ Example evaluation.-example₀example₀′ : (not[𝔹] ·ᵀ trueᵀ) ⟹* falseᵀ example₀example₀′ = ⟨ step₀ ⟩ >> ⟨ step₁ ⟩ wherebegin - M₀ M₁ M₂ : Term - M₀ = (not[𝔹] ·ᵀ trueᵀ) M₁⟹*⟨ =⟨ (ifᵀ trueᵀ then falseᵀ else trueᵀ) - M₂ = falseᵀ - step₀ : M₀ ⟹ M₁ - step₀ = β⇒ value-trueᵀ ⟩ ⟩ + ifᵀ trueᵀ then falseᵀ else trueᵀ step₁⟹*⟨ :⟨ M₁ ⟹ M₂ - step₁ = β𝔹₁ ⟩ ⟩ + falseᵀ + ∎ example₁example₀ : (I[𝔹⇒𝔹] ·ᵀ I[𝔹] ·ᵀ (not[𝔹] ·ᵀ falseᵀ)) ⟹* trueᵀ) ⟹* falseᵀ example₁example₀ = ⟨ step₀ ⟩ >> ⟨ step₁ ⟩ >> ⟨ step₂ ⟩ >> ⟨ step₃ ⟩ >> ⟨ step₄ ⟩ - where - M₀ M₁ M₂ M₃ M₄ M₅ : Term - M₀ = (I[𝔹⇒𝔹] ·ᵀ I[𝔹] ·ᵀ (not[𝔹] ·ᵀ falseᵀ)) - M₁ = ((λᵀ x ∈ 𝔹 ⇒ (I[𝔹] ·ᵀ varᵀ x)) ·ᵀ (not[𝔹] ·ᵀ falseᵀ)) - M₂ = ((λᵀ x ∈ 𝔹 ⇒ (I[𝔹] 
·ᵀ varᵀ x)) ·ᵀ (ifᵀ falseᵀ then falseᵀ else trueᵀ)) - M₃ = ((λᵀ x ∈ 𝔹 ⇒ (I[𝔹] ·ᵀ varᵀ x)) ·ᵀ trueᵀ) - M₄ = I[𝔹] ·ᵀ trueᵀ - M₅ = trueᵀ - step₀ : M₀ ⟹⟩ M₁>> - step₀ ⟨ step₁ =⟩ γ⇒₁ (β⇒ value-λᵀ) step₁where + M₀ : M₁ M₂ : Term + M₀ ⟹= M₂ - step₁(not[𝔹] ·ᵀ trueᵀ) + M₁ = γ⇒₂ value-λᵀ (β⇒ifᵀ value-falseᵀtrueᵀ then falseᵀ else trueᵀ) step₂ : M₂ ⟹ M₃ - step₂ = γ⇒₂ value-λᵀ β𝔹₂falseᵀ step₃step₀ : M₃M₀ ⟹ M₁ + step₀ = ⟹ M₄ - step₃ = β⇒ value-trueᵀ - step₄ : M₄ ⟹ M₅ step₄step₁ : M₁ ⟹ M₂ + step₁ = β𝔹₁ + +example₁ : (I[𝔹⇒𝔹] β⇒·ᵀ I[𝔹] ·ᵀ (not[𝔹] ·ᵀ falseᵀ)) ⟹* trueᵀ +example₁ = ⟨ step₀ ⟩ >> ⟨ step₁ ⟩ >> ⟨ step₂ ⟩ >> ⟨ step₃ ⟩ >> ⟨ step₄ ⟩ + where + M₀ M₁ M₂ M₃ M₄ M₅ : Term + M₀ = (I[𝔹⇒𝔹] ·ᵀ I[𝔹] ·ᵀ (not[𝔹] ·ᵀ falseᵀ)) + M₁ = ((λᵀ x ∈ 𝔹 ⇒ (I[𝔹] ·ᵀ varᵀ x)) ·ᵀ (not[𝔹] ·ᵀ falseᵀ)) + M₂ = ((λᵀ x ∈ 𝔹 ⇒ (I[𝔹] ·ᵀ varᵀ x)) ·ᵀ (ifᵀ falseᵀ then falseᵀ else trueᵀ)) + M₃ = ((λᵀ x ∈ 𝔹 ⇒ (I[𝔹] ·ᵀ varᵀ x)) ·ᵀ trueᵀ) + M₄ = I[𝔹] ·ᵀ trueᵀ + M₅ = trueᵀ + step₀ : M₀ ⟹ M₁ + step₀ = γ⇒₁ (β⇒ value-λᵀ) + step₁ : M₁ ⟹ M₂ + step₁ = γ⇒₂ value-λᵀ (β⇒ value-falseᵀ) + step₂ : M₂ ⟹ M₃ + step₂ = γ⇒₂ value-λᵀ β𝔹₂ + step₃ : M₃ ⟹ M₄ + step₃ = β⇒ value-trueᵀ + step₄ : M₄ ⟹ M₅ + step₄ = β⇒ value-trueᵀ @@ -3764,689 +4195,703 @@ Example evaluation.-Context : Set Context = PartialMap Type datainfix 50 _⊢_∈_ + +data _⊢_∈_ : Context → Term → Type → Set where Ax : ∀ {Γ x A} → Γ x ≡ just A → Γ ⊢ varᵀ x ∈ A ⇒-I : ∀ {Γ x N A B} → (Γ , x ↦ A) ⊢ N ∈ B → Γ ⊢ (λᵀ x ∈ A ⇒ N) ∈ (A ⇒ B) ⇒-E : ∀ {Γ L M A B} → Γ ⊢ L ∈ (A ⇒ B) → Γ ⊢ M ∈ A → Γ ⊢ L ·ᵀ M ∈ B 𝔹-I₁ : ∀ {Γ} → Γ ⊢ trueᵀ ∈ 𝔹 𝔹-I₂ : ∀ {Γ} → Γ ⊢ falseᵀ ∈ 𝔹 𝔹-E : ∀ {Γ L M N A} → Γ ⊢ L ∈ 𝔹 → Γ ⊢ M ∈ A → Γ ⊢ N ∈ A → Γ ⊢ (ifᵀ L then M else N) ∈ A diff --git a/out/StlcProp.md b/out/StlcProp.md index 6ab715d0..f7365f74 100644 --- a/out/StlcProp.md +++ b/out/StlcProp.md @@ -327,7 +327,7 @@ permalink : /StlcProp > Maps.PartialMap @@ -356,439 +356,439 @@ theorem. As we saw for the simple calculus in the [Stlc]({{ "Stlc" | relative_url }}) chapter, the first step in establishing basic properties of reduction and types is to identify the possible _canonical forms_ (i.e., well-typed closed values) -belonging to each type. For $$bool$$, these are the boolean values $$true$$ and -$$false$$. For arrow types, the canonical forms are lambda-abstractions. +belonging to each type. For `bool`, these are the boolean values `true` and +`false`. For arrow types, the canonical forms are lambda-abstractions.-data canonical_for_ : Term → Type → Set → Set where canonical-λᵀ : ∀ {x A N B} → canonical (λᵀ x (λᵀ∈ A x⇒ ∈ A ⇒ N) for (A ⇒ B) canonical-trueᵀ : canonical trueᵀ trueᵀ for 𝔹 canonical-falseᵀ : canonical falseᵀ for 𝔹 -- canonical_for_ : Term → Type → Set -- canonical L for 𝔹 = L ≡ trueᵀ ⊎ L ≡ falseᵀ -- canonical L for (A ⇒ B) = ∃₂ λ x N → L ≡ λᵀ x ∈ A ⇒ N canonicalFormsLemma : ∀ {L A} {L→ A}∅ ⊢ →L ∅∈ ⊢A L→ ∈ A →value value L → canonical L for L for A canonicalFormsLemma (Ax ⊢x) () canonicalFormsLemma (⇒-I ⊢N) value-λᵀ = canonical-λᵀ -- _ , _ , refl canonicalFormsLemma (⇒-E ⊢L ⊢M) () canonicalFormsLemma 𝔹-I₁ value-trueᵀ = canonical-trueᵀ -- inj₁ refl canonicalFormsLemma 𝔹-I₂ value-falseᵀ = canonical-falseᵀ -- inj₂ refl canonicalFormsLemma (𝔹-E ⊢L ⊢M ⊢L ⊢M ⊢N) () @@ -805,853 +805,853 @@ first, then the formal version.inj₁| value-λᵀinj₂ (L′ , L⟹L′) = inj₂ (L′ ·ᵀ M , γ⇒₁ L⟹L′) progress (⇒-E {Γ} {L} {M} {A} {B} ⊢L... ⊢M)| inj₁ valueL with progress ⊢L⊢M ... | inj₂ (L′ , L⟹L′) =(M′ inj₂ (L′ ·ᵀ M , γ⇒₁ L⟹L′M⟹M′) = inj₂ (L ·ᵀ M′ , -... γ⇒₂ | inj₁valueL valueLM⟹M′) +... with| progressinj₁ ⊢MvalueM -... 
|with inj₂ (M′ , M⟹M′) =canonicalFormsLemma inj₂⊢L (L ·ᵀ M′valueL +... ,| γ⇒₂canonical-λᵀ valueL M⟹M′{x)} - ...{.A} |{N} inj₁{.B} = inj₂ ((N [ x := M ]) , β⇒ valueM with canonicalFormsLemma ⊢L valueL) ... |progress canonical-λᵀ𝔹-I₁ {x} {.A} {N} {.B} = inj₁ value-trueᵀ +progress inj₂𝔹-I₂ ((N= [inj₁ x := M ]) , β⇒ valueM)value-falseᵀ progress (𝔹-E {Γ} 𝔹-I₁{L} ={M} inj₁{N} value-trueᵀ -{A} ⊢L ⊢M ⊢N) with progress 𝔹-I₂ = inj₁ value-falseᵀ⊢L progress... | inj₂ (𝔹-EL′ {Γ}, {L} {M} {N} {A} ⊢L ⊢M ⊢NL⟹L′) = inj₂ ((ifᵀ L′ then M else N) , γ𝔹 with progress ⊢LL⟹L′) ... | inj₂inj₁ (L′valueL ,with L⟹L′)canonicalFormsLemma =⊢L inj₂ ((ifᵀ L′ then MvalueL +... else N) , γ𝔹 L⟹L′) -... | canonical-trueᵀ = inj₂ inj₁(M , valueLβ𝔹₁) +... with| canonicalFormsLemmacanonical-falseᵀ ⊢L= inj₂ valueL -... | canonical-trueᵀ = inj₂ (M , β𝔹₁) -... | canonical-falseᵀ = inj₂ (N , β𝔹₂) @@ -1663,100 +1663,100 @@ instead of induction on typing derivations.-progress : ∀ {M A} {M→ A}∅ ⊢ →M ∅∈ ⊢A M→ ∈ A →value valueM ⊎ ∃ Mλ ⊎N ∃→ λM N⟹ → M ⟹ N-_Proof_: By induction on the derivation of $$\vdash t : A$$. +_Proof_: By induction on the derivation of `\vdash t : A`. - The last rule of the derivation cannot be `var`, since a variable is never well typed in an empty context. - The `true`, `false`, and `abs` cases are trivial, since in - each of these cases we can see by inspecting the rule that $$t$$ + each of these cases we can see by inspecting the rule that `t` is a value. - - If the last rule of the derivation is `app`, then $$t$$ has the - form $$t_1\;t_2$$ for som e$$t_1$$ and $$t_2$$, where we know that - $$t_1$$ and $$t_2$$ are also well typed in the empty context; in particular, - there exists a type $$B$$ such that $$\vdash t_1 : A\to T$$ and - $$\vdash t_2 : B$$. By the induction hypothesis, either $$t_1$$ is a + - If the last rule of the derivation is `app`, then `t` has the + form `t_1\;t_2` for som e`t_1` and `t_2`, where we know that + `t_1` and `t_2` are also well typed in the empty context; in particular, + there exists a type `B` such that `\vdash t_1 : A\to T` and + `\vdash t_2 : B`. By the induction hypothesis, either `t_1` is a value or it can take a reduction step. - - If $$t_1$$ is a value, then consider $$t_2$$, which by the other + - If `t_1` is a value, then consider `t_2`, which by the other induction hypothesis must also either be a value or take a step. - - Suppose $$t_2$$ is a value. Since $$t_1$$ is a value with an - arrow type, it must be a lambda abstraction; hence $$t_1\;t_2$$ + - Suppose `t_2` is a value. Since `t_1` is a value with an + arrow type, it must be a lambda abstraction; hence `t_1\;t_2` can take a step by `red`. - - Otherwise, $$t_2$$ can take a step, and hence so can $$t_1\;t_2$$ + - Otherwise, `t_2` can take a step, and hence so can `t_1\;t_2` by `app2`. - - If $$t_1$$ can take a step, then so can $$t_1 t_2$$ by `app1`. + - If `t_1` can take a step, then so can `t_1 t_2` by `app1`. - - If the last rule of the derivation is `if`, then $$t = \text{if }t_1 - \text{ then }t_2\text{ else }t_3$$, where $$t_1$$ has type $$bool$$. By - the IH, $$t_1$$ either is a value or takes a step. + - If the last rule of the derivation is `if`, then `t = \text{if }t_1 + \text{ then }t_2\text{ else }t_3`, where `t_1` has type `bool`. By + the IH, `t_1` either is a value or takes a step. - - If $$t_1$$ is a value, then since it has type $$bool$$ it must be - either $$true$$ or $$false$$. If it is $$true$$, then $$t$$ steps - to $$t_2$$; otherwise it steps to $$t_3$$. 
+ - If `t_1` is a value, then since it has type `bool` it must be + either `true` or `false`. If it is `true`, then `t` steps + to `t_2`; otherwise it steps to `t_3`. - - Otherwise, $$t_1$$ takes a step, and therefore so does $$t$$ (by `if`). + - Otherwise, `t_1` takes a step, and therefore so does `t` (by `if`).-progress (Ax ()) +progress (⇒-I ⊢N) = inj₁ value-λᵀ +progress (⇒-E {Γ} {L} {M} {A} {B} (Ax⊢L ())⊢M) with progress ⊢L progress (⇒-I ⊢N) =...-postulate progress′ : ∀ {M A} → ∅ ⊢ M ∈ A → value M ⊎ ∃ λ N → M ⟹ N @@ -1775,23 +1775,23 @@ interesting proofs), the story goes like this: - The _preservation theorem_ is proved by induction on a typing derivation, pretty much as we did in the [Stlc]({{ "Stlc" | relative_url }}) chapter. The one case that is significantly different is the one for the - $$red$$ rule, whose definition uses the substitution operation. To see that + `red` rule, whose definition uses the substitution operation. To see that this step preserves typing, we need to know that the substitution itself does. So we prove a... - _substitution lemma_, stating that substituting a (closed) - term $$s$$ for a variable $$x$$ in a term $$t$$ preserves the type - of $$t$$. The proof goes by induction on the form of $$t$$ and + term `s` for a variable `x` in a term `t` preserves the type + of `t`. The proof goes by induction on the form of `t` and requires looking at all the different cases in the definition of substitition. This time, the tricky cases are the ones for variables and for function abstractions. In both cases, we - discover that we need to take a term $$s$$ that has been shown - to be well-typed in some context $$\Gamma$$ and consider the same - term $$s$$ in a slightly different context $$\Gamma'$$. For this + discover that we need to take a term `s` that has been shown + to be well-typed in some context `\Gamma` and consider the same + term `s` in a slightly different context `\Gamma'`. For this we prove a... - _context invariance_ lemma, showing that typing is preserved - under "inessential changes" to the context $$\Gamma$$---in + under "inessential changes" to the context `\Gamma`---in particular, changes that do not affect any of the free variables of the term. And finally, for this, we need a careful definition of... @@ -1806,614 +1806,614 @@ order... ### Free Occurrences -A variable $$x$$ _appears free in_ a term $$M$$ if $$M$$ contains some -occurrence of $$x$$ that is not under an abstraction over $$x$$. +A variable `x` _appears free in_ a term `M` if `M` contains some +occurrence of `x` that is not under an abstraction over `x`. 
For example:

-  - $$y$$ appears free, but $$x$$ does not, in $$λᵀ x ∈ (A ⇒ B) ⇒ x ·ᵀ y$$
-  - both $$x$$ and $$y$$ appear free in $$(λᵀ x ∈ (A ⇒ B) ⇒ x ·ᵀ y) ·ᵀ x$$
-  - no variables appear free in $$λᵀ x ∈ (A ⇒ B) ⇒ (λᵀ y ∈ A ⇒ x ·ᵀ y)$$
+  - `y` appears free, but `x` does not, in `λᵀ x ∈ (A ⇒ B) ⇒ x ·ᵀ y`
+  - both `x` and `y` appear free in `(λᵀ x ∈ (A ⇒ B) ⇒ x ·ᵀ y) ·ᵀ x`
+  - no variables appear free in `λᵀ x ∈ (A ⇒ B) ⇒ (λᵀ y ∈ A ⇒ x ·ᵀ y)`

Formally:

data _FreeIn_ : Id → Term → Set where
  free-varᵀ : ∀ {x} → x FreeIn (varᵀ x)
  free-λᵀ   : ∀ {x y A N} → y ≢ x → x FreeIn N → x FreeIn (λᵀ y ∈ A ⇒ N)
  free-·ᵀ₁  : ∀ {x L M} → x FreeIn L → x FreeIn (L ·ᵀ M)
  free-·ᵀ₂  : ∀ {x L M} → x FreeIn M → x FreeIn (L ·ᵀ M)
  free-ifᵀ₁ : ∀ {x L M N} → x FreeIn L → x FreeIn (ifᵀ L then M else N)
  free-ifᵀ₂ : ∀ {x L M N} → x FreeIn M → x FreeIn (ifᵀ L then M else N)
  free-ifᵀ₃ : ∀ {x L M N} → x FreeIn N → x FreeIn (ifᵀ L then M else N)

@@ -2423,72 +2423,72 @@ A term in which no variables appear free is said to be _closed_.

closed : Term → Set
closed M = ∀ {x} → ¬ (x FreeIn M)
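Two tiny proofs can serve as sanity checks here. They are not in the original
text; they use only the constructors just defined, together with the
identifiers `x`, `y` and the type `𝔹` from the earlier chapters, and the names
`example-free` and `closed-idᴮ` are invented for the illustration:

example-free : x FreeIn (varᵀ x ·ᵀ varᵀ y)
example-free = free-·ᵀ₁ free-varᵀ

closed-idᴮ : closed (λᵀ x ∈ 𝔹 ⇒ varᵀ x)
closed-idᴮ (free-λᵀ x≢x free-varᵀ) = x≢x refl

The second proof works because the only variable that can occur in
`λᵀ x ∈ 𝔹 ⇒ varᵀ x` is the bound `x` itself, and `free-λᵀ` then demands the
impossible `x ≢ x`.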
@@ -2505,683 +2505,683 @@ are really the crux of the lambda-calculus.)

### Substitution

To prove that substitution preserves typing, we first need a
technical lemma connecting free variables and typing contexts: If
-a variable $$x$$ appears free in a term $$M$$, and if we know $$M$$ is
-well typed in context $$Γ$$, then it must be the case that
-$$Γ$$ assigns a type to $$x$$.
+a variable `x` appears free in a term `M`, and if we know `M` is
+well typed in context `Γ`, then it must be the case that
+`Γ` assigns a type to `x`.

freeLemma : ∀ {x M A Γ} → x FreeIn M → Γ ⊢ M ∈ A → ∃ λ B → Γ x ≡ just B

-_Proof_: We show, by induction on the proof that $$x$$ appears
-  free in $$P$$, that, for all contexts $$Γ$$, if $$P$$ is well
-  typed under $$Γ$$, then $$Γ$$ assigns some type to $$x$$.
+_Proof_: We show, by induction on the proof that `x` appears
+  free in `P`, that, for all contexts `Γ`, if `P` is well
+  typed under `Γ`, then `Γ` assigns some type to `x`.

- - If the last rule used was `free-varᵀ`, then $$P = x$$, and from
-   the assumption that $$M$$ is well typed under $$Γ$$ we have
-   immediately that $$Γ$$ assigns a type to $$x$$.
+ - If the last rule used was `free-varᵀ`, then `P = x`, and from
+   the assumption that `P` is well typed under `Γ` we have
+   immediately that `Γ` assigns a type to `x`.

- - If the last rule used was `free-·₁`, then $$P = L ·ᵀ M$$ and $$x$$
-   appears free in $$L$$. Since $$L$$ is well typed under $$\Gamma$$,
-   we can see from the typing rules that $$L$$ must also be, and
-   the IH then tells us that $$Γ$$ assigns $$x$$ a type.
+ - If the last rule used was `free-·ᵀ₁`, then `P = L ·ᵀ M` and `x`
+   appears free in `L`. Since `P` is well typed under `\Gamma`,
+   we can see from the typing rules that `L` must also be, and
+   the IH then tells us that `Γ` assigns `x` a type.

- - Almost all the other cases are similar: $$x$$ appears free in a
-   subterm of $$P$$, and since $$P$$ is well typed under $$Γ$$, we
-   know the subterm of $$M$$ in which $$x$$ appears is well typed
-   under $$Γ$$ as well, and the IH gives us exactly the
+ - Almost all the other cases are similar: `x` appears free in a
+   subterm of `P`, and since `P` is well typed under `Γ`, we
+   know the subterm of `P` in which `x` appears is well typed
+   under `Γ` as well, and the IH gives us exactly the
   conclusion we want.

- - The only remaining case is `free-λᵀ`. In this case $$P =
-   λᵀ y ∈ A ⇒ N$$, and $$x$$ appears free in $$N$$; we also know that
-   $$x$$ is different from $$y$$. The difference from the previous
-   cases is that whereas $$P$$ is well typed under $$\Gamma$$, its
-   body $$N$$ is well typed under $$(Γ , y ↦ A)$$, so the IH
-   allows us to conclude that $$x$$ is assigned some type by the
-   extended context $$(Γ , y ↦ A)$$. To conclude that $$Γ$$
-   assigns a type to $$x$$, we appeal the decidable equality for names
-   `_≟_`, noting that $$x$$ and $$y$$ are different variables.
+ - The only remaining case is `free-λᵀ`. In this case `P =
+   λᵀ y ∈ A ⇒ N`, and `x` appears free in `N`; we also know that
+   `x` is different from `y`. The difference from the previous
+   cases is that whereas `P` is well typed under `\Gamma`, its
+   body `N` is well typed under `(Γ , y ↦ A)`, so the IH
+   allows us to conclude that `x` is assigned some type by the
+   extended context `(Γ , y ↦ A)`. To conclude that `Γ`
+   assigns a type to `x`, we appeal to the decidable equality for names
+   `_≟_`, noting that `x` and `y` are different variables.

freeLemma free-varᵀ (Ax Γx≡justA) = (_ , Γx≡justA)
freeLemma (free-·ᵀ₁ x∈L) (⇒-E ⊢L ⊢M) = freeLemma x∈L ⊢L
freeLemma (free-·ᵀ₂ x∈M) (⇒-E ⊢L ⊢M) = freeLemma x∈M ⊢M
freeLemma (free-ifᵀ₁ x∈L) (𝔹-E ⊢L ⊢M ⊢N) = freeLemma x∈L ⊢L
freeLemma (free-ifᵀ₂ x∈M) (𝔹-E ⊢L ⊢M ⊢N) = freeLemma x∈M ⊢M
freeLemma (free-ifᵀ₃ x∈N) (𝔹-E ⊢L ⊢M ⊢N) = freeLemma x∈N ⊢N
freeLemma (free-λᵀ {x} {y} y≢x x∈N) (⇒-I ⊢N) with freeLemma x∈N ⊢N
... | Γx=justC with y ≟ x
... | yes y≡x = ⊥-elim (y≢x y≡x)
... | no  _   = Γx=justC

-[A subtle point: if the first argument of $$free-λᵀ$$ was of type
-$$x ≢ y$$ rather than of type $$y ≢ x$$, then the type of the
-term $$Γx=justC$$ would not simplify properly.]
+[A subtle point: if the first argument of `free-λᵀ` was of type
+`x ≢ y` rather than of type `y ≢ x`, then the type of the
+term `Γx=justC` would not simplify properly.]

-Next, we'll need the fact that any term $$M$$ which is well typed in
+Next, we'll need the fact that any term `M` which is well typed in
the empty context is closed (it has no free variables).

#### Exercise: 2 stars, optional (∅⊢-closed)

postulate
  ∅⊢-closed : ∀ {M A} → ∅ ⊢ M ∈ A → closed M

@@ -3190,1094 +3190,1094 @@ the empty context is closed (it has no free variables).

-Sometimes, when we have a proof $$Γ ⊢ M ∈ A$$, we will need to
-replace $$Γ$$ by a different context $$Γ′$$. When is it safe
+Sometimes, when we have a proof `Γ ⊢ M ∈ A`, we will need to
+replace `Γ` by a different context `Γ′`. When is it safe
to do this?
Intuitively, it must at least be the case that -$$Γ′$$ assigns the same types as $$Γ$$ to all the variables -that appear free in $$M$$. In fact, this is the only condition that +`Γ′` assigns the same types as `Γ` to all the variables +that appear free in `M`. In fact, this is the only condition that is needed.-contradiction : ∀ {X : Set} → ∀ {x : X} → ¬ (_≡_ {A = Maybe X} (just x) nothing) contradiction () ∅⊢-closed′ : ∀ {M A} → ∅ ⊢ M ∈ A → closed M ∅⊢-closed′ {M} {A} ⊢M {x} x∈M with freeLemma x∈M ⊢M ... | (B , ∅x≡justB) = contradiction (trans (sym ∅x≡justB) (apply-∅ x))-weaken : ∀ {Γ Γ′ M A} → (∀ {x} → x FreeIn M → Γ x ≡ Γ′ x) → Γ ⊢ M ∈ A → Γ′ ⊢ M ∈ A_Proof_: By induction on the derivation of -$$Γ ⊢ M ∈ A$$. +`Γ ⊢ M ∈ A`. - - If the last rule in the derivation was `var`, then $$t = x$$ - and $$\Gamma x = T$$. By assumption, $$\Gamma' x = T$$ as well, and - hence $$\Gamma' \vdash t : T$$ by `var`. + - If the last rule in the derivation was `var`, then `t = x` + and `\Gamma x = T`. By assumption, `\Gamma' x = T` as well, and + hence `\Gamma' \vdash t : T` by `var`. - - If the last rule was `abs`, then $$t = \lambda y:A. t'$$, with - $$T = A\to B$$ and $$\Gamma, y : A \vdash t' : B$$. The - induction hypothesis is that, for any context $$\Gamma''$$, if - $$\Gamma, y:A$$ and $$\Gamma''$$ assign the same types to all the - free variables in $$t'$$, then $$t'$$ has type $$B$$ under - $$\Gamma''$$. Let $$\Gamma'$$ be a context which agrees with - $$\Gamma$$ on the free variables in $$t$$; we must show - $$\Gamma' \vdash \lambda y:A. t' : A\to B$$. + - If the last rule was `abs`, then `t = \lambda y:A. t'`, with + `T = A\to B` and `\Gamma, y : A \vdash t' : B`. The + induction hypothesis is that, for any context `\Gamma''`, if + `\Gamma, y:A` and `\Gamma''` assign the same types to all the + free variables in `t'`, then `t'` has type `B` under + `\Gamma''`. Let `\Gamma'` be a context which agrees with + `\Gamma` on the free variables in `t`; we must show + `\Gamma' \vdash \lambda y:A. t' : A\to B`. - By $$abs$$, it suffices to show that $$\Gamma', y:A \vdash t' : t'$$. - By the IH (setting $$\Gamma'' = \Gamma', y:A$$), it suffices to show - that $$\Gamma, y:A$$ and $$\Gamma', y:A$$ agree on all the variables - that appear free in $$t'$$. + By `abs`, it suffices to show that `\Gamma', y:A \vdash t' : t'`. + By the IH (setting `\Gamma'' = \Gamma', y:A`), it suffices to show + that `\Gamma, y:A` and `\Gamma', y:A` agree on all the variables + that appear free in `t'`. - Any variable occurring free in $$t'$$ must be either $$y$$ or - some other variable. $$\Gamma, y:A$$ and $$\Gamma', y:A$$ - clearly agree on $$y$$. Otherwise, note that any variable other - than $$y$$ that occurs free in $$t'$$ also occurs free in - $$t = \lambda y:A. t'$$, and by assumption $$\Gamma$$ and - $$\Gamma'$$ agree on all such variables; hence so do $$\Gamma, y:A$$ and - $$\Gamma', y:A$$. + Any variable occurring free in `t'` must be either `y` or + some other variable. `\Gamma, y:A` and `\Gamma', y:A` + clearly agree on `y`. Otherwise, note that any variable other + than `y` that occurs free in `t'` also occurs free in + `t = \lambda y:A. t'`, and by assumption `\Gamma` and + `\Gamma'` agree on all such variables; hence so do `\Gamma, y:A` and + `\Gamma', y:A`. - - If the last rule was `app`, then $$t = t_1\;t_2$$, with - $$\Gamma \vdash t_1:A\to T$$ and $$\Gamma \vdash t_2:A$$. 
-   One induction hypothesis states that for all contexts $$\Gamma'$$,
-   if $$\Gamma'$$ agrees with $$\Gamma$$ on the free variables in $$t_1$$,
-   then $$t_1$$ has type $$A\to T$$ under $$\Gamma'$$; there is a similar IH
-   for $$t_2$$. We must show that $$t_1\;t_2$$ also has type $$T$$ under
-   $$\Gamma'$$, given the assumption that $$\Gamma'$$ agrees with
-   $$\Gamma$$ on all the free variables in $$t_1\;t_2$$. By `app`, it
-   suffices to show that $$t_1$$ and $$t_2$$ each have the same type
-   under $$\Gamma'$$ as under $$\Gamma$$. But all free variables in
-   $$t_1$$ are also free in $$t_1\;t_2$$, and similarly for $$t_2$$;
+ - If the last rule was `app`, then `t = t_1\;t_2`, with
+   `\Gamma \vdash t_1:A\to T` and `\Gamma \vdash t_2:A`.
+   One induction hypothesis states that for all contexts `\Gamma'`,
+   if `\Gamma'` agrees with `\Gamma` on the free variables in `t_1`,
+   then `t_1` has type `A\to T` under `\Gamma'`; there is a similar IH
+   for `t_2`. We must show that `t_1\;t_2` also has type `T` under
+   `\Gamma'`, given the assumption that `\Gamma'` agrees with
+   `\Gamma` on all the free variables in `t_1\;t_2`. By `app`, it
+   suffices to show that `t_1` and `t_2` each have the same type
+   under `\Gamma'` as under `\Gamma`. But all free variables in
+   `t_1` are also free in `t_1\;t_2`, and similarly for `t_2`;
  hence the desired result follows from the induction hypotheses.

weaken Γ~Γ′ (Ax Γx≡justA) rewrite (Γ~Γ′ free-varᵀ) = Ax Γx≡justA
weaken {Γ} {Γ′} {λᵀ x ∈ A ⇒ N} Γ~Γ′ (⇒-I ⊢N) = ⇒-I (weaken Γx~Γ′x ⊢N)
  where
  Γx~Γ′x : ∀ {y} → y FreeIn N → (Γ , x ↦ A) y ≡ (Γ′ , x ↦ A) y
  Γx~Γ′x {y} y∈N with x ≟ y
  ... | yes refl = refl
  ... | no  x≢y  = Γ~Γ′ (free-λᵀ x≢y y∈N)
weaken Γ~Γ′ (⇒-E ⊢L ⊢M) = ⇒-E (weaken (Γ~Γ′ ∘ free-·ᵀ₁) ⊢L) (weaken (Γ~Γ′ ∘ free-·ᵀ₂) ⊢M)
weaken Γ~Γ′ 𝔹-I₁ = 𝔹-I₁
weaken Γ~Γ′ 𝔹-I₂ = 𝔹-I₂
weaken Γ~Γ′ (𝔹-E ⊢L ⊢M ⊢N)
  = 𝔹-E (weaken (Γ~Γ′ ∘ free-ifᵀ₁) ⊢L) (weaken (Γ~Γ′ ∘ free-ifᵀ₂) ⊢M) (weaken (Γ~Γ′ ∘ free-ifᵀ₃) ⊢N)

{- replaceCtxt f (var x x∶A ) rewrite f var = var x x∶A

@@ -4307,238 +4307,238 @@ preserves types---namely, the observation that _substitution_

preserves types. Formally, the so-called _Substitution Lemma_
says this: Suppose we
-have a term $$N$$ with a free variable $$x$$, and suppose we've been
-able to assign a type $$B$$ to $$N$$ under the assumption that $$x$$ has
-some type $$A$$. Also, suppose that we have some other term $$V$$ and
-that we've shown that $$V$$ has type $$A$$. Then, since $$V$$ satisfies
-the assumption we made about $$x$$ when typing $$N$$, we should be
-able to substitute $$V$$ for each of the occurrences of $$x$$ in $$N$$
-and obtain a new term that still has type $$B$$.
+have a term `N` with a free variable `x`, and suppose we've been
+able to assign a type `B` to `N` under the assumption that `x` has
+some type `A`. Also, suppose that we have some other term `V` and
+that we've shown that `V` has type `A`.
Then, since `V` satisfies +the assumption we made about `x` when typing `N`, we should be +able to substitute `V` for each of the occurrences of `x` in `N` +and obtain a new term that still has type `B`. -_Lemma_: If $$Γ , x ↦ A ⊢ N ∈ B$$ and $$∅ ⊢ V ∈ A$$, then -$$Γ ⊢ (N [ x := V ]) ∈ B$$. +_Lemma_: If `Γ , x ↦ A ⊢ N ∈ B` and `∅ ⊢ V ∈ A`, then +`Γ ⊢ (N [ x := V ]) ∈ B`.@@ -5999,584 +5889,584 @@ preservation-[:=] {Γ} {x} {A} {varᵀ x′} {B} {V} (Ax {.(Γ , ### Main Theorem We now have the tools we need to prove preservation: if a closed -term $$M$$ has type $$A$$ and takes a step to $$N$$, then $$N$$ -is also a closed term with type $$A$$. In other words, small-step +term `M` has type `A` and takes a step to `N`, then `N` +is also a closed term with type `A`. In other words, small-step reduction preserves types.-preservation-[:=] : ∀ {Γ x A N B V} → (Γ , x ↦ A) ⊢ N ∈ B → ∅ ⊢ V ∈ A → Γ ⊢ (N [ x := V ]) ∈ BOne technical subtlety in the statement of the lemma is that -we assign $$V$$ the type $$A$$ in the _empty_ context---in other -words, we assume $$V$$ is closed. This assumption considerably -simplifies the $$λᵀ$$ case of the proof (compared to assuming -$$Γ ⊢ V ∈ A$$, which would be the other reasonable assumption +we assign `V` the type `A` in the _empty_ context---in other +words, we assume `V` is closed. This assumption considerably +simplifies the `λᵀ` case of the proof (compared to assuming +`Γ ⊢ V ∈ A`, which would be the other reasonable assumption at this point) because the context invariance lemma then tells us -that $$V$$ has type $$A$$ in any context at all---we don't have to -worry about free variables in $$V$$ clashing with the variable being -introduced into the context by $$λᵀ$$. +that `V` has type `A` in any context at all---we don't have to +worry about free variables in `V` clashing with the variable being +introduced into the context by `λᵀ`. The substitution lemma can be viewed as a kind of "commutation" property. Intuitively, it says that substitution and typing can be done in either order: we can either assign types to the terms -$$N$$ and $$V$$ separately (under suitable contexts) and then combine +`N` and `V` separately (under suitable contexts) and then combine them using substitution, or we can substitute first and then -assign a type to $$N [ x := V ]$$---the result is the same either +assign a type to `N [ x := V ]`---the result is the same either way. -_Proof_: We show, by induction on $$N$$, that for all $$A$$ and -$$Γ$$, if $$Γ , x ↦ A \vdash N ∈ B$$ and $$∅ ⊢ V ∈ A$$, then -$$Γ \vdash N [ x := V ] ∈ B$$. +_Proof_: We show, by induction on `N`, that for all `A` and +`Γ`, if `Γ , x ↦ A \vdash N ∈ B` and `∅ ⊢ V ∈ A`, then +`Γ \vdash N [ x := V ] ∈ B`. - - If $$N$$ is a variable there are two cases to consider, - depending on whether $$N$$ is $$x$$ or some other variable. + - If `N` is a variable there are two cases to consider, + depending on whether `N` is `x` or some other variable. - - If $$N = varᵀ x$$, then from the fact that $$Γ , x ↦ A ⊢ N ∈ B$$ - we conclude that $$A = B$$. We must show that $$x [ x := V] = - V$$ has type $$A$$ under $$Γ$$, given the assumption that - $$V$$ has type $$A$$ under the empty context. This + - If `N = varᵀ x`, then from the fact that `Γ , x ↦ A ⊢ N ∈ B` + we conclude that `A = B`. We must show that `x [ x := V] = + V` has type `A` under `Γ`, given the assumption that + `V` has type `A` under the empty context. 
This follows from context invariance: if a closed term has type - $$A$$ in the empty context, it has that type in any context. + `A` in the empty context, it has that type in any context. - - If $$N$$ is some variable $$x′$$ different from $$x$$, then - we need only note that $$x′$$ has the same type under $$Γ , x ↦ A$$ - as under $$Γ$$. + - If `N` is some variable `x′` different from `x`, then + we need only note that `x′` has the same type under `Γ , x ↦ A` + as under `Γ`. - - If $$N$$ is an abstraction $$λᵀ x′ ∈ A′ ⇒ N′$$, then the IH tells us, - for all $$Γ′$$́ and $$B′$$, that if $$Γ′ , x ↦ A ⊢ N′ ∈ B′$$ - and $$∅ ⊢ V ∈ A$$, then $$Γ′ ⊢ N′ [ x := V ] ∈ B′$$. + - If `N` is an abstraction `λᵀ x′ ∈ A′ ⇒ N′`, then the IH tells us, + for all `Γ′`́ and `B′`, that if `Γ′ , x ↦ A ⊢ N′ ∈ B′` + and `∅ ⊢ V ∈ A`, then `Γ′ ⊢ N′ [ x := V ] ∈ B′`. The substitution in the conclusion behaves differently - depending on whether $$x$$ and $$x′$$ are the same variable. + depending on whether `x` and `x′` are the same variable. - First, suppose $$x ≡ x′$$. Then, by the definition of - substitution, $$N [ x := V] = N$$, so we just need to show $$Γ ⊢ N ∈ B$$. - But we know $$Γ , x ↦ A ⊢ N ∈ B$$ and, since $$x ≡ x′$$ - does not appear free in $$λᵀ x′ ∈ A′ ⇒ N′$$, the context invariance - lemma yields $$Γ ⊢ N ∈ B$$. + First, suppose `x ≡ x′`. Then, by the definition of + substitution, `N [ x := V] = N`, so we just need to show `Γ ⊢ N ∈ B`. + But we know `Γ , x ↦ A ⊢ N ∈ B` and, since `x ≡ x′` + does not appear free in `λᵀ x′ ∈ A′ ⇒ N′`, the context invariance + lemma yields `Γ ⊢ N ∈ B`. - Second, suppose $$x ≢ x′$$. We know $$Γ , x ↦ A , x′ ↦ A′ ⊢ N′ ∈ B′$$ + Second, suppose `x ≢ x′`. We know `Γ , x ↦ A , x′ ↦ A′ ⊢ N′ ∈ B′` by inversion of the typing relation, from which - $$Γ , x′ ↦ A′ , x ↦ A ⊢ N′ ∈ B′$$ follows by update permute, - so the IH applies, giving us $$Γ , x′ ↦ A′ ⊢ N′ [ x := V ] ∈ B′$$ - By $$⇒-I$$, we have $$Γ ⊢ λᵀ x′ ∈ A′ ⇒ (N′ [ x := V ]) ∈ A′ ⇒ B′$$ - and the definition of substitution (noting $$x ≢ x′$$) gives - $$Γ ⊢ (λᵀ x′ ∈ A′ ⇒ N′) [ x := V ] ∈ A′ ⇒ B′$$ as required. + `Γ , x′ ↦ A′ , x ↦ A ⊢ N′ ∈ B′` follows by update permute, + so the IH applies, giving us `Γ , x′ ↦ A′ ⊢ N′ [ x := V ] ∈ B′` + By `⇒-I`, we have `Γ ⊢ λᵀ x′ ∈ A′ ⇒ (N′ [ x := V ]) ∈ A′ ⇒ B′` + and the definition of substitution (noting `x ≢ x′`) gives + `Γ ⊢ (λᵀ x′ ∈ A′ ⇒ N′) [ x := V ] ∈ A′ ⇒ B′` as required. - - If $$N$$ is an application $$L′ ·ᵀ M′$$, the result follows + - If `N` is an application `L′ ·ᵀ M′`, the result follows straightforwardly from the definition of substitution and the induction hypotheses. @@ -4547,425 +4547,425 @@ $$Γ \vdash N [ x := V ] ∈ B$$. We need a couple of lemmas. A closed term can be weakened to any context, and just is injective.-weaken-closed : ∀ {V A Γ} → ∅ ⊢ V ∈ A → Γ ⊢ V ∈ A weaken-closed {V} {A} {Γ} ⊢V = weaken Γ~Γ′ ⊢V where Γ~Γ′ : ∀ {x} → x FreeIn V → ∅ x ≡ Γ x Γ~Γ′ {x} x∈V = ⊥-elim (x∉V x∈V) where x∉V : ¬ (x FreeIn V) x∉V = ∅⊢-closed ⊢V {x} just-injective : ∀ {X : Set} {x y : X} → _≡_ {A = Maybe X} (just x) (just y) → x ≡ y just-injective refl = refl @@ -4973,1024 +4973,914 @@ We need a couple of lemmas. 
preservation-[:=] {_} {x} (Ax {_} {x′} [Γ,x↦A]x′≡B) ⊢V with x ≟ x′
...| yes x≡x′ rewrite just-injective [Γ,x↦A]x′≡B = weaken-closed ⊢V
...| no  x≢x′ = Ax [Γ,x↦A]x′≡B
preservation-[:=] {Γ} {x} {A} {λᵀ x′ ∈ A′ ⇒ N′} {.A′ ⇒ B′} {V} (⇒-I ⊢N′) ⊢V with x ≟ x′
...| yes x≡x′ rewrite x≡x′ = weaken Γ′~Γ (⇒-I ⊢N′)
  where
  Γ′~Γ : ∀ {y} → y FreeIn (λᵀ x′ ∈ A′ ⇒ N′) → (Γ , x′ ↦ A) y ≡ Γ y
  Γ′~Γ {y} (free-λᵀ x′≢y y∈N′) with x′ ≟ y
  ...| yes x′≡y = ⊥-elim (x′≢y x′≡y)
  ...| no  _    = refl
...| no x≢x′ = ⇒-I ⊢N′V
  where
  x′x⊢N′ : (Γ , x′ ↦ A′ , x ↦ A) ⊢ N′ ∈ B′
  x′x⊢N′ rewrite update-permute Γ x A x′ A′ x≢x′ = ⊢N′
  ⊢N′V : (Γ , x′ ↦ A′) ⊢ N′ [ x := V ] ∈ B′
  ⊢N′V = preservation-[:=] x′x⊢N′ ⊢V
preservation-[:=] (⇒-E ⊢L ⊢M) ⊢V = ⇒-E (preservation-[:=] ⊢L ⊢V) (preservation-[:=] ⊢M ⊢V)
preservation-[:=] 𝔹-I₁ ⊢V = 𝔹-I₁
preservation-[:=] 𝔹-I₂ ⊢V = 𝔹-I₂
preservation-[:=] (𝔹-E ⊢L ⊢M ⊢N) ⊢V =
  𝔹-E (preservation-[:=] ⊢L ⊢V) (preservation-[:=] ⊢M ⊢V) (preservation-[:=] ⊢N ⊢V)

preservation : ∀ {M N A} → ∅ ⊢ M ∈ A → M ⟹ N → ∅ ⊢ N ∈ A

-_Proof_: By induction on the derivation of $$\vdash t : T$$.
+_Proof_: By induction on the derivation of `\vdash t : T`.

-- We can immediately rule out $$var$$, $$abs$$, $$T_True$$, and
-  $$T_False$$ as the final rules in the derivation, since in each of
-  these cases $$t$$ cannot take a step.
+- We can immediately rule out `var`, `abs`, `T_True`, and
+  `T_False` as the final rules in the derivation, since in each of
+  these cases `t` cannot take a step.

-- If the last rule in the derivation was $$app$$, then $$t = t_1
-  t_2$$.
There are three cases to consider, one for each rule that - could have been used to show that $$t_1 t_2$$ takes a step to $$t'$$. +- If the last rule in the derivation was `app`, then `t = t_1 + t_2`. There are three cases to consider, one for each rule that + could have been used to show that `t_1 t_2` takes a step to `t'`. - - If $$t_1 t_2$$ takes a step by $$Sapp1$$, with $$t_1$$ stepping to - $$t_1'$$, then by the IH $$t_1'$$ has the same type as $$t_1$$, and - hence $$t_1' t_2$$ has the same type as $$t_1 t_2$$. + - If `t_1 t_2` takes a step by `Sapp1`, with `t_1` stepping to + `t_1'`, then by the IH `t_1'` has the same type as `t_1`, and + hence `t_1' t_2` has the same type as `t_1 t_2`. - - The $$Sapp2$$ case is similar. + - The `Sapp2` case is similar. - - If $$t_1 t_2$$ takes a step by $$Sred$$, then $$t_1 = - \lambda x:t_{11}.t_{12}$$ and $$t_1 t_2$$ steps to $$$$x:=t_2$$t_{12}$$; the + - If `t_1 t_2` takes a step by `Sred`, then `t_1 = + \lambda x:t_{11}.t_{12}` and `t_1 t_2` steps to ``x:=t_2`t_{12}`; the desired result now follows from the fact that substitution preserves types. - - If the last rule in the derivation was $$if$$, then $$t = if t_1 - then t_2 else t_3$$, and there are again three cases depending on - how $$t$$ steps. + - If the last rule in the derivation was `if`, then `t = if t_1 + then t_2 else t_3`, and there are again three cases depending on + how `t` steps. - - If $$t$$ steps to $$t_2$$ or $$t_3$$, the result is immediate, since - $$t_2$$ and $$t_3$$ have the same type as $$t$$. + - If `t` steps to `t_2` or `t_3`, the result is immediate, since + `t_2` and `t_3` have the same type as `t`. - - Otherwise, $$t$$ steps by $$Sif$$, and the desired conclusion + - Otherwise, `t` steps by `Sif`, and the desired conclusion follows directly from the induction hypothesis.-preservation (Ax x₁) () preservation (⇒-I ⊢N) () preservation (⇒-E (⇒-I ⊢N) ⊢V) (β⇒ valueV) = preservation-[:=] ⊢N ⊢V preservation (⇒-E ⊢L ⊢M) (γ⇒₁ L⟹L′) with preservation ⊢L L⟹L′ ...| ⊢L′ = ⇒-E ⊢L′ ⊢M preservation (⇒-E ⊢L ⊢M) (γ⇒₂ valueL M⟹M′) with preservation ⊢M M⟹M′ ...| ⊢M′ = ⇒-E ⊢L ⊢M′ preservation 𝔹-I₁ () preservation 𝔹-I₂ () preservation (𝔹-E 𝔹-I₁ ⊢M ⊢N) β𝔹₁ = ⊢M preservation (𝔹-E 𝔹-I₂ ⊢M ⊢N) β𝔹₂ = ⊢N preservation (𝔹-E ⊢L ⊢M ⊢N) (γ𝔹 L⟹L′) with preservation ⊢L L⟹L′ ...| ⊢L′ = 𝔹-E ⊢L′ ⊢M ⊢N -- Writing out implicit parameters in full -- preservation (⇒-E {Γ} {λᵀ x ∈ A ⇒ N} {M} {.A} {B} (⇒-I {.Γ} {.x} {.N} {.A} {.B} ⊢N) ⊢M) (β⇒ {.x} {.A} {.N} {.M} valueM) -- = preservation-[:=] {Γ} {x} {A} {M} {N} {B} ⊢M ⊢N @@ -6587,11 +6477,11 @@ Proof with eauto. intros t t' T HT. generalize dependent t'. induction HT; intros t' HE; subst Gamma; subst; - try solve $$inversion HE; subst; auto$$. + try solve `inversion HE; subst; auto`. - (* app inversion HE; subst... (* Most of the cases are immediate by induction, - and $$eauto$$ takes care of them + and `eauto` takes care of them + (* Sred apply substitution_preserves_typing with t_{11}... inversion HT_1... @@ -6601,7 +6491,7 @@ Qed. An exercise in the [Stlc]({{ "Stlc" | relative_url }}) chapter asked about the subject expansion property for the simple language of arithmetic and boolean expressions. Does this property hold for STLC? That is, is it always the case -that, if $$t ==> t'$$ and $$has_type t' T$$, then $$empty \vdash t : T$$? If +that, if `t ==> t'` and `has_type t' T`, then `empty \vdash t : T`? If so, prove it. If not, give a counter-example not involving conditionals. @@ -6620,7 +6510,7 @@ Corollary soundness : forall t t' T, ~(stuck t'). 
Proof. intros t t' T Hhas_type Hmulti. unfold stuck. - intros $$Hnf Hnot_val$$. unfold normal_form in Hnf. + intros `Hnf Hnot_val`. unfold normal_form in Hnf. induction Hmulti. @@ -6637,10 +6527,10 @@ Formalize this statement and prove it. #### Exercise: 1 star (progress_preservation_statement) Without peeking at their statements above, write down the progress and preservation theorems for the simply typed lambda-calculus. -$$$$ +`` #### Exercise: 2 stars (stlc_variation1) -Suppose we add a new term $$zap$$ with the following reduction rule +Suppose we add a new term `zap` with the following reduction rule --------- (ST_Zap) t ==> zap @@ -6655,7 +6545,7 @@ the presence of these rules? For each property, write either "remains true" or "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -6663,7 +6553,7 @@ false, give a counterexample. #### Exercise: 2 stars (stlc_variation2) -Suppose instead that we add a new term $$foo$$ with the following +Suppose instead that we add a new term `foo` with the following reduction rules: ----------------- (ST_Foo1) @@ -6677,20 +6567,20 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress - Preservation #### Exercise: 2 stars (stlc_variation3) -Suppose instead that we remove the rule $$Sapp1$$ from the $$step$$ +Suppose instead that we remove the rule `Sapp1` from the `step` relation. Which of the following properties of the STLC remain true in the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -6708,7 +6598,7 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -6730,7 +6620,7 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -6752,7 +6642,7 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -6772,7 +6662,7 @@ the presence of this rule? For each one, write either "remains true" or else "becomes false." If a property becomes false, give a counterexample. - - Determinism of $$step$$ + - Determinism of `step` - Progress @@ -6814,10 +6704,10 @@ with arithmetic. Specifically: the definition of values through the Type Soundness theorem), and paste it into the file at this point. - - Extend the definitions of the $$subst$$ operation and the $$step$$ + - Extend the definitions of the `subst` operation and the `step` relation to include appropriate clauses for the arithmetic operators. - - Extend the proofs of all the properties (up to $$soundness$$) of + - Extend the proofs of all the properties (up to `soundness`) of the original STLC to deal with the new syntactic forms. Make sure Agda accepts the whole file. 
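For the last exercise, one possible shape for the arithmetic extension is
sketched below. This is not the exercise's official solution; every name
(`Typeᴬ`, `Termᴬ`, `natᴬ`, and so on) is invented for the illustration, and it
assumes `Id` from the Maps chapter and `ℕ` from the standard library are in
scope:

open import Data.Nat using (ℕ)

data Typeᴬ : Set where
  𝔹ᴬ ℕᴬ : Typeᴬ
  _⇒ᴬ_  : Typeᴬ → Typeᴬ → Typeᴬ

data Termᴬ : Set where
  varᴬ    : Id → Termᴬ
  λᴬ_∈_⇒_ : Id → Typeᴬ → Termᴬ → Termᴬ
  _·ᴬ_    : Termᴬ → Termᴬ → Termᴬ
  natᴬ    : ℕ → Termᴬ
  sucᴬ    : Termᴬ → Termᴬ
  _+ᴬ_    : Termᴬ → Termᴬ → Termᴬ

-- The new substitution clauses would simply push the substitution into the
-- subterms; only the variable and λ clauses compare names, exactly as in the
-- original definition.  For instance (sketch):
--   (sucᴬ N) [ x := V ] = sucᴬ (N [ x := V ])
--   (L +ᴬ M) [ x := V ] = (L [ x := V ]) +ᴬ (M [ x := V ])

The step relation would presumably gain congruence rules for `sucᴬ` and
`_+ᴬ_` plus a rule adding two numeric values, and the progress and
preservation proofs then gain one straightforward case per new rule.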
diff --git a/src/Maps.lagda b/src/Maps.lagda index 7d4eebd1..a8d8040d 100644 --- a/src/Maps.lagda +++ b/src/Maps.lagda @@ -86,8 +86,6 @@ y = id "y" z = id "z" \end{code} -## Extensionality - ## Total Maps Our main job in this chapter will be to build a definition of @@ -112,8 +110,8 @@ TotalMap : Set → Set TotalMap A = Id → A \end{code} -Intuitively, a total map over anfi element type $$A$$ _is_ just a -function that can be used to look up ids, yielding $$A$$s. +Intuitively, a total map over anfi element type `A` _is_ just a +function that can be used to look up ids, yielding `A`s. \begin{code} module TotalMap where @@ -129,10 +127,12 @@ applied to any id. \end{code} More interesting is the update function, which (as before) takes -a map $$ρ$$, a key $$x$$, and a value $$v$$ and returns a new map that -takes $$x$$ to $$v$$ and takes every other key to whatever $$ρ$$ does. +a map `ρ`, a key `x`, and a value `v` and returns a new map that +takes `x` to `v` and takes every other key to whatever `ρ` does. \begin{code} + infixl 100 _,_↦_ + _,_↦_ : ∀ {A} → TotalMap A → Id → A → TotalMap A (ρ , x ↦ v) y with x ≟ y ... | yes x=y = v @@ -140,30 +140,11 @@ takes $$x$$ to $$v$$ and takes every other key to whatever $$ρ$$ does. \end{code} This definition is a nice example of higher-order programming. -The update function takes a _function_ $$ρ$$ and yields a new +The update function takes a _function_ `ρ` and yields a new function that behaves like the desired map. -We define handy abbreviations for updating a map two, three, or four times. - - - -\begin{code} - _,_↦_,_↦_ : ∀ {A} → TotalMap A → Id → A → Id → A → TotalMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ = (ρ , x₁ ↦ v₁), x₂ ↦ v₂ - - _,_↦_,_↦_,_↦_ : ∀ {A} → TotalMap A → Id → A → Id → A → Id → A → TotalMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ , x₃ ↦ v₃ = ((ρ , x₁ ↦ v₁), x₂ ↦ v₂), x₃ ↦ v₃ - - _,_↦_,_↦_,_↦_,_↦_ : ∀ {A} → TotalMap A → Id → A → Id → A → Id → A → Id → A → TotalMap A - ρ , x₁ ↦ v₁ , x₂ ↦ v₂ , x₃ ↦ v₃ , x₄ ↦ v₄ = (((ρ , x₁ ↦ v₁), x₂ ↦ v₂), x₃ ↦ v₃), x₄ ↦ v₄ -\end{code} - -For example, we can build a map taking ids to naturals, where $$x$$ -maps to 42, $$y$$ maps to 69, and every other key maps to 0, as follows: +For example, we can build a map taking ids to naturals, where `x` +maps to 42, `y` maps to 69, and every other key maps to 0, as follows: \begin{code} ρ₀ : TotalMap ℕ @@ -206,9 +187,9 @@ The `always` map returns its default element for all keys: