Compare commits

...

51 commits
8980 ... dev

Author Message Date
Wen Kokke
f331b37077
Cache ~/.ghcup. 2021-09-02 13:57:57 +01:00
Wen Kokke
9780deb997
Attempt to split up stack build. 2021-09-02 13:52:44 +01:00
Wen Kokke
f62da73cd4
Operation timed out. Restarting workflow in parts. 2021-09-02 13:40:53 +01:00
Wen Kokke
6cf717b152
Remove cache operations for Haskell, since they're included in haskell/actions/setup. 2021-09-01 21:19:25 +01:00
Wen Kokke
70c9a61de9
Changed cache id to match site builder name. 2021-09-01 21:17:01 +01:00
Wen Kokke
b100c017c6
Cache site builder separately. 2021-09-01 21:16:38 +01:00
Wen Kokke
19eca137ae
Abbreviated task ids. 2021-09-01 21:01:07 +01:00
Wen Kokke
c8d5f77c0e
Added key for cache-htmlproofer. 2021-09-01 20:56:20 +01:00
Wen Kokke
115547830d
Minor changes. 2021-09-01 20:46:42 +01:00
Wen Kokke
b612ae8a39
Minor updates to workflow. 2021-09-01 20:43:54 +01:00
Wen Kokke
638cbcdd00
Add Agda version to matrix. 2021-09-01 20:20:35 +01:00
Wen Kokke
ab6ee861d8
Ensure Makefile is loud. 2021-09-01 20:16:43 +01:00
Wen Kokke
53847909eb
Cache .cabal, .ghc, and .ghcup. 2021-09-01 20:06:09 +01:00
Wen Kokke
db61ac1c18
Fixed stack file after accidentially messing with it. 2021-09-01 20:03:30 +01:00
Wen Kokke
d697a4f322
Edited makefile and workflow to use system GHC. 2021-09-01 20:02:55 +01:00
Wen Kokke
a40bd03d35
Switch workflow back to macOS. 2021-09-01 19:50:38 +01:00
Wen Kokke
ac07fbf973
Switch workflow to Ubuntu. 2021-09-01 19:48:54 +01:00
Wen Kokke
20a928f74e
Wrote a more comprehensive Build action. 2021-09-01 19:42:29 +01:00
Wen Kokke
701b21bf49
Woops. Moved workflow into .github/workflows. 2021-09-01 19:20:30 +01:00
Wen Kokke
703afcac97
Created basic GitHub action. 2021-09-01 19:18:10 +01:00
wadler
af45cdcfdb Merge branch 'dev' of github.com:plfa/plfa.github.io into dev 2021-08-29 18:37:14 +01:00
Philip Wadler
cdcc12f4cc
Merge pull request #588 from lorenzo-lipparini/patch-2
Fixing minor coding style inconsistencies in part1/Quantifiers
2021-08-29 18:22:36 +01:00
Lorenzo Lipparini
d2c14f40e1
Fix coding style inconsistencies in part1/Quantifiers 2021-08-29 15:26:34 +02:00
wadler
6ea7439507 Merge branch 'dev' of github.com:plfa/plfa.github.io into dev 2021-08-29 10:33:10 +01:00
Philip Wadler
70825b81c0
Merge pull request #587 from plfa/wadler-patch-2
Update Lists.lagda.md
2021-08-29 10:30:47 +01:00
Philip Wadler
3e03d1ee03
Update Lists.lagda.md 2021-08-29 10:29:46 +01:00
wadler
27006a8181 fixed Assignment4 2021-08-28 18:28:04 +01:00
wadler
ba41d279f1 small changes: Subtyping, Inference, Assignment 4, Exam 2021-08-28 18:25:22 +01:00
Wen Kokke
b1e9152bcc
Fixed links. 2021-08-25 22:30:13 +01:00
Wen Kokke
3fc94edf3d
Added slash back into Makefile. 2021-08-25 13:40:22 +01:00
Wen Kokke
e4d7c79aa6
Update .gitignore to disallow PDFs but include plfa.pdf. 2021-08-25 13:20:17 +01:00
Wen Kokke
78108f5750
Minor refactoring. 2021-08-25 13:17:15 +01:00
Wen Kokke
a5e1f77d22
Changed header anchors. Commited WIP fix for #577. 2021-08-25 11:53:30 +01:00
Wen Kokke
07d082c9db
Fixed syntax for assigning anchors. 2021-08-25 00:17:03 +01:00
Wen Kokke
4b5d775ba1
Finished announcement. 2021-08-24 23:42:18 +01:00
Wen Kokke
6d6a1b284a
Minor fix. 2021-08-24 23:02:11 +01:00
Wen Kokke
a9f85c9ab1
EPUB and PDF now more-or-less work 2021-08-24 22:41:18 +01:00
Wen Kokke
b07957d430
Fix EPUB. 2021-08-24 21:48:13 +01:00
Wen Kokke
64ff672300
Removed superfluous metadata template. 2021-08-24 19:23:11 +01:00
Wen Kokke
b4ae07e13c
Fixed PDF and EPUB generation to use toc.metadata. 2021-08-24 19:22:53 +01:00
Wen Kokke
e8df980d4b
Fixed PDF generate to use toc.metadata. 2021-08-24 15:00:50 +01:00
Wen Kokke
2987995c7a
WIP rewrite of book rendering. 2021-08-24 01:01:23 +01:00
Wen Kokke
587985aa9d
Fixed PDF generation. 2021-08-23 18:59:51 +01:00
Wen Kokke
d14b04ebf2
Fixed PDF generation. 2021-08-23 18:34:35 +01:00
Wen Kokke
7345c36d80
Fixed binding issues with existential syntax. 2021-08-23 11:57:30 +01:00
Wen Kokke
a92e1a08c4
Fixed assignment 4. 2021-08-23 00:48:01 +01:00
Altariarite
73fdcf34bd
PDF generation (#575)
* include tools for pdf

* generating acknowledgements to include in the pdf

* changed makefile to call the makefile in pdf/
2021-08-22 19:56:17 +01:00
Yeradis P. Barbosa Marrero
327b0c2f92
fix README.md link to EPUB e-book (#553) 2021-08-22 19:55:36 +01:00
Altariarite
46371b34e8
Updated woff and css to include home brewed font (#570) 2021-08-22 19:53:43 +01:00
Liang-Ting Chen
43bdaa94f4
Use https instead of git for fetching the standard library (#572)
* Use https instead of git for fetching the standard library

* Clone the standard library shallowly
2021-08-22 19:52:55 +01:00
Liang-Ting Chen
09d04baefd
Change some instructions in the README.md (#573)
* Change git commands in the README.md for beginners

* Only clone PLFA shallowly.
* Correct the version of stdandard library for PLFA.

* Install `fix-whitespace` via Stackage

The program `fix-whitespace` is available on Stackage already.
2021-08-22 19:52:20 +01:00
55 changed files with 1349 additions and 550 deletions

.github/workflows/build.yml (vendored, new file, 117 lines)

@@ -0,0 +1,117 @@
on: [push, pull_request]
name: Build
jobs:
  build:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [macOS-latest]
        ghc: ['8.10.5']
        agda: ['2.6.1.3']
    steps:
      # Checkout
      - name: Checkout
        uses: actions/checkout@v2
        with:
          ref: dev
          fetch-depth: 1
          submodules: true
      - name: Update stack.yaml
        run: |
          stack config set system-ghc true
          stack config set install-ghc false
        shell: bash
      # Setup & Cache Haskell
      - name: Cache Haskell
        uses: actions/cache@v2
        id: cache-haskell
        with:
          path: |
            ~/.ghcup
          key: ${{ matrix.os }}-ghc-${{ matrix.ghc }}
      - name: Setup Haskell
        if: steps.cache-haskell.outputs.cache-hit != 'true'
        uses: haskell/actions/setup@v1
        with:
          enable-stack: true
          ghc-version: ${{ matrix.ghc }}
          cabal-version: 'latest'
          stack-version: 'latest'
      # Setup & Cache HTMLProofer
      - name: Get Ruby Info
        id: ruby-info
        run: |
          echo "::set-output name=ruby::$(ruby -e 'puts RUBY_VERSION')"
          echo "::set-output name=gem::$(gem --version)"
          echo "::set-output name=gemdir::$(gem env gemdir)"
        shell: bash
      - name: Cache HTMLProofer
        uses: actions/cache@v2
        id: cache-htmlproofer
        with:
          path: ${{ steps.ruby-info.outputs.gemdir }}
          key: ${{ matrix.os }}-ruby-${{ steps.ruby-info.outputs.ruby }}-gem-${{ steps.ruby-info.outputs.gem }}
      - name: Setup HTMLProofer
        if: steps.cache-htmlproofer.outputs.cache-hit != 'true'
        run: sudo gem install html-proofer
        shell: bash
      # Setup & Cache Agda, pandoc, and site builder
      - name: Cache .stack-work
        uses: actions/cache@v2
        id: cache-stack-work
        with:
          path: $GITHUB_WORKSPACE/.stack-work
          key: ${{ matrix.os }}-ghc-${{ matrix.ghc }}-agda-${{ matrix.agda }}
      - name: Build Agda
        if: steps.cache-site-builder.outputs.cache-hit != 'true'
        run: stack build Agda-${{ matrix.agda }}
        shell: bash
      - name: Build Pandoc
        if: steps.cache-site-builder.outputs.cache-hit != 'true'
        run: stack build pandoc-2.10.1
        shell: bash
      - name: Build other dependencies
        if: steps.cache-site-builder.outputs.cache-hit != 'true'
        run: stack build --only-dependencies
        shell: bash
      - name: Build Site Builder
        if: steps.cache-site-builder.outputs.cache-hit != 'true'
        run: stack build
        shell: bash
      # Build & Test Website
      - name: Build Website
        run: make build
        shell: bash
      - name: Test Website
        run: make test
        shell: bash
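The effectiveness of this workflow hinges on its cache keys, which interpolate the matrix values so that bumping the GHC or Agda version automatically invalidates the relevant cache. A minimal shell mock of how those keys are assembled (variable names here are illustrative, not part of the workflow):

```shell
# Illustrative mock of the workflow's cache-key interpolation;
# the values mirror the matrix entries above.
os="macOS-latest"
ghc="8.10.5"
agda="2.6.1.3"

# Key for the Haskell toolchain cache (~/.ghcup):
haskell_key="${os}-ghc-${ghc}"
# Key for the .stack-work cache, which also depends on the Agda version:
stack_work_key="${os}-ghc-${ghc}-agda-${agda}"

echo "$haskell_key"
echo "$stack_work_key"
```

Because the Agda version is part of only the second key, a toolchain cache survives an Agda bump while the `.stack-work` cache is rebuilt.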

.gitignore (vendored, 8 changed lines)

@@ -1,4 +1,5 @@
 ## Agda files
+_build/
 *.agdai
 .agda-stdlib.sed
 .links-*.sed
@@ -11,7 +12,6 @@ stack.yaml.lock
 dist-newstyle
 ## Jekyll files
-_site/
 .sass-cache/
 .agda-stdlib.sed
 .jekyll-metadata
@@ -32,6 +32,8 @@ Gemfile.lock
 auto/
 ## Misc build files
+out/
 *.zip
 versions/plfa.github.io-web-*/
+## Include plfa.pdf
+!plfa.pdf

.gitmodules (vendored, 3 changed lines)

@@ -1,3 +1,4 @@
 [submodule "standard-library"]
 	path = standard-library
-	url = git@github.com:agda/agda-stdlib.git
+	url = https://github.com/agda/agda-stdlib.git
+	shallow = true
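This diff switches the submodule URL from SSH to HTTPS (so anonymous clones work without a GitHub key) and marks it shallow. A quick way to sanity-check the resulting entry, sketched here against a throwaway copy of the file rather than a real checkout:

```shell
# Write the new .gitmodules entry to a temp file and read the url back
# with plain text tools; no network or git checkout required.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
[submodule "standard-library"]
	url = https://github.com/agda/agda-stdlib.git
	shallow = true
EOF

# Extract the value of the url key:
url=$(sed -n 's/^[[:space:]]*url = //p' "$tmp")
echo "$url"
rm -f "$tmp"
```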

Makefile (176 changed lines)

@ -1,11 +1,35 @@
################################################################################# #################################################################################
# Configuration # Configuration
################################################################################# #################################################################################
SITE_DIR := _site MAKE ?= make
CACHE_DIR := _cache STACK ?= stack
TMP_DIR := $(CACHE_DIR)/tmp
SITE_DIR := _site
RAW_DIR := $(SITE_DIR)/raw
CACHE_DIR := _cache
TMP_DIR := $(CACHE_DIR)/tmp
AGDA := $(STACK) exec agda -- --no-libraries --include-path=standard-library/src
PANDOC := $(STACK) exec pandoc --
#################################################################################
# Build PLFA site, EPUB, and PDF
#################################################################################
.PHONY: all
all:
$(MAKE) build
$(MAKE) epub-build
$(MAKE) pdf-build
.PHONY: all-clean
all-clean:
$(MAKE) clean
$(MAKE) epub-clean
$(MAKE) pdf-clean
################################################################################# #################################################################################
# Setup Git Hooks # Setup Git Hooks
@ -13,6 +37,7 @@ TMP_DIR := $(CACHE_DIR)/tmp
.PHONY: init .PHONY: init
init: setup-check-fix-whitespace setup-install-htmlproofer init: setup-check-fix-whitespace setup-install-htmlproofer
@echo "Setting up Git Hooks"
git config core.hooksPath .githooks git config core.hooksPath .githooks
@ -20,51 +45,52 @@ init: setup-check-fix-whitespace setup-install-htmlproofer
# Build PLFA site # Build PLFA site
################################################################################# #################################################################################
.PHONY: build-deps
build-deps:
$(STACK) build --only-dependencies
.PHONY: build .PHONY: build
build: \ build: standard-library/ChangeLog.md | build-deps
standard-library/ChangeLog.md @echo "Building website"
stack build && stack exec site build $(STACK) build
$(STACK) exec site build
standard-library/ChangeLog.md: standard-library/ChangeLog.md:
@echo "Updating Agda standard library"
git submodule init git submodule init
git submodule update --recursive git submodule update --recursive
################################################################################# #################################################################################
# Test generated site with HTMLProofer # Test generated site with HTMLProofer
################################################################################# #################################################################################
.PHONY: test .PHONY: test
test: setup-install-htmlproofer build test: setup-install-htmlproofer build
cd $(SITE_DIR) && htmlproofer \ @echo "Testing generated HTML using HTMLProofer"
--check-html \ cd $(SITE_DIR) && htmlproofer \
--disable-external \ --check-html \
--report-invalid-tags \ --disable-external \
--report-missing-names \ --report-invalid-tags \
--report-script-embeds \ --report-missing-names \
--report-missing-doctype \ --report-script-embeds \
--report-eof-tags \ --report-missing-doctype \
--report-mismatched-tags \ --report-eof-tags \
--check-img-http \ --report-mismatched-tags \
--check-opengraph \ --check-img-http \
--check-opengraph \
. .
#################################################################################
# Test generated EPUB with EPUBCheck
#################################################################################
.PHONY: test-epub
test-epub: setup-check-epubcheck build
epubcheck $(SITE_DIR)/plfa.epub
################################################################################# #################################################################################
# Automatically rebuild the site on changes, and start a preview server # Automatically rebuild the site on changes, and start a preview server
################################################################################# #################################################################################
.PHONY: watch .PHONY: watch
watch: \ watch: standard-library/ChangeLog.md | build-deps
standard-library/ChangeLog.md @echo "Watching for changes and rebuilding"
stack build && stack exec site watch $(STACK) build
$(STACK) exec site watch
################################################################################# #################################################################################
@ -72,8 +98,10 @@ watch: \
################################################################################# #################################################################################
.PHONY: update-contributors .PHONY: update-contributors
update-contributors: update-contributors: | build-deps
stack build && stack exec update-contributors @echo "Updating contributors from GitHub"
$(STACK) build
$(STACK) exec update-contributors
################################################################################# #################################################################################
@ -81,9 +109,10 @@ update-contributors:
################################################################################# #################################################################################
.PHONY: clean .PHONY: clean
clean: \ clean: standard-library/ChangeLog.md | build-deps
standard-library/ChangeLog.md @echo "Cleaning generated files for site"
stack build && stack exec site clean $(STACK) build
$(STACK) exec site clean
################################################################################# #################################################################################
@ -102,28 +131,28 @@ list:
.PHONY: publish .PHONY: publish
publish: setup-check-rsync publish: setup-check-rsync
@echo "Building site..." $(MAKE) all
make build @echo "Cleaning intermediate files"
@echo "Testing site..." rm -rf $(RAW_DIR)
make test $(MAKE) test
@echo "Creating web branch..." @echo "Creating web branch"
git fetch --all git fetch --all
git checkout -b web --track origin/web git checkout -b web --track origin/web
rsync -a \ rsync -a \
--filter='P _site/' \ --filter='P $(SITE_DIR)/' \
--filter='P _cache/' \ --filter='P $(CACHE_DIR)/' \
--filter='P .git/' \ --filter='P .git/' \
--filter='P .gitignore' \ --filter='P .gitignore' \
--filter='P .stack-work' \ --filter='P .stack-work' \
--filter='P .nojekyll' \ --filter='P .nojekyll' \
--filter='P CNAME' \ --filter='P CNAME' \
--delete-excluded \ --delete-excluded \
_site/ . $(SITE_DIR) .
git add -A git add -A
@echo "Publishing web branch..." @echo "Publishing web branch"
git commit -m "Publish." git commit -m "Publish."
git push origin web:web git push origin web:web
@echo "Deleting web branch..." @echo "Deleting web branch"
git checkout dev git checkout dev
git branch -D web git branch -D web
@ -142,18 +171,35 @@ ifeq (,$(wildcard $(PLFA_AFS_DIR)))
@exit 1 @exit 1
else else
ifeq (,$(wildcard $(PLFA_AFS_DIR)/html)) ifeq (,$(wildcard $(PLFA_AFS_DIR)/html))
@echo "Checkout latest version from GitHub"
git clone https://github.com/plfa/plfa.github.io.git --branch web --single-branch --depth 1 html git clone https://github.com/plfa/plfa.github.io.git --branch web --single-branch --depth 1 html
endif endif
cd $(PLFA_AFS_DIR)/html \ @echo "Checkout latest version from GitHub"
@cd $(PLFA_AFS_DIR)/html \
&& git fetch --depth 1 \ && git fetch --depth 1 \
&& git reset --hard origin/web \ && git reset --hard origin/web \
&& git clean -dfx && git clean -dfx
fsr setacl $(PLFA_AFS_DIR)/html system:groupwebserver rl @echo "Setting permissions to include web server"
@fsr setacl $(PLFA_AFS_DIR)/html system:groupwebserver rl
endif endif
################################################################################# #################################################################################
# Setup dependencies # Build EPUB
#################################################################################
include book/epub.mk
#################################################################################
# Build PDF
#################################################################################
include book/pdf.mk
#################################################################################
# Setup or check dependencies
################################################################################# #################################################################################
.PHONY: setup-check-stack .PHONY: setup-check-stack
@ -189,11 +235,12 @@ ifeq (,$(wildcard $(shell which fix-whitespace)))
@echo " stack install --stack-yaml stack-8.8.3.yaml" @echo " stack install --stack-yaml stack-8.8.3.yaml"
endif endif
.PHONY: setup-check-epubcheck .PHONY: setup-check-latexmk
setup-check-epubcheck: setup-check-latexmk:
ifeq (,$(wildcard $(shell which epubcheck))) ifeq (,$(wildcard $(shell which latexmk)))
@echo "The command you called requires EPUBCheck" @echo "The command you called requires Latexmk"
@echo "See: https://github.com/w3c/epubcheck" @echo "Latemk is included in MacTeX and MikTeX"
@echo "See: https://mg.readthedocs.io/latexmk.html"
endif endif
.PHONY: setup-check-rsync .PHONY: setup-check-rsync
@ -218,6 +265,19 @@ ifeq (,$(wildcard $(shell which bundle)))
gem install bundle gem install bundle
endif endif
.PHONY: setup-install-agda
setup-install-agda: setup-check-stack
ifeq (,$(wildcard $(shell stack exec which -- agda)))
@echo "Installing Agda"
stack build --only-dependencies
endif
.PHONY: setup-install-pandoc
setup-install-pandoc: setup-check-stack
ifeq (,$(wildcard $(shell stack exec which -- pandoc)))
@echo "Installing Pandoc"
stack build --only-dependencies
endif
################################################################################# #################################################################################
# Build legacy versions of website using Jekyll # Build legacy versions of website using Jekyll

@@ -94,20 +94,21 @@ Finished hello.
 ### Install PLFA and the Agda standard library
 
 You can get the latest version of Programming Language Foundations in Agda from GitHub, either by cloning the repository, or by downloading [the zip archive][plfa-dev]:
 ```bash
-git clone --recurse-submodules https://github.com/plfa/plfa.github.io plfa
+git clone --depth 1 --recurse-submodules --shallow-submodules https://github.com/plfa/plfa.github.io plfa
+# Remove `--depth 1` and `--shallow-submodules` if you want the complete git history of PLFA and the standard library.
 ```
 PLFA ships with the required version of the Agda standard library, so if you cloned with the `--recurse-submodules` flag, you've already got it, in the `standard-library` directory!
 If you forgot to add the `--recurse-submodules` flag, no worries, we can fix that!
 ```bash
 cd plfa/
-git submodule update --recursive
+git submodule update --init --recursive --depth 1
+# Remove `--depth 1` if you want the complete git history of the standard library.
 ```
 If you obtained PLFA by downloading the zip archive, you can get the required version of the Agda standard library from GitHub. You can either clone the repository and switch to the correct branch, or you can download the [the zip archive][agda-stdlib]:
 ```bash
-git clone https://github.com/agda/agda-stdlib.git agda-stdlib
-cd agda-stdlib
-git checkout v1.3
+git clone https://github.com/agda/agda-stdlib.git --branch v1.6 --depth 1 agda-stdlib
+# Remove `--depth 1` if you want the complete git history of the standard library.
 ```
 Finally, we need to let Agda know where to find the Agda standard library.
 You'll need the path where you installed the standard library. Check to see that the file “standard-library.agda-lib” exists, and make a note of the path to this file.
@@ -318,9 +319,7 @@ The repository comes with several Git hooks:
 You can install these Git hooks by calling `make init`.
 You can install [fix-whitespace][fix-whitespace] by running:
 ```bash
-git clone https://github.com/agda/fix-whitespace
-cd fix-whitespace/
-stack install --stack-yaml stack-8.8.3.yaml
+stack install fix-whitespace
 ```
 If you want Stack to use your system installation of GHC, follow the instructions for [Using an existing installation of GHC](#using-an-existing-installation-of-ghc).

Binary file not shown.

book/epub.css (new file, 90 lines)

@@ -0,0 +1,90 @@
/* This is Pandoc's default ebook CSS, modified slightly. */
/* https://github.com/jgm/pandoc/blob/master/data/epub.css */
@font-face {
font-family: 'DejaVu-mononoki-Symbola-Droid';
font-weight: normal;
font-style: normal;
src: url('../fonts/DejaVu-mononoki-Symbola-Droid.woff');
}
body {
margin: 5%;
text-align: justify;
font-size: medium;
}
code {
font-family: 'DejaVu-mononoki-Symbola-Droid', monospace;
}
h1,
h2,
h3,
h4,
h5,
h6 {
text-align: left;
}
nav#toc ol,
nav#landmarks ol {
padding: 0;
margin-left: 1em;
}
nav#toc ol li,
nav#landmarks ol li {
list-style-type: none;
margin: 0;
padding: 0;
}
a.footnote-ref {
vertical-align: super;
}
em,
em em em,
em em em em em {
font-style: italic;
}
em em,
em em em em {
font-style: normal;
}
code {
white-space: pre-wrap;
}
span.smallcaps {
font-variant: small-caps;
}
span.underline {
text-decoration: underline;
}
q {
  quotes: "“" "”" "‘" "’";
}
div.column {
display: inline-block;
vertical-align: top;
width: 50%;
}
div.hanging-indent {
margin-left: 1.5em;
text-indent: -1.5em;
}
/* Workaround for iBooks issue; see #6242 */
@media screen {
.sourceCode {
overflow: visible !important;
white-space: pre-wrap !important;
}
}

@ -1,7 +1,7 @@
$for(parts)$ $for(parts)$
# $title$ # $title$
$for(sections)$ $for(sections)$
$content$ ## $title$ {#$anchor$}
$shifted_raw$
$endfor$ $endfor$
$endfor$ $endfor$
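Where the old template spliced in pre-rendered `$content$`, the new one emits a level-two heading with an explicit anchor, followed by the shifted raw source. For one hypothetical part and section, the expansion looks roughly like this (titles and anchor below are made up for illustration; the real values come from the site metadata):

```shell
# Simulate the template's heading output for a single part with one section.
part_title="Part 1: Logical Foundations"
section_title="Naturals"
anchor="naturals"

printf '# %s\n## %s {#%s}\n' "$part_title" "$section_title" "$anchor"
```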

book/epub.mk (new file, 80 lines)

@@ -0,0 +1,80 @@
#################################################################################
# Configuration
#################################################################################

EPUB_DIR := book
EPUB_TEMPLATE_DIR := $(EPUB_DIR)/templates
EPUB_LUA_DIR := $(EPUB_DIR)/lua
EPUB_LUA_SCRIPTS := $(wildcard $(EPUB_LUA_DIR)/*.lua)
MD_DIR := src
MD_FILES := README.md $(wildcard $(MD_DIR)/plfa/**/*.md)
FRANKENFONT := public/webfonts/DejaVu-mononoki-Symbola-Droid.woff

#################################################################################
# Compile PLFA to an EPUB using Pandoc
#################################################################################

.PHONY: epub epub-build
epub: epub-build
epub-build: $(SITE_DIR)/plfa.epub

$(SITE_DIR)/plfa.epub: \
		$(RAW_DIR)/epub.md $(EPUB_DIR)/epub.css $(RAW_DIR)/epub.xml $(FRANKENFONT) \
		$(MD_FILES) $(EPUB_LUA_SCRIPTS) | setup-install-pandoc
	@echo "Building EPUB"
	@$(PANDOC) \
		--strip-comments \
		--css=$(EPUB_DIR)/epub.css \
		--epub-embed-font=$(FRANKENFONT) \
		--epub-metadata=$(RAW_DIR)/epub.xml \
		--indented-code-class=default \
		--lua-filter=$(EPUB_LUA_DIR)/set-default-code-class.lua -M default-code-class=agda \
		--lua-filter=$(EPUB_LUA_DIR)/remove-badges.lua -M badge-url=https://img.shields.io/badge/ \
		--lua-filter=$(EPUB_LUA_DIR)/epub-clean-html.lua \
		--lua-filter=$(EPUB_LUA_DIR)/single-file-links.lua \
		--lua-filter=$(EPUB_LUA_DIR)/single-file-identifiers.lua \
		--standalone \
		--toc --toc-depth=2 \
		--epub-chapter-level=2 \
		$< -o $@

#################################################################################
# Test EPUB with EPUBCheck
#################################################################################

.PHONY: epub-test
epub-test: $(SITE_DIR)/plfa.epub | setup-check-epubcheck
	@echo "Testing EPUB with EPUBCheck"
	@epubcheck $(SITE_DIR)/plfa.epub

#################################################################################
# Tasks for files that are generated by Hakyll
#################################################################################

$(RAW_DIR)/epub.xml: $(EPUB_DIR)/epub.xml
	@make build

#################################################################################
# Clean up and remove generated EPUB
#################################################################################

.PHONY: epub-clean
epub-clean:
	@echo "Cleaning generated files for EPUB"
	@rm -f $(SITE_DIR)/plfa.epub

#################################################################################
# Setup or check dependencies
#################################################################################

.PHONY: setup-check-epubcheck
setup-check-epubcheck:
ifeq (,$(wildcard $(shell which epubcheck)))
	@echo "The command you called requires EPUBCheck"
	@echo "See: https://github.com/w3c/epubcheck"
endif

@@ -0,0 +1,5 @@
function RawBlock(el)
  -- Transforms '<ul class={something}>' into '<ul>'.
  el.text = el.text:gsub('%s*<%s*ul%s*class=%s*"?[%w-]+"?%s*>%s*', '<ul>')
  return el
end

@@ -0,0 +1,47 @@
local unchecked_files = {}

local function is_checked()
  -- Check if any of our input files is an unchecked file.
  for _, input_file in pairs(PANDOC_STATE.input_files) do
    if unchecked_files[input_file] then
      if #PANDOC_STATE.input_files > 1 then
        error("Cannot handle multiple unchecked input files.")
      else
        return false
      end
    end
  end
  return true
end

local function render_codeblock(cb)
  -- If a code block has class Agda or its class is omitted, format it as...
  --
  --   \begin{agda}\begin{code} .. \end{code}\end{agda}
  --
  -- ...otherwise, format it as...
  --
  --   \begin{pre} .. \end{pre}
  --
  -- Any file which is not checked by Agda must have its code blocks typeset in the latter style.
  -- Specify these files via the UNCHECKED_FILES environment variable.
  if is_checked() then
    if #cb.classes == 0 or cb.classes[1] == 'agda' then
      return pandoc.RawBlock('tex', '\\begin{agda}\n\\begin{code}\n' .. cb.text .. '\n\\end{code}\n\\end{agda}')
    end
  end
  return pandoc.RawBlock('tex', '\\begin{pre}\n' .. cb.text .. '\n\\end{pre}')
end

local function get_unchecked_files(meta)
  if meta['unchecked-files'] then
    for unchecked_file in string.gmatch(pandoc.utils.stringify(meta['unchecked-files']), "([^ ]+)") do
      unchecked_files[unchecked_file] = true
    end
  end
end

return {
  { Meta = get_unchecked_files },
  { CodeBlock = render_codeblock }
}
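The dispatch in `render_codeblock` above can be mimicked outside Pandoc. This shell mock (not the real filter, which runs on Pandoc's AST) shows which LaTeX wrapper each class would select, with `...` standing in for the block's text:

```shell
# Mock of render_codeblock's choice of wrapper: an empty class list or
# an 'agda' class gets the agda/code environments, anything else 'pre'.
wrap() {
  case "$1" in
    ''|agda) printf '\\begin{agda}\\begin{code}...\\end{code}\\end{agda}\n' ;;
    *)       printf '\\begin{pre}...\\end{pre}\n' ;;
  esac
}

wrap agda      # agda-checked block
wrap default   # indented/unchecked block
```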

@@ -0,0 +1,22 @@
local badge_url = nil

local function remove_badges(ln)
  -- Remove link elements if their first child is a badge.
  if #ln.content > 0 and ln.content[1].t == "Image" then
    if string.find(ln.content[1].src, badge_url, 1, true) then
      return pandoc.Null()
    end
  end
end

local function get_badge_url(meta)
  if meta['badge-url'] then
    badge_url = pandoc.utils.stringify(meta['badge-url'])
  end
end

return {
  { Meta = get_badge_url },
  { Link = remove_badges }
}

@@ -8,19 +8,21 @@
 local default_code_classes = {}
 
-function add_default_code_class(el)
+local function add_default_code_class(el)
   if #(el.classes) == 0 then
     el.classes = default_code_classes
     return el
   end
 end
 
-function get_default_code_class(meta)
+local function get_default_code_class(meta)
   if meta['default-code-class'] then
     default_code_classes = {pandoc.utils.stringify(meta['default-code-class'])}
   end
 end
 
-return {{Meta = get_default_code_class},
-        {Code = add_default_code_class},
-        {CodeBlock = add_default_code_class}}
+return {
+  {Meta = get_default_code_class},
+  {Code = add_default_code_class},
+  {CodeBlock = add_default_code_class}
+}

@@ -0,0 +1,31 @@
local chapter = nil

function Header(el)
  if el.level == 2 then
    chapter = el.attr.identifier
  end
  if el.level >= 3 then
    if el.attr.attributes.name then
      el.attr.identifier = chapter .. '-' .. el.attr.attributes.name
      el.attr.attributes.name = nil
    else
      el.attr.identifier = chapter .. '-' .. el.attr.identifier
    end
  end
end

function show(t)
  local str = ''
  str = str .. '{'
  for k, v in pairs(t) do
    str = str .. k
    str = str .. ':'
    if type(v) == 'table' then
      str = str .. show(v)
    else
      str = str .. v
    end
  end
  str = str .. '}'
  return str
end

@@ -0,0 +1,24 @@
-- from https://github.com/plfa/plfa.github.io/blob/dev/epub/rewrite-links.lua
-- Performs the following transformations on Link targets:
--
-- Case 1:
--   [text](/chapter/#more-stuff) -> [text](#chapter-more-stuff)
--
-- Case 2:
--   [text](/chapter/) -> [text](#chapter)
--
-- All other Links are ignored.
function Link(el)
  local n
  -- Case 1:
  el.target, n = el.target:gsub("^/(%w+)/#([%w-]+)$", "#%1-%2")
  -- Case 2:
  if n == 0 then
    el.target, n = el.target:gsub("^/(%w+)/$", "#%1")
  end
  -- If either Case 1 or Case 2, lowercase target:
  if n ~= 0 then
    el.target = string.lower(el.target)
  end
  return el
end
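The two rewrite cases can be spot-checked from the command line. This sed re-expression is only an approximation of the Lua filter (it lowercases every target unconditionally, whereas the filter lowercases only rewritten ones), and the chapter name is an arbitrary example:

```shell
# Approximate the filter's two link-rewrite cases with sed, then lowercase.
rewrite() {
  printf '%s\n' "$1" \
    | sed -E 's#^/([[:alnum:]]+)/\#([[:alnum:]-]+)$#\#\1-\2#; s#^/([[:alnum:]]+)/$#\#\1#' \
    | tr '[:upper:]' '[:lower:]'
}

rewrite '/Naturals/#more-stuff'   # case 1
rewrite '/Naturals/'              # case 2
```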

book/pdf.mk (new file, 164 lines)

@@ -0,0 +1,164 @@
#################################################################################
# Configuration
#################################################################################

PDF_DIR := book
PDF_TEMPLATE_DIR := $(PDF_DIR)/templates
PDF_LUA_DIR := $(PDF_DIR)/lua
MD_DIR := src
LAGDA_TEX_DIR := $(TMP_DIR)/lagda_tex
TEX_DIR := $(TMP_DIR)/tex
FRANKENFONT := $(PDF_DIR)/DejaVu-mononoki-Symbola-Droid.ttf

#################################################################################
# Helper functions for translating paths
#################################################################################

# Convert MD_DIR/%.md to its modified source path
define MD_PATH
$(patsubst $(MD_DIR)/%/acknowledgements.md,$(RAW_DIR)/%/acknowledgements.md,\
	$(patsubst $(PDF_DIR)/pdf.tex,$(RAW_DIR)/pdf.tex,$(1)))
endef

# Convert MD_DIR/%.lagda.md to LAGDA_TEX_DIR/%.lagda.tex
define LAGDA_TEX_PATH
$(patsubst $(MD_DIR)/%.lagda.md,$(LAGDA_TEX_DIR)/%.lagda.tex,$(1))
endef

# Convert MD_DIR/%.md to LAGDA_TEX_DIR/%.lagda.tex or TEX_DIR/%.tex
#
# NOTE: This logic is partially duplicated in hs/Main.hs:addTexPath:texPath.
#
define TEX_PATH
$(patsubst $(RAW_DIR)/%,$(TEX_DIR)/%,\
	$(patsubst $(PDF_DIR)/%,$(TEX_DIR)/%,\
	$(patsubst README.md,$(TEX_DIR)/plfa/frontmatter/README.tex,\
	$(patsubst $(MD_DIR)/%.md,$(TEX_DIR)/%.tex,\
	$(patsubst $(MD_DIR)/%.lagda.md,$(TEX_DIR)/%.tex,$(1))))))
endef
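As a sanity check of TEX_PATH's main case, the same mapping can be sketched with shell parameter expansion (the source path is an assumed example; the Makefile itself does this with nested patsubst calls):

```shell
# Map an assumed literate-Agda source under MD_DIR (= src) to its target
# under TEX_DIR (= _cache/tmp/tex), mirroring TEX_PATH's %.lagda.md case.
md="src/plfa/part1/Naturals.lagda.md"
tex="${md%.lagda.md}.tex"          # drop the .lagda.md suffix, append .tex
tex="_cache/tmp/tex/${tex#src/}"   # re-root from MD_DIR into TEX_DIR
echo "$tex"
```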
#################################################################################
# Lists of source and intermediate files
#################################################################################

PDF_LUA_SCRIPTS := $(wildcard $(PDF_LUA_DIR)/*.lua)
PDF_STATIC_FILES := $(PDF_DIR)/pdf.tex $(FRANKENFONT)
MD_FILES := README.md $(wildcard $(MD_DIR)/plfa/**/*.md)
LAGDA_MD_FILES := $(filter %.lagda.md,$(MD_FILES))
LAGDA_TEX_FILES := $(call LAGDA_TEX_PATH,$(LAGDA_MD_FILES))
TEX_FILES := $(call TEX_PATH,$(MD_FILES) $(RAW_DIR)/pdf.tex $(PDF_STATIC_FILES))

#################################################################################
# Compile PLFA to a PDF using Pandoc and Latexmk
#################################################################################

.PHONY: pdf pdf-build
pdf: pdf-build
pdf-build: $(SITE_DIR)/plfa.pdf

$(SITE_DIR)/plfa.pdf: $(TEX_FILES)
	@echo "Building PDF"
	@cd $(TEX_DIR) && latexmk -pdf -lualatex -use-make -halt-on-error pdf.tex
	@cp $(TEX_DIR)/pdf.pdf $(SITE_DIR)/plfa.pdf

#################################################################################
# Definitions of various compilation tasks
#################################################################################

# Copy static files needed by PDF compilation
define MK_COPYSTATIC_RULE
src := $(1)
dst := $(2)
$$(dst): $$(src)
	@echo "Copy $$< to $$@"
	@mkdir -p '$$(@D)'
	@cp $$< $$@
endef

# Copy static files from PDF_DIR/% to TEX_DIR/%
$(foreach static_file,\
	$(PDF_STATIC_FILES),\
	$(eval $(call MK_COPYSTATIC_RULE,\
		$(call MD_PATH,$(static_file)),\
		$(call TEX_PATH,$(static_file)))))

# Compile Markdown files to LaTeX
#
# NOTE: Passing --indented-code-classes=default sets the class for /indented/ code blocks
#       to 'default', which lets us distinguish them from /fenced/ code blocks without a
#       class. That's important, since fenced code blocks are checked by Agda, but indented
#       code blocks are /not/, so latex-render-codeblocks.lua needs to be able to tell the
#       difference.
#
define MK_MD2TEX_RULE
src := $(1)
dst := $(2)
$$(dst): $$(src) $(PDF_LUA_SCRIPTS) | setup-install-pandoc
	@echo "Compile $$< to $$@"
	@mkdir -p '$$(@D)'
	@$(PANDOC) \
		--top-level-division=chapter \
		--indented-code-classes=default \
		--lua-filter=$(PDF_LUA_DIR)/remove-badges.lua -M badge-url=https://img.shields.io/badge/ \
		--lua-filter=$(PDF_LUA_DIR)/latex-render-codeblocks.lua -M unchecked-files=README.md \
		--lua-filter=$(PDF_LUA_DIR)/single-file-links.lua \
		$$< -o $$@
endef

# Compile .md files (from MD_DIR/%.md to TEX_DIR/%.tex)
$(foreach md_file,\
$(filter-out %.lagda.md,$(MD_FILES)),\
$(eval $(call MK_MD2TEX_RULE,\
$(call MD_PATH,$(md_file)),\
$(call TEX_PATH,$(md_file)))))
# Compile .lagda.md files (from MD_DIR/%.lagda.md to LAGDA_TEX_DIR/%.lagda.tex)
$(foreach lagda_md_file,\
$(LAGDA_MD_FILES),\
$(eval $(call MK_MD2TEX_RULE,\
$(lagda_md_file),\
$(call LAGDA_TEX_PATH,$(lagda_md_file)))))
# Compile Literate Agda files to LaTeX
define MK_LAGDA_MD2TEX_RULE
src := $(1)
dst := $(2)
$$(dst): $$(src) $(LAGDA_TEX_FILES) | setup-install-agda setup-check-latexmk
@$(AGDA) --include-path=$(LAGDA_TEX_DIR) --latex --latex-dir=$(TEX_DIR) $$<
endef
# Compile .lagda.tex files (from LAGDA_TEX_DIR/%.lagda.tex to TEX_DIR/%.tex)
$(foreach lagda_md_file,\
$(LAGDA_MD_FILES),\
$(eval $(call MK_LAGDA_MD2TEX_RULE,\
$(call LAGDA_TEX_PATH,$(lagda_md_file)),\
$(call TEX_PATH,$(lagda_md_file)))))
#################################################################################
# Tasks for files that are generated by Hakyll
#################################################################################
$(RAW_DIR)/pdf.tex: $(PDF_DIR)/pdf.tex $(MD_DIR)/plfa/toc.metadata
@make build
# Generated by Hakyll
$(RAW_DIR)/plfa/backmatter/acknowledgements.md: $(MD_DIR)/plfa/backmatter/acknowledgements.md
@make build
#################################################################################
# Clean up and remove generated PDF
#################################################################################
.PHONY: pdf-clean
pdf-clean:
@echo "Cleaning generated files for PDF"
@rm -rf $(SITE_DIR)/plfa.pdf $(LAGDA_TEX_DIR) $(TEX_DIR)
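The nested `$(patsubst ...)` calls in `TEX_PATH` fall through in order, with the first matching rewrite winning. A rough Python sketch of the same mapping may help; the directory values (`tmp/tex`, `tmp/raw`) are illustrative assumptions, since the real ones come from `TMP_DIR` and `RAW_DIR`:

```python
# Hypothetical sketch of the TEX_PATH helper from pdf.mk. Each rule is tried
# in order, mirroring how the nested $(patsubst ...) calls apply innermost
# first: .lagda.md before .md, then the README special case, then the
# PDF_DIR/RAW_DIR prefix rewrites. Directory names are illustrative.
def tex_path(path, md_dir="src", tex_dir="tmp/tex", raw_dir="tmp/raw", pdf_dir="book"):
    if path == "README.md":
        return f"{tex_dir}/plfa/frontmatter/README.tex"
    # Try .lagda.md before .md, since ".md" also matches ".lagda.md" suffixes.
    for suffix in (".lagda.md", ".md"):
        if path.startswith(md_dir + "/") and path.endswith(suffix):
            stem = path[len(md_dir) + 1 : -len(suffix)]
            return f"{tex_dir}/{stem}.tex"
    # Static and Hakyll-generated files keep their relative path under TEX_DIR.
    for prefix in (raw_dir, pdf_dir):
        if path.startswith(prefix + "/"):
            return f"{tex_dir}/{path[len(prefix) + 1:]}"
    return path
```

For example, a literate Agda chapter maps into the scratch TeX tree, while the static template keeps its name.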

book/pdf.tex (new file, 166 lines)

@ -0,0 +1,166 @@
\documentclass[10pt]{book}
\usepackage{hyperref}
\usepackage[links]{agda}
\usepackage{fontspec}
\setmainfont{DejaVu Sans}
\setsansfont{DejaVu Sans}
\setmonofont{[DejaVu-mononoki-Symbola-Droid.ttf]}
\usepackage{soul}
\usepackage{tcolorbox}
\tcbuselibrary{skins,breakable}
\usepackage{fancyvrb}
\usepackage{xcolor}
\usepackage{tikz}
\usepackage{setspace}
\usepackage{geometry}
\geometry{
a4paper,
total={170mm,257mm},
left=20mm,
top=20mm,
}
% Wrap texttt lines
\sloppy
% Disable section numbering
\setcounter{secnumdepth}{0}
% Set the global text color:
\definecolor{textcolor}{HTML}{111111}
\color{textcolor}
% Change background color for inline code in markdown files.
% The following code does not work well for long text as the
% text will exceed the page boundary.
\definecolor{background-color}{HTML}{EEEEFF}
\let\oldtexttt\texttt%
\renewcommand{\texttt}[1]{\colorbox{background-color}{\oldtexttt{#1}}}
% Box with background colour similar to web version:
\newtcolorbox{agda}[1][]{
frame hidden,
colback=background-color,
spartan,
left=5pt,
boxrule=0pt,
breakable,
}
% Verbatim environment similarly indented to Agda code blocks.
\DefineVerbatimEnvironment{verbatim}{Verbatim}{xleftmargin=0pt}%
% Adding backgrounds to verbatim environments.
\newenvironment{pre}{
\VerbatimEnvironment
\begin{agda}
\begin{verbatim}
}{\end{verbatim}
\end{agda}
}
% Use special font families without TeX ligatures for Agda code.
% Solution inspired by a comment by Enrico Gregorio:
% https://tex.stackexchange.com/a/103078
\newfontfamily{\AgdaSerifFont}{DejaVu-Serif}
\newfontfamily{\AgdaSansSerifFont}{DejaVu-Sans}
\newfontfamily{\AgdaTypewriterFont}{DejaVu-mononoki-Symbola-Droid}
\renewcommand{\AgdaFontStyle}[1]{{\AgdaTypewriterFont{}#1}}
\renewcommand{\AgdaKeywordFontStyle}[1]{{\AgdaTypewriterFont{}#1}}
\renewcommand{\AgdaStringFontStyle}[1]{{\AgdaTypewriterFont{}#1}}
\renewcommand{\AgdaCommentFontStyle}[1]{{\AgdaTypewriterFont{}#1}}
\renewcommand{\AgdaBoundFontStyle}[1]{{\AgdaTypewriterFont{}#1}}
\AgdaNoSpaceAroundCode{}
% Adjust spacing after toc numbering
\usepackage{tocloft}
\setlength\cftchapnumwidth{3em}
\cftsetpnumwidth{4em}
% Style links with colors instead of boxes:
% https://tex.stackexchange.com/q/823
\definecolor{linkcolor}{HTML}{2A7AE2}
\hypersetup{
colorlinks,
linkcolor={linkcolor},
citecolor={linkcolor},
urlcolor={linkcolor}
}
\begin{document}
% Adjust indentation of Agda code blocks
\setlength{\mathindent}{0pt}
\setlength{\parindent}{0em}
\setlength{\parskip}{1em}
% Provide \tightlist environment
% https://tex.stackexchange.com/q/257418
\providecommand{\tightlist}{%
\setlength{\itemsep}{0pt}\setlength{\parskip}{0pt}}
% Based on \titleAM:
% https://ctan.org/pkg/titlepages
\begin{titlepage}
\newlength{\drop}%
\setlength{\drop}{0.12\textheight}%
\centering%
\vspace*{\drop}
\begingroup% Ancient Mariner, AW p. 151
{\large Philip Wadler, Wen Kokke, and Jeremy G. Siek}\\[\baselineskip]
{\Huge PROGRAMMING LANGUAGE}\\[\baselineskip]
{\Huge FOUNDATIONS}\\[\baselineskip]
{\Large IN}\\[\baselineskip]
{\Huge Agda}\\[\drop]
\vfill%
{\small\scshape 2021}\par%
\null\endgroup
\end{titlepage}
$for(parts)$
% Open front, main, and back matter sections:
$if(frontmatter)$
\frontmatter%
\setcounter{tocdepth}{0}
\tableofcontents%
\setcounter{tocdepth}{1}
$endif$
$if(mainmatter)$
\mainmatter%
$endif$
$if(backmatter)$
\appendix
\addcontentsline{toc}{part}{Appendices}
$endif$
% Only print title for main and back matter:
$if(frontmatter)$
% NOTE: Front matter titles are inserted via book/lua/single-file-headers.lua.
$else$
\part{$title$}
$endif$
% Include each section.
$for(sections)$
\hypertarget{$anchor$}{%
\chapter{$title$}\label{$anchor$}}
\input{$tex_path$}
$endfor$
% Close front, main, and back matter sections:
$if(frontmatter)$
$endif$
$if(mainmatter)$
\cleardoublepage%
\phantomsection%
$endif$
$if(backmatter)$
$endif$
$endfor$
\end{document}


@ -982,15 +982,15 @@ Remember to indent all code by two spaces.
 ```
 ext∋ : ∀ {Γ B x y}
   → x ≢ y
-  → ¬ ( ∃[ A ] Γ ∋ x ⦂ A )
+  → ¬ ∃[ A ]( Γ ∋ x ⦂ A )
     -----------------------------
-  → ¬ ( ∃[ A ] Γ , y ⦂ B ∋ x ⦂ A )
+  → ¬ ∃[ A ]( Γ , y ⦂ B ∋ x ⦂ A )
 ext∋ x≢y _ ⟨ A , Z ⟩ = x≢y refl
 ext∋ _ ¬∃ ⟨ A , S _ ⊢x ⟩ = ¬∃ ⟨ A , ⊢x ⟩

 lookup : ∀ (Γ : Context) (x : Id)
-    -----------------------
-  → Dec (∃[ A ](Γ ∋ x ⦂ A))
+    ------------------------
+  → Dec (∃[ A ]( Γ ∋ x ⦂ A ))
 lookup ∅ x = no (λ ())
 lookup (Γ , y ⦂ B) x with x ≟ y
 ... | yes refl = yes ⟨ B , Z ⟩
@ -1006,7 +1006,7 @@ Remember to indent all code by two spaces.
   → Γ ⊢ L ↑ A ⇒ B
   → ¬ Γ ⊢ M ↓ A
     ----------------------------
-  → ¬ ( ∃[ B ] Γ ⊢ L · M ↑ B )
+  → ¬ ∃[ B ]( Γ ⊢ L · M ↑ B )
 ¬arg ⊢L ¬⊢M ⟨ B , ⊢L · ⊢M ⟩ rewrite dom≡ (uniq-↑ ⊢L ⊢L) = ¬⊢M ⊢M

 ¬switch : ∀ {Γ M A B}
@ -1022,12 +1022,12 @@ Remember to indent all code by two spaces.
 ```
 synthesize : ∀ (Γ : Context) (M : Term⁺)
     -----------------------
-  → Dec (∃[ A ](Γ ⊢ M ↑ A))
+  → Dec (∃[ A ]( Γ ⊢ M ↑ A ))
 inherit : ∀ (Γ : Context) (M : Term⁻) (A : Type)
     ---------------
   → Dec (Γ ⊢ M ↓ A)
 synthesize Γ (` x) with lookup Γ x
 ... | no ¬∃ = no (λ{ ⟨ A , ⊢` ∋x ⟩ → ¬∃ ⟨ A , ∋x ⟩ })


@ -589,9 +589,9 @@ module Problem3 where
 ```
 ext∋ : ∀ {Γ B x y}
   → x ≢ y
-  → ¬ ( ∃[ A ] Γ ∋ x ⦂ A )
+  → ¬ ∃[ A ]( Γ ∋ x ⦂ A )
     -----------------------------
-  → ¬ ( ∃[ A ] Γ , y ⦂ B ∋ x ⦂ A )
+  → ¬ ∃[ A ]( Γ , y ⦂ B ∋ x ⦂ A )
 ext∋ x≢y _ ⟨ A , Z ⟩ = x≢y refl
 ext∋ _ ¬∃ ⟨ A , S _ ⊢x ⟩ = ¬∃ ⟨ A , ⊢x ⟩
@ -614,7 +614,7 @@ module Problem3 where
   → Γ ⊢ L ↑ A ⇒ B
   → ¬ Γ ⊢ M ↓ A
     ----------------------------
-  → ¬ ( ∃[ B ] Γ ⊢ L · M ↑ B )
+  → ¬ ∃[ B ]( Γ ⊢ L · M ↑ B )
 ¬arg ⊢L ¬⊢M ⟨ B , ⊢L · ⊢M ⟩ rewrite dom≡ (uniq-↑ ⊢L ⊢L) = ¬⊢M ⊢M

 ¬switch : ∀ {Γ M A B}
@ -631,7 +631,7 @@ module Problem3 where
 ```
 synthesize : ∀ (Γ : Context) (M : Term⁺)
     -----------------------
-  → Dec (∃[ A ](Γ ⊢ M ↑ A))
+  → Dec (∃[ A ]( Γ ⊢ M ↑ A ))
 inherit : ∀ (Γ : Context) (M : Term⁻) (A : Type)
     ---------------
   → Dec (Γ ⊢ M ↓ A)


@ -1,42 +0,0 @@
/* This is Pandoc's default ebook CSS, modified slightly. */
/* https://github.com/jgm/pandoc/blob/master/data/epub.css */
@font-face {
font-family: 'mononoki';
font-weight: normal;
font-style: normal;
src: url('../webfonts/mononoki.woff');
}
@font-face {
font-family: 'FreeMono';
font-weight: normal;
font-style: normal;
src: url('../webfonts/FreeMono.woff');
}
@font-face {
font-family: 'DejaVuSansMono';
font-weight: normal;
font-style: normal;
src: url('../webfonts/DejaVuSansMono.woff');
}
body { margin: 5%; text-align: justify; font-size: medium; }
code { font-family: 'mononoki', 'FreeMono', 'DejaVuSansMono', monospace; }
h1, h2, h3, h4, h5, h6 { text-align: left; }
nav#toc ol, nav#landmarks ol { padding: 0; margin-left: 1em; }
nav#toc ol li, nav#landmarks ol li { list-style-type: none; margin: 0; padding: 0; }
a.footnote-ref { vertical-align: super; }
em, em em em, em em em em em { font-style: italic;}
em em, em em em em { font-style: normal; }
code{ white-space: pre-wrap; }
span.smallcaps{ font-variant: small-caps; }
span.underline{ text-decoration: underline; }
q { quotes: "“" "”" "" ""; }
div.column{ display: inline-block; vertical-align: top; width: 50%; }
div.hanging-indent{margin-left: 1.5em; text-indent: -1.5em;}
/* Workaround for iBooks issue; see #6242 */
@media screen {
.sourceCode {
overflow: visible !important;
white-space: pre-wrap !important;
}
}


@ -1,3 +1,8 @@
+@font-face{
+  font-family: DejaVu mononoki Symbola Droid;
+  src:
+    url('../webfonts/DejaVu-mononoki-Symbola-Droid.woff') format('woff')
+}
 @font-face {
   font-family: 'mononoki';
   src: url('../webfonts/mononoki.woff2') format('woff2'),
@ -19,7 +24,7 @@
 }
 @mixin code-font {
-  font-family: 'mononoki', 'DejaVu Sans Mono', 'Source Code Pro', 'Bitstream Vera Sans Mono', 'FreeMono', 'Courier New', 'Monaco', 'Menlo', monospace, serif;
+  font-family: 'DejaVu mononoki Symbola Droid', 'mononoki', 'DejaVu Sans Mono', 'Source Code Pro', 'Bitstream Vera Sans Mono', 'FreeMono', 'Courier New', 'Monaco', 'Menlo', monospace, serif;
   font-size: .85em;
 }
 @mixin code-container {


@ -1,61 +0,0 @@
--- include-files.lua filter to include Markdown files
---
--- Copyright: © 2019-2020 Albert Krewinkel
--- Copyright: © 2020 Michael Reed
--- License: MIT - see LICENSE file for details
---
--- Created by Albert Krewinkel. Slightly modified by Michael Reed for use in
--- generating the EPUB for "Programming Language Foundations in Agda".
---
--- For documentation, see: https://github.com/pandoc/lua-filters/tree/master/include-files
-- pandoc's List type
local List = require 'pandoc.List'
--- Shift headings in block list by given number
function shift_headings(blocks, shift_by)
if not shift_by then
return blocks
end
local shift_headings_filter = {
Header = function (header)
header.level = header.level + shift_by
return header
end
}
return pandoc.walk_block(pandoc.Div(blocks), shift_headings_filter).content
end
--- Filter function for code blocks
function CodeBlock(cb)
-- ignore code blocks which are not of class "include".
if not cb.classes:includes 'include' then
return
end
-- Markdown is used if this is nil.
local format = cb.attributes['format']
local shift_heading_level_by =
tonumber(cb.attributes['shift-heading-level-by'])
local blocks = List:new()
for line in cb.text:gmatch('[^\n]+') do
if line:sub(1,2) ~= '//' then
-- Read in the document at the file path specified by `line`.
local fh = io.open(line)
local document = pandoc.read(fh:read '*a', format)
-- Before shifting headings, add a title heading at the beginning of the chapter.
local heading = pandoc.Header(1, pandoc.Str(pandoc.utils.stringify(document.meta.title)))
document.blocks:insert(1, heading)
-- Shift all headings by the user-specified amount, which is 0 by default.
local chapter = shift_headings(document.blocks, shift_heading_level_by)
-- Concatenate the chapter blocks (discarding the metadata) to the current document.
blocks:extend(chapter)
fh:close()
end
end
return blocks
end
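The heading-shift step of the removed filter is straightforward to mimic outside pandoc. A rough Python sketch, where blocks are modelled as plain dicts with a `"level"` key for headers (an illustrative assumption, not pandoc's actual AST):

```python
# Sketch of the shift_headings logic from the removed include-files.lua:
# bump every header's level by shift_by, leaving all other blocks untouched.
# A falsy shift_by means "no shift", matching the Lua early return.
def shift_headings(blocks, shift_by=None):
    if not shift_by:
        return blocks
    return [
        {**b, "level": b["level"] + shift_by} if b.get("t") == "Header" else b
        for b in blocks
    ]
```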


@ -1,19 +0,0 @@
#!/usr/bin/env ruby
require 'yaml'
require 'liquid'
unless ARGV.length == 3
abort "Usage: render-liquid-template.rb [yaml_file] [markdown_in_file] [markdown_out_file]"
end
yaml_file, markdown_file_in, markdown_file_out = ARGV
[yaml_file, markdown_file_in].each do |file|
unless File.file? file
abort "File does not exist: '%s'" % [file]
end
end
metadata = { 'site' => (YAML.load (File.read yaml_file)) }
template = Liquid::Template.parse (File.read markdown_file_in)
File.write markdown_file_out, (template.render metadata)


@ -1,5 +0,0 @@
-- Transforms '<ul class={something}>' into '<ul>'.
function RawBlock (el)
el.text = el.text:gsub('%s*<%s*ul%s*class=%s*"?[%w-]+"?%s*>%s*', '<ul>')
return el
end


@ -1,19 +0,0 @@
-- Performs the following transformations on Link targets:
--
-- Case 1:
-- [text](/chapter/#more-stuff) -> [text](#chapter-more-stuff)
--
-- Case 2:
-- [text](/chapter/) -> [text](#chapter)
--
-- All other Links are ignored.
function Link (el)
local n
-- Case 1:
el.target, n = el.target:gsub("^/(%w+)/#([%w-]+)$", "#%1-%2")
-- Case 2:
if n == 0 then
el.target = el.target:gsub("^/(%w+)/$", "#%1")
end
return el
end
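The two rewrite cases are easy to reproduce for testing; here is a small Python sketch of the same transformation, with the regexes translated from the Lua patterns (`\w` is slightly broader than Lua's `%w`, since it also matches underscores):

```python
import re

# Python translation of the Lua Link filter's two cases:
#   Case 1: /chapter/#more-stuff -> #chapter-more-stuff
#   Case 2: /chapter/            -> #chapter
# All other targets pass through unchanged, as in the original.
def rewrite_target(target):
    new, n = re.subn(r"^/(\w+)/#([\w-]+)$", r"#\1-\2", target)
    if n == 0:
        new = re.sub(r"^/(\w+)/$", r"#\1", new)
    return new
```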


@ -5,6 +5,7 @@
 module Hakyll.Web.Agda
   ( agdaCompilerWith
   , agdaVerbosityQuiet
+  , compileAgdaWith
   , CommandLineOptions(..)
   , PragmaOptions(..)
   , defaultAgdaOptions
@ -50,8 +51,12 @@ defaultAgdaPragmaOptions = defaultPragmaOptions
 -- |Compile literate Agda to HTML
 agdaCompilerWith :: CommandLineOptions -> Compiler (Item String)
-agdaCompilerWith agdaOptions = cached "Hakyll.Web.Agda.agdaCompilerWith" $ do
-  item <- getResourceBody
+agdaCompilerWith agdaOptions =
+  getResourceBody >>= compileAgdaWith agdaOptions
+
+-- |Compile literate Agda to HTML
+compileAgdaWith :: CommandLineOptions -> Item String -> Compiler (Item String)
+compileAgdaWith agdaOptions item = cached "Hakyll.Web.Agda.agdaCompilerWith" $ do
   let agdaPath = toFilePath (itemIdentifier item)
   let moduleName = agdaModule (itemBody item)
   TmpFile tmpPath <- newTmpFile ".lock"
@ -120,7 +125,7 @@ readStdlibVersion stdlibPath = do
   changelog <- T.readFile changelogPath
   let versionLine = head (T.lines changelog)
   case T.stripPrefix "Version " versionLine of
-    Just versionStr -> return . T.unpack $ "v" <> T.strip versionStr
+    Just versionStr -> return $ T.unpack ("v" <> T.strip versionStr)
     Nothing -> error $ printf "Could not read version from '%s'" changelogPath

 -- |Fix references to the Agda standard library.


@ -3,7 +3,6 @@
 module Hakyll.Web.Routes.Permalink
   ( convertPermalink
   , permalinkRoute
-  , permalinkRouteWithDefault
   , stripIndexFile
   ) where
@ -19,11 +18,12 @@ convertPermalink :: FilePath -> FilePath
 convertPermalink link = fromMaybe link (stripPrefix "/" link) </> "index.html"

 permalinkRoute :: Routes
-permalinkRoute = permalinkRouteWithDefault (error "Missing field 'permalink'.")
-
-permalinkRouteWithDefault :: Routes -> Routes
-permalinkRouteWithDefault def = metadataRoute $ \metadata ->
-  maybe def (constRoute . convertPermalink) (lookupString "permalink" metadata)
+permalinkRoute = metadataRoute $ \metadata ->
+  case lookupString "permalink" metadata of
+    Nothing ->
+      customRoute (\identifier -> error $ "missing field 'permalink' in metadata " <> toFilePath identifier)
+    Just permalink ->
+      constRoute (convertPermalink permalink)

 -- Removes "index.html" from URLs.
 stripIndexFile :: String -> String


@ -0,0 +1,33 @@
{-# LANGUAGE RankNTypes #-}
module Hakyll.Web.Template.Context.Derived where
import Hakyll
import Text.Printf
addDerivedField
:: String
-> (forall b. Context b -> [String] -> Item b -> Compiler ContextField)
-> Context a
-> Context a
addDerivedField key derive ctx = Context $ \k a i ->
if k == key then
derive ctx a i
else do
fld <- unContext ctx k a i
return $
case fld of
-- If the field is a list, recursively derive the field for any list items.
ListField itemCtx items -> ListField (addDerivedField key derive itemCtx) items
-- Otherwise, simply return the field.
otherFld -> otherFld
-- Retrieve a String from the context
getString :: String -> Context a -> [String] -> Item a -> Compiler String
getString key ctx a i = do
fld <- unContext ctx key a i
case fld of
StringField str -> return str
_ -> fail $ printf "Key '%s' does not return a String" key
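The `addDerivedField` helper above intercepts one key and computes it from the rest of the context. A loose Python analogue, modelling a context as a plain key-to-value function (an illustrative simplification: the recursive handling of list fields is omitted):

```python
# Loose analogue of addDerivedField: a "context" here is just a function from
# key to value; the wrapper answers the derived key itself and delegates every
# other lookup to the wrapped context.
def add_derived_field(key, derive, ctx):
    def wrapped(k):
        if k == key:
            # Derive the new field from the existing context.
            return derive(ctx)
        return ctx(k)
    return wrapped
```

For instance, deriving a field `b` as one more than the existing field `a` leaves `a` untouched while making `b` available.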


@ -1,53 +1,85 @@
-module Hakyll.Web.Template.Context.Metadata where
+{-# LANGUAGE OverloadedStrings #-}
+module Hakyll.Web.Template.Context.Metadata
+  ( includeContext
+  , metadataContext
+  , objectContext
+  , restoreMetadata
+  ) where

 import Control.Monad ((<=<))
 import Data.Aeson (Object, Value(..))
+import qualified Data.ByteString.Char8 as BS
 import qualified Data.HashMap.Strict as H
 import qualified Data.Text as T
 import qualified Data.Vector as V
 import Hakyll
 import Text.Printf (printf)
+import qualified Data.Yaml as Y

--- |Create a Context based on a JSON Object.
-objectContext :: Context String -> Context Object
-objectContext ctx = Context $ \k _ i ->
-  let o = itemBody i in
-  case H.lookup (T.pack k) o of
-    Just v -> decodeValue ctx v
-    Nothing -> fail $ printf "Key '%s' undefined in context '%s'" k (show o)
-
-isObject :: Value -> Bool
-isObject (Object _) = True
-isObject _ = False
+-- |Create a |Context| from the |Metadata| associated with an |Item|.
+metadataContext
+  :: Context String -- ^ |Context| used when unfolding a JSON |Array| into a |ListField|.
+  -> Context String
+metadataContext ctx = Context $ \k a i -> do
+  let identifier = itemIdentifier i
+  metadata <- getMetadata identifier
+  item <- makeItem metadata
+  unContext (objectContext ctx) k a item

-isString :: Value -> Bool
-isString (String _) = True
-isString _ = False
+-- |Create a |Context| from a JSON |Object| which loads data from files under "include" keys.
+includeContext
+  :: Context String -- ^ |Context| used when unfolding a JSON |Array| into a |ListField|.
+  -> Context Object
+includeContext ctx = Context $ \k a i -> do
+  let o = itemBody i
+  v <- lookupObject "include" o
+  identifier <- fromFilePath <$> toString v
+  unContext (ctx <> metadataContext ctx) k a =<< load identifier
+
+-- |Create a |Context| from a JSON |Object|.
+objectContext
+  :: Context String -- ^ |Context| used when unfolding a JSON |Array| into a |ListField|.
+  -> Context Object
+objectContext ctx = Context $ \k _ i -> do
+  let o = itemBody i
+  decodeValue ctx =<< lookupObject k o
+
+-- |Decode a JSON Value to a context field.
+decodeValue :: Context String -> Value -> Compiler ContextField
+decodeValue ctx (Array a) = do
+  objs <- mapM (makeItem <=< toObject) (V.toList a)
+  return $ ListField (includeContext ctx <> objectContext ctx) objs
+decodeValue _ctx (String s) = return . StringField $ T.unpack s
+decodeValue _ctx (Number n) = return . StringField $ show n
+decodeValue _ctx (Bool b) = return . StringField $ show b
+decodeValue _ctx v = fail $ printf "Unsupported value '%s'" (show v)
+
+-- |Lookup the |Value| stored in an |Object| at the given key.
+lookupObject :: MonadFail m => String -> Object -> m Value
+lookupObject k o = maybe ifNotFound ifFound (H.lookup (T.pack k) o)
+  where
+    ifFound = return
+    ifNotFound = fail $ printf "Key '%s' undefined in context '%s'" k (show o)

+-- |Convert a |Value| to an |Object|, or fail.
 toObject :: MonadFail m => Value -> m Object
 toObject (Object o) = return o
 toObject v = fail $ printf "Not an object '%s'" (show v)

+-- |Convert a |Value| to an |String|, or fail.
 toString :: MonadFail m => Value -> m String
 toString (String s) = return (T.unpack s)
 toString v = fail $ printf "Not a string '%s'" (show v)

--- |Decode a JSON Value to a context field.
-decodeValue :: Context String -> Value -> Compiler ContextField
-decodeValue ctx (Array a)
-  | V.all isObject a = do
-      objs <- mapM (makeItem <=< toObject) (V.toList a)
-      return $ ListField (objectContext ctx) objs
-  | V.all isString a = do
-      items <- mapM (load . fromFilePath <=< toString) (V.toList a)
-      return $ ListField ctx items
-decodeValue _ (String s) = return . StringField $ T.unpack s
-decodeValue _ (Number n) = return . StringField $ show n
-decodeValue _ (Bool b) = return . StringField $ show b
-decodeValue _ v = fail $ printf "Unsupported value '%s'" (show v)
-
--- |Create a Context based on the Metadata.
-metadataContext :: Context String -> Context String
-metadataContext ctx = Context $ \k a i ->
-  unContext (objectContext ctx) k a <=< makeItem <=< getMetadata $ itemIdentifier i
+-- |Add the original |Metadata| block back to the file.
+restoreMetadata :: Item String -> Compiler (Item String)
+restoreMetadata item = do
+  metadata <- getMetadata (itemIdentifier item)
+  if H.null metadata then
+    return item
+  else do
+    let yaml = "---\n" <> BS.unpack (Y.encode metadata) <> "---\n\n"
+    withItemBody (\body -> return (yaml <> body)) item
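The `restoreMetadata` step simply re-attaches the YAML block in front of the body. The idea in Python, using a hand-rolled `key: value` dump as an illustrative stand-in for `Data.Yaml.encode`:

```python
# Sketch of restoreMetadata: prepend the item's metadata as a YAML front
# matter block, or return the body unchanged when there is no metadata.
# The simplistic one-line-per-key serialisation is an assumption; the real
# code delegates encoding to Data.Yaml.
def restore_metadata(metadata, body):
    if not metadata:
        return body
    yaml = "".join(f"{k}: {v}\n" for k, v in metadata.items())
    return "---\n" + yaml + "---\n\n" + body
```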


@ -1,22 +1,26 @@
{-# LANGUAGE OverloadedStrings #-} {-# LANGUAGE OverloadedStrings #-}
import Control.Monad ((<=<), forM_) import Control.Monad ((<=<), (>=>), forM_)
import qualified Data.ByteString.Lazy as BL import Data.Char (toLower)
import Data.Functor ((<&>)) import Data.Functor ((<&>))
import Data.List (isPrefixOf, stripPrefix)
import qualified Data.Text as T import qualified Data.Text as T
import qualified Data.Text.ICU as RE
import qualified Data.Text.ICU.Replace as T
import Hakyll import Hakyll
import Hakyll.Web.Agda import Hakyll.Web.Agda
import Hakyll.Web.Routes.Permalink import Hakyll.Web.Routes.Permalink
import Hakyll.Web.Template.Numeric import Hakyll.Web.Template.Numeric
import Hakyll.Web.Template.Context.Derived
import Hakyll.Web.Template.Context.Metadata import Hakyll.Web.Template.Context.Metadata
import Hakyll.Web.Template.Context.Title import Hakyll.Web.Template.Context.Title
import Hakyll.Web.Sass import Hakyll.Web.Sass
import System.FilePath ((</>), takeDirectory) import System.FilePath ((</>), takeDirectory, replaceExtensions, splitPath, joinPath)
import qualified Text.CSL as CSL import qualified Text.CSL as CSL
import qualified Text.CSL.Pandoc as CSL (processCites) import qualified Text.CSL.Pandoc as CSL (processCites)
import Text.Pandoc (Pandoc(..), ReaderOptions(..), WriterOptions(..), Extension(..)) import Text.Pandoc (Pandoc(..), ReaderOptions(..), WriterOptions(..), Extension(..))
import qualified Text.Pandoc as Pandoc import qualified Text.Pandoc as Pandoc
import qualified Text.Pandoc.Filter as Pandoc (Filter(..), applyFilters) import Text.Printf
-------------------------------------------------------------------------------- --------------------------------------------------------------------------------
-- Configuration -- Configuration
@ -65,20 +69,22 @@ siteContext = mconcat
, "authors/jsiek.metadata" , "authors/jsiek.metadata"
] ]
, constField "google_analytics" "UA-125055580-1" , constField "google_analytics" "UA-125055580-1"
, defaultContext , addAnchor defaultContext
] ]
siteSectionContext :: Context String siteSectionContext :: Context String
siteSectionContext = mconcat siteSectionContext = mconcat
[ titlerunningField [ titlerunningField
, subtitleField , subtitleField
, addShiftedBody "raw" (contentField "raw" "raw")
, siteContext , siteContext
] ]
tableOfContentsContext :: Context String -> Context String tableOfContentsContext :: Context String -> Context String
tableOfContentsContext ctx = Context $ \k a _ -> do tableOfContentsContext ctx = Context $ \k a _ ->
m <- makeItem <=< getMetadata $ "src/plfa/toc.metadata" unContext (objectContext ctx) k a
unContext (objectContext ctx) k a m =<< makeItem
=<< getMetadata "src/plfa/toc.metadata"
acknowledgementsContext :: Context String acknowledgementsContext :: Context String
acknowledgementsContext = mconcat acknowledgementsContext = mconcat
@ -124,29 +130,6 @@ sassOptions = defaultSassOptions
{ sassIncludePaths = Just ["css"] { sassIncludePaths = Just ["css"]
} }
epubSectionContext :: Context String
epubSectionContext = mconcat
[ contentField "content" "content"
, titlerunningField
, subtitleField
]
epubReaderOptions :: ReaderOptions
epubReaderOptions = siteReaderOptions
{ readerStandalone = True
, readerStripComments = True
}
epubWriterOptions :: WriterOptions
epubWriterOptions = siteWriterOptions
{ writerTableOfContents = True
, writerTOCDepth = 2
, writerEpubFonts = [ "public/webfonts/DejaVuSansMono.woff"
, "public/webfonts/FreeMono.woff"
, "public/webfonts/mononoki.woff"
]
, writerEpubChapterLevel = 2
}
-------------------------------------------------------------------------------- --------------------------------------------------------------------------------
-- Build site -- Build site
@ -164,32 +147,29 @@ main = do
-- Build function to fix local URLs -- Build function to fix local URLs
fixLocalLink <- mkFixLocalLink "src" fixLocalLink <- mkFixLocalLink "src"
-- Build compiler for Markdown pages let maybeCompileAgda
let pageCompiler :: Compiler (Item String) :: Maybe CommandLineOptions -- ^ If this argument is passed, Agda compilation is used.
pageCompiler = do -> Item String
-> Compiler (Item String)
maybeCompileAgda Nothing = return
maybeCompileAgda (Just opts) =
compileAgdaWith opts >=>
withItemBody (return . withUrls fixStdlibLink) >=>
withItemBody (return . withUrls fixLocalLink)
-- Build compiler for Markdown pages with optional Literate Agda
let pageCompiler
:: Maybe CommandLineOptions -- ^ If this argument is passed, Agda compilation is used.
-> Compiler (Item String)
pageCompiler maybeOpts = do
csl <- load cslFileName csl <- load cslFileName
bib <- load bibFileName bib <- load bibFileName
getResourceBody getResourceBody
>>= saveSnapshot "raw"
>>= maybeCompileAgda maybeOpts
>>= readMarkdownWith siteReaderOptions >>= readMarkdownWith siteReaderOptions
>>= processCites csl bib >>= processCites csl bib
<&> writeHTML5With siteWriterOptions <&> writeHTML5With siteWriterOptions
>>= saveSnapshot "content"
>>= loadAndApplyTemplate "templates/page.html" siteSectionContext
>>= loadAndApplyTemplate "templates/default.html" siteSectionContext
>>= prettifyUrls
-- Build compiler for literate Agda pages
let pageWithAgdaCompiler :: CommandLineOptions -> Compiler (Item String)
pageWithAgdaCompiler opts = do
csl <- load cslFileName
bib <- load bibFileName
agdaCompilerWith opts
>>= withItemBody (return . withUrls fixStdlibLink)
>>= withItemBody (return . withUrls fixLocalLink)
>>= readMarkdownWith siteReaderOptions
>>= processCites csl bib
<&> writeHTML5With siteWriterOptions
>>= saveSnapshot "content"
>>= loadAndApplyTemplate "templates/page.html" siteSectionContext >>= loadAndApplyTemplate "templates/page.html" siteSectionContext
>>= loadAndApplyTemplate "templates/default.html" siteSectionContext >>= loadAndApplyTemplate "templates/default.html" siteSectionContext
>>= prettifyUrls >>= prettifyUrls
@ -198,40 +178,14 @@ main = do
-- --
-- NOTE: The order of the various match expressions is important: -- NOTE: The order of the various match expressions is important:
-- Special-case compilation instructions for files such as -- Special-case compilation instructions for files such as
-- "src/plfa/epub.md" and "src/plfa/index.md" would be overwritten -- "src/plfa/index.md" would be overwritten by the general
-- by the general purpose compilers for "src/**.md", which would -- purpose compilers for "src/**.md", which would
-- cause them to render incorrectly. It is possible to explicitly -- cause them to render incorrectly. It is possible to explicitly
-- exclude such files using `complement` patterns, but this vastly -- exclude such files using `complement` patterns, but this vastly
-- complicates the match patterns. -- complicates the match patterns.
-- --
hakyll $ do hakyll $ do
-- Compile EPUB
match "src/plfa/epub.md" $ do
route $ constRoute "plfa.epub"
compile $ do
epubTemplate <- load "templates/epub.html"
>>= compilePandocTemplate
epubMetadata <- load "src/plfa/epub.xml"
let ropt = epubReaderOptions
let wopt = epubWriterOptions
{ writerTemplate = Just . itemBody $ epubTemplate
, writerEpubMetadata = Just . T.pack . itemBody $ epubMetadata
}
getResourceBody
>>= applyAsTemplate (tableOfContentsContext epubSectionContext)
>>= readPandocWith ropt
>>= applyPandocFilters ropt [] "epub3"
>>= writeEPUB3With wopt
match "templates/epub.html" $
compile $ getResourceBody
>>= applyAsTemplate siteContext
match "src/plfa/epub.xml" $
compile $ getResourceBody
>>= applyAsTemplate siteContext
-- Compile Table of Contents
match "src/plfa/index.md" $ do
route permalinkRoute
@@ -251,9 +205,9 @@ main = do
route permalinkRoute
compile $ getResourceBody
>>= applyAsTemplate acknowledgementsContext
>>= saveSnapshot "raw"
>>= readMarkdownWith siteReaderOptions
<&> writeHTML5With siteWriterOptions
>>= saveSnapshot "content"
>>= loadAndApplyTemplate "templates/page.html" siteContext
>>= loadAndApplyTemplate "templates/default.html" siteContext
>>= prettifyUrls
@@ -282,7 +236,7 @@ main = do
>>= readMarkdownWith siteReaderOptions
>>= processCites csl bib
<&> writeHTML5With siteWriterOptions
>>= saveSnapshot "content" -- used for teaser
>>= loadAndApplyTemplate "templates/post.html" postContext
>>= loadAndApplyTemplate "templates/default.html" siteContext
>>= prettifyUrls
@@ -290,12 +244,12 @@ main = do
-- Compile sections using literate Agda
match "src/**.lagda.md" $ do
route permalinkRoute
compile $ pageCompiler (Just agdaOptions)
-- Compile other sections and pages
match ("README.md" .||. "src/**.md") $ do
route permalinkRoute
compile (pageCompiler Nothing)
-- Compile course pages
match "courses/**.lagda.md" $ do
@@ -305,11 +259,11 @@ main = do
let courseOptions = agdaOptions
{ optIncludePaths = courseDir : optIncludePaths agdaOptions
}
pageCompiler (Just courseOptions)
match "courses/**.md" $ do
route permalinkRoute
compile $ pageCompiler Nothing
match "courses/**.pdf" $ do
route idRoute
@@ -344,9 +298,10 @@ main = do
create ["public/css/style.css"] $ do
route idRoute
compile $ do
csses <- loadAll ("css/*.css" .||. "css/*.scss")
makeItem $ unlines $ map itemBody csses
-- Copy versions
let versions = ["19.08", "20.07"]
forM_ versions $ \v -> do
@@ -363,6 +318,35 @@ main = do
compile copyFileCompiler
-- Raw versions used from Makefile for PDF and EPUB construction
-- Compile raw version of acknowledgements used in constructing the PDF
match "src/plfa/backmatter/acknowledgements.md" $ version "raw" $ do
route $ gsubRoute "src/" (const "raw/")
compile $ getResourceBody
>>= applyAsTemplate acknowledgementsContext
>>= restoreMetadata
-- Compile raw version of index used in constructing the PDF
match "book/pdf.tex" $ version "raw" $ do
route $ gsubRoute "book/" (const "raw/")
compile $ getResourceBody
>>= applyAsTemplate (addTexPath (tableOfContentsContext siteSectionContext))
-- Compile raw version of index used in constructing the EPUB
match "book/epub.md" $ version "raw" $ do
route $ gsubRoute "book/" (const "raw/")
compile $ getResourceBody
>>= applyAsTemplate (tableOfContentsContext siteSectionContext)
>>= loadAndApplyTemplate "templates/metadata.md" siteContext
-- Compile metadata XML used in constructing the EPUB
match "book/epub.xml" $ version "raw" $ do
route $ gsubRoute "book/" (const "raw/")
compile $ getResourceBody
>>= applyAsTemplate siteContext
--------------------------------------------------------------------------------
-- Custom readers and writers
--------------------------------------------------------------------------------
@@ -390,7 +374,7 @@ processCites csl bib item = do
withItemBody (return . CSL.processCites style refs) item
-- | Write a document as HTML using Pandoc, with the supplied options.
writeHTML5With :: WriterOptions -- ^ Writer options for Pandoc
-> Item Pandoc -- ^ Document to write
-> Item String -- ^ Resulting HTML
writeHTML5With wopt (Item itemi doc) =
@@ -398,27 +382,6 @@ writeHTML5With wopt (Item itemi doc) =
Left err -> error $ "Hakyll.Web.Pandoc.writePandocWith: " ++ show err
Right item' -> Item itemi $ T.unpack item'
-- | Write a document as EPUB3 using Pandoc, with the supplied options.
writeEPUB3With :: WriterOptions -> Item Pandoc -> Compiler (Item BL.ByteString)
writeEPUB3With wopt (Item itemi doc) =
return $ case Pandoc.runPure $ Pandoc.writeEPUB3 wopt doc of
Left err -> error $ "Hakyll.Web.Pandoc.writeEPUB3With: " ++ show err
Right doc' -> Item itemi doc'
-- | Apply a filter to a Pandoc document.
applyPandocFilters :: ReaderOptions -> [Pandoc.Filter] -> String -> Item Pandoc -> Compiler (Item Pandoc)
applyPandocFilters ropt filters fmt = withItemBody $
unsafeCompiler . Pandoc.runIOorExplode . Pandoc.applyFilters ropt filters [fmt]
-- | Compile a Pandoc template (as opposed to a Hakyll template).
compilePandocTemplate :: Item String -> Compiler (Item (Pandoc.Template T.Text))
compilePandocTemplate i = do
let templatePath = toFilePath $ itemIdentifier i
let templateBody = T.pack $ itemBody i
templateOrError <- unsafeCompiler $ Pandoc.compileTemplate templatePath templateBody
template <- either fail return templateOrError
makeItem template
--------------------------------------------------------------------------------
-- Supply snapshot as a field to the template
@@ -428,9 +391,83 @@ contentField :: String -> Snapshot -> Context String
contentField key snapshot = field key $ \item ->
itemBody <$> loadSnapshot (itemIdentifier item) snapshot
--------------------------------------------------------------------------------
-- Relativise URLs and strip "index.html" suffixes
--------------------------------------------------------------------------------
prettifyUrls :: Item String -> Compiler (Item String)
prettifyUrls = relativizeUrls <=< withItemBody (return . stripIndexFile)
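`stripIndexFile` is defined elsewhere in the repository; as a rough sketch of the idea only (a hypothetical helper, not the repository's actual definition), stripping a trailing `index.html` from a single URL might look like:

```haskell
import Data.List (isSuffixOf)

-- Hypothetical sketch: drop a trailing "index.html" from a URL, so that
-- "/part1/Naturals/index.html" becomes "/part1/Naturals/". The real
-- stripIndexFile applies such a rewrite to every URL in an HTML body.
stripIndexUrl :: String -> String
stripIndexUrl url
  | "/index.html" `isSuffixOf` url = take (length url - length "index.html") url
  | otherwise                      = url

main :: IO ()
main = putStrLn (stripIndexUrl "/part1/Naturals/index.html")
```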
--------------------------------------------------------------------------------
-- Text wrangling for EPUB and PDF
--------------------------------------------------------------------------------
-- Convert MD_DIR/%.md to LAGDA_TEX_DIR/%.lagda.tex or TEX_DIR/%.tex
--
-- NOTE: This logic is partially duplicated in book/pdf.mk:TEX_PATH.
--
-- NOTE: This function assumes pdf.tex will be at TEX_DIR/.
--
addTexPath :: Context a -> Context a
addTexPath = addDerivedField "tex_path" deriveTexPath
where
deriveTexPath :: Context a -> [String] -> Item a -> Compiler ContextField
deriveTexPath ctx a i = do
includePath <- getString "include" ctx a i
return $ StringField (texPath includePath)
texPath :: FilePath -> FilePath
texPath fnDotMd
| fnDotMd == "README.md" = "plfa/frontmatter/README.tex"
| any (`isPrefixOf` fnDotMd) ["src/", "book/"] = dropTopDirectory (replaceExtensions fnDotMd ".tex")
| otherwise = error ("texPath: cannot map " <> fnDotMd)
dropTopDirectory :: FilePath -> FilePath
dropTopDirectory = joinPath . tail . splitPath
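The mapping above can be checked in isolation. This standalone sketch reproduces `texPath` and `dropTopDirectory` using only `System.FilePath`, outside the surrounding Hakyll context:

```haskell
import Data.List (isPrefixOf)
import System.FilePath (joinPath, replaceExtensions, splitPath)

-- Standalone copy of the mapping above: MD_DIR/%.md to TEX_DIR/%.tex.
texPath :: FilePath -> FilePath
texPath fnDotMd
  | fnDotMd == "README.md" = "plfa/frontmatter/README.tex"
  | any (`isPrefixOf` fnDotMd) ["src/", "book/"] =
      dropTopDirectory (replaceExtensions fnDotMd ".tex")
  | otherwise = error ("texPath: cannot map " <> fnDotMd)

-- Drop the leading directory: "src/plfa/index.md" -> "plfa/index.md".
dropTopDirectory :: FilePath -> FilePath
dropTopDirectory = joinPath . tail . splitPath

main :: IO ()
main = mapM_ (putStrLn . texPath)
  ["README.md", "src/plfa/part1/Naturals.lagda.md", "book/pdf.tex"]
```

Note that `replaceExtensions` replaces the full `.lagda.md` extension chain, so literate Agda sources also land on a single `.tex` suffix.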
-- Add an anchor based on the permalink, to be used as the header id.
addAnchor :: Context a -> Context a
addAnchor = addDerivedField "anchor" deriveAnchor
where
deriveAnchor :: Context a -> [String] -> Item a -> Compiler ContextField
deriveAnchor ctx a i = do
permalink <- getString "permalink" ctx a i
StringField <$> anchor permalink
anchor :: String -> Compiler String
anchor permalink =
let maybeAnchor = map toLower <$> (stripSuffix "/" <=< stripPrefix "/") permalink
in maybe (fail $ printf "Key 'permalink' malformed '%s'" permalink) return maybeAnchor
stripSuffix :: String -> String -> Maybe String
stripSuffix suf str = reverse <$> stripPrefix (reverse suf) (reverse str)
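To see the derivation in isolation, here is a standalone copy of the helpers above; `/DeBruijn/` is used purely as an example permalink:

```haskell
import Control.Monad ((<=<))
import Data.Char (toLower)
import Data.List (stripPrefix)

-- Standalone copy of the anchor derivation above: a permalink such as
-- "/DeBruijn/" is stripped of its surrounding slashes and lower-cased.
anchor :: String -> Maybe String
anchor permalink =
  map toLower <$> (stripSuffix "/" <=< stripPrefix "/") permalink

-- stripPrefix on the reversed string gives us stripSuffix.
stripSuffix :: String -> String -> Maybe String
stripSuffix suf str = reverse <$> stripPrefix (reverse suf) (reverse str)

main :: IO ()
main = print (anchor "/DeBruijn/")
```

A permalink without the leading or trailing slash yields `Nothing`, which the compiler turns into the "malformed" failure above.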
-- Add a variant of 'key' where all headers have been shifted by 1.
addShiftedBody :: String -> Context a -> Context a
addShiftedBody key = addDerivedField ("shifted_" <> key) deriveShiftedBody
where
deriveShiftedBody :: Context a -> [String] -> Item a -> Compiler ContextField
deriveShiftedBody ctx a i = do
body <- getString key ctx a i
return $ StringField (shiftHeadersBy body)
-- Shift all headers by a given value.
--
-- NOTE: This is the /proper/ implementation of shift headers.
-- In practice, we use the fast one, which uses regular
-- expressions and only works on Markdown '#' headers.
--
-- shiftHeadersBy :: Int -> Pandoc -> Pandoc
-- shiftHeadersBy n = walk shiftHeader
-- where
-- shiftHeader :: Block -> Block
-- shiftHeader (Header level attr inlines) = Header (level + n) attr inlines
-- shiftHeader block = block
--
shiftHeadersBy :: String -> String
shiftHeadersBy body = T.unpack (T.replaceAll re "#$1" (T.pack body))
where
re = RE.regex [RE.Multiline] "^(#+)"
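The regular expression `^(#+)` in multiline mode matches the run of `#` at the start of any line and prepends one more. A regex-free sketch of the same transformation (equivalent only for Markdown ATX headers, like the fast path above, modulo the trailing newline `unlines` adds):

```haskell
import Data.List (isPrefixOf)

-- Prepend one '#' to every line that already starts with '#',
-- shifting each Markdown ATX header down one level.
shiftHeaders :: String -> String
shiftHeaders = unlines . map shift . lines
  where
    shift l
      | "#" `isPrefixOf` l = '#' : l
      | otherwise          = l

main :: IO ()
main = putStr (shiftHeaders "# Title\ntext\n## Sub")
```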
@@ -12,30 +12,33 @@ build-type: Simple
common shared-properties
ghc-options: -Wall
default-language: Haskell2010
build-depends: aeson >=1.4 && <1.6
, base >=4.6 && <5
, bytestring >=0.10 && <0.11
, containers >=0.6 && <0.7
, directory >=1.2 && <1.4
, doctemplates >=0.8 && <1.0
, extra >=1.7 && <1.8
, filemanip >=0.3 && <0.4
, filepath >=1.3 && <1.5
, frontmatter >=0.1 && <0.2
, hakyll >=4.10 && <4.15
, pandoc >=2.1 && <2.11
, pandoc-types >=1.20 && <1.23
, pandoc-citeproc >=0.17 && <0.18
, text >=1.2 && <1.3
, text-icu >=0.7.1 && <0.8
, text-regex-replace >=0.1 && <0.2
, unordered-containers >=0.2 && <0.3
, vector >=0.12 && <0.13
, yaml >=0.11 && <0.12
library
import: shared-properties
hs-source-dirs: hs
exposed-modules: Hakyll.Web.Agda
, Hakyll.Web.Template.Numeric
, Hakyll.Web.Template.Context.Derived
, Hakyll.Web.Template.Context.Metadata
, Hakyll.Web.Template.Context.Title
, Hakyll.Web.Sass
@@ -43,8 +46,6 @@ library
build-depends: Agda ==2.6.1.3
, hsass >=0.8 && <0.9
, regex-tdfa >=1.3 && <1.4
, text-icu >=0.7.1 && <0.8
, text-regex-replace >=0.1 && <0.2
executable site
import: shared-properties
@@ -4,6 +4,6 @@ title: "PLFA as EPUB"
It has been just over a year and a half since this feature was first requested in [#112][issue112]… Thanks to hard work by [Michael Reed][mreed20], and a little elbow grease on our part, it is finally here! [An EPUB version of PLFA!][EPUB]
[EPUB]: https://plfa.github.io/epub/plfa.epub
[issue112]: https://github.com/plfa/plfa.github.io/issues/112
[mreed20]: https://github.com/mreed20
@@ -0,0 +1,31 @@
---
title: "PLFA as PDF and EPUB"
---
We're pleased to announce that PLFA is now available as both [a PDF][PDF] and [an EPUB][EPUB]! Thanks to hard work by [Yuanting Mao][Altariarite], who did an internship with Phil at the University of Edinburgh, and a little elbow grease on our part, we're finally compiling to PDF!
In a double whammy, we've recently fixed compilation to EPUB as well! Compilation to EPUB never quite survived the move to Hakyll, and the EPUB available on the website was broken for quite a while. Fortunately, we've fixed the EPUB compilation, based on [Yuanting Mao][Altariarite]'s work on PDF compilation and [Michael Reed][mreed20]'s original work on EPUB compilation.
There are still several kinks to iron out in the process:
- ([#577][issue577]) EPUB generation breaks internal links;
- ([#578][issue578]) EPUB fragment identifier is not defined;
- ([#579][issue579]) EPUB font cannot be found;
- ([#580][issue580]) EPUB attribute "name" not allowed;
- ([#581][issue581]) PDF code overflows margins;
And probably tons of other little bugs I've missed!
As always, bug reports and pull requests are very, very welcome!
[Altariarite]: https://github.com/Altariarite
[mreed20]: https://github.com/mreed20
[PDF]: https://plfa.github.io/plfa.pdf
[EPUB]: https://plfa.github.io/plfa.epub
[issue577]: https://github.com/plfa/plfa.github.io/issues/577
[issue578]: https://github.com/plfa/plfa.github.io/issues/578
[issue579]: https://github.com/plfa/plfa.github.io/issues/579
[issue580]: https://github.com/plfa/plfa.github.io/issues/580
[issue581]: https://github.com/plfa/plfa.github.io/issues/581
@@ -536,7 +536,7 @@ Show empty is the right identity of sums up to isomorphism.
-- Your code goes here
```
## Implication is function {#implication}
Given two propositions `A` and `B`, the implication `A → B` holds if
whenever `A` holds then `B` must also hold. We formalise implication using
@@ -561,7 +561,7 @@ postulate
-- Your code goes here
```
## Proof by reflection {#proof-by-reflection}
Let's revisit our definition of monus from
Chapter [Naturals](/Naturals/).
@@ -132,7 +132,7 @@ Again, a useful exercise is to carry out an interactive development,
checking how Agda's knowledge changes as each of the two arguments is
instantiated.
## Congruence and substitution {#cong}
Equality satisfies _congruence_. If two terms are equal,
they remain so after the same function is applied to both:
@@ -635,7 +635,7 @@ Jesper Cockx, Dominique Devries, Andreas Nuyts, and Philip Wadler,
draft, 2017.)
## Universe polymorphism {#unipoly}
As we have seen, not every type belongs to `Set`, but instead every
type belongs somewhere in the hierarchy `Set₀`, `Set₁`, `Set₂`, and so on,
@@ -71,7 +71,7 @@ distributes over another operator. A careful author will often call
out these properties---or their lack---for instance by pointing out
that a newly introduced operator is associative but not commutative.
#### Exercise `operators` (practice) {#operators}
Give another example of a pair of operators that have an identity
and are associative, commutative, and distribute over one another.
@@ -599,7 +599,7 @@ the main proposition first, and the equations required to do so
will suggest what lemmas to prove.
## Our first corollary: rearranging {#sections}
We can apply associativity to rearrange parentheses however we like.
Here is an example:
@@ -695,7 +695,7 @@ judgments where the first number is less than _m_.
There is also a completely finite approach to generating the same equations,
which is left as an exercise for the reader.
#### Exercise `finite-|-assoc` (stretch) {#finite-plus-assoc}
Write out what is known about associativity of addition on each of the
first four days using a finite story of creation, as
@@ -857,7 +857,7 @@ typing `C-c C-r` will fill it in, completing the proof:
+-assoc (suc m) n p rewrite +-assoc m n p = refl
#### Exercise `+-swap` (recommended) {#plus-swap}
Show
@@ -872,7 +872,7 @@ is associative and commutative.
```
#### Exercise `*-distrib-+` (recommended) {#times-distrib-plus}
Show multiplication distributes over addition, that is,
@@ -885,7 +885,7 @@ for all naturals `m`, `n`, and `p`.
```
#### Exercise `*-assoc` (recommended) {#times-assoc}
Show multiplication is associative, that is,
@@ -898,7 +898,7 @@ for all naturals `m`, `n`, and `p`.
```
#### Exercise `*-comm` (practice) {#times-comm}
Show multiplication is commutative, that is,
@@ -912,7 +912,7 @@ you will need to formulate and prove suitable lemmas.
```
#### Exercise `0∸n≡0` (practice) {#zero-monus}
Show
@@ -925,7 +925,7 @@ for all naturals `n`. Did your proof require induction?
```
#### Exercise `∸-|-assoc` (practice) {#monus-plus-assoc}
Show that monus associates with addition, that is,
@@ -949,7 +949,7 @@ Show the following three laws
for all `m`, `n`, and `p`.
#### Exercise `Bin-laws` (stretch) {#Bin-laws}
Recall that
Exercise [Bin](/Naturals/#Bin)
@@ -83,7 +83,7 @@ g ∘′ f = λ x → g (f x)
```
## Extensionality {#extensionality}
Extensionality asserts that the only way to distinguish functions is
by applying them; if two functions applied to the same argument always
@@ -450,7 +450,7 @@ postulate
-- Your code goes here
```
#### Exercise `_⇔_` (practice) {#iff}
Define equivalence of propositions (also known as "if and only if") as follows:
```
@@ -465,7 +465,7 @@ Show that equivalence is reflexive, symmetric, and transitive.
-- Your code goes here
```
#### Exercise `Bin-embedding` (stretch) {#Bin-embedding}
Recall that Exercises
[Bin](/Naturals/#Bin) and
@@ -458,7 +458,7 @@ _ =
```
Now the time to reverse a list is linear in the length of the list.
## Map {#Map}
Map applies a function to every element of a list to generate a corresponding list.
Map is an example of a _higher-order function_, one which takes a function as an
@@ -558,7 +558,7 @@ Define a suitable map operator over trees:
-- Your code goes here
```
## Fold {#Fold}
Fold takes an operator and a value, and uses the operator to combine
each of the elements of the list, taking the given value as the result
@@ -833,7 +833,7 @@ Show that if `_⊗_` and `e` form a monoid, then `foldr _⊗_ e` and
```
## All {#All}
We can also define predicates over lists. Two of the most important
are `All` and `Any`.
@@ -990,7 +990,7 @@ You will need to use extensionality.
#### Exercise `All-∀` (practice)
Show that `All P xs` is isomorphic to `∀ x → x ∈ xs → P x`.
```
-- Your code goes here
@@ -76,7 +76,7 @@ after zero; and `2` is shorthand for `suc (suc zero)`, which is the
same as `suc 1`, the successor of one; and `3` is shorthand for the
successor of two; and so on.
#### Exercise `seven` (practice) {#seven}
Write out `7` in longhand.
@@ -287,7 +287,7 @@ Parentheses and semicolons are among the few characters that cannot
appear in names, so we do not need extra spaces in the `using` list.
## Operations on naturals are recursive functions {#plus}
Now that we have the natural numbers, what can we do with them?
For instance, can we define arithmetic operations such as
@@ -425,7 +425,7 @@ is not like testimony in a court which must be weighed to determine
whether the witness is trustworthy. Rather, it is ironclad. The
other word for evidence, which we will use interchangeably, is _proof_.
#### Exercise `+-example` (practice) {#plus-example}
Compute `3 + 4`, writing out your reasoning as a chain of equations, using the equations for `+`.
@@ -486,7 +486,7 @@ Here we have omitted the signature declaring `_ : 2 * 3 ≡ 6`, since
it can easily be inferred from the corresponding term.
#### Exercise `*-example` (practice) {#times-example}
Compute `3 * 4`, writing out your reasoning as a chain of equations, using the equations for `*`.
(You do not need to step through the evaluation of `+`.)
@@ -496,7 +496,7 @@ Compute `3 * 4`, writing out your reasoning as a chain of equations, using the e
```
#### Exercise `_^_` (recommended) {#power}
Define exponentiation, which is given by the following equations:
@@ -566,7 +566,7 @@ _ =
```
#### Exercise `∸-example₁` and `∸-example₂` (recommended) {#monus-examples}
Compute `5 ∸ 3` and `3 ∸ 5`, writing out your reasoning as a chain of equations.
@@ -701,7 +701,7 @@ definitions is quite similar. They might be considered two sides of
the same coin.
## The story of creation, finitely {#finite-creation}
The above story was told in a stratified way. First, we create
the infinite set of naturals. We take that set as given when
@@ -872,7 +872,7 @@ Haskell requires time proportional to the sum of the logarithms of
_m_ and _n_.
#### Exercise `Bin` (stretch) {#Bin}
A more efficient representation of natural numbers uses a binary
rather than a unary system. We represent a number as a bitstring:
@@ -105,7 +105,7 @@ Show that a disjunction of universals implies a universal of disjunctions:
```
postulate
⊎∀-implies-∀⊎ : ∀ {A : Set} {B C : A → Set} →
(∀ (x : A) → B x) ⊎ (∀ (x : A) → C x) → ∀ (x : A) → B x ⊎ C x
```
Does the converse hold? If so, prove; if not, explain why.
@@ -438,7 +438,7 @@ postulate
Does the converse hold? If so, prove; if not, explain why.
#### Exercise `Bin-isomorphism` (stretch) {#Bin-isomorphism}
Recall that Exercises
[Bin](/Naturals/#Bin),
@@ -463,21 +463,21 @@ And to establish the following properties:
to (from b) ≡ b
Using the above, establish that there is an isomorphism between `ℕ` and
`∃[ b ] Can b`.
We recommend proving the following lemmas which show that, for a given
binary number `b`, there is only one proof of `One b` and similarly
for `Can b`.
≡One : ∀ {b : Bin} (o o′ : One b) → o ≡ o′
≡Can : ∀ {b : Bin} (cb cb′ : Can b) → cb ≡ cb′
Many of the alternatives for proving `to∘from` turn out to be tricky.
However, the proof can be straightforward if you use the following lemma,
which is a corollary of `≡Can`.
proj₁≡→Can≡ : {cb cb′ : ∃[ b ] Can b} → proj₁ cb ≡ proj₁ cb′ → cb ≡ cb′
```
-- Your code goes here
@ -241,7 +241,7 @@ lack---for instance by saying that a newly introduced relation is a
partial order but not a total order. partial order but not a total order.
#### Exercise `orderings` (practice) {name=orderings} #### Exercise `orderings` (practice) {#orderings}
Give an example of a preorder that is not a partial order. Give an example of a preorder that is not a partial order.
@ -359,7 +359,7 @@ and `suc n ≤ suc m` and must show `suc m ≡ suc n`. The inductive
hypothesis `≤-antisym m≤n n≤m` establishes that `m ≡ n`, and our goal hypothesis `≤-antisym m≤n n≤m` establishes that `m ≡ n`, and our goal
follows by congruence. follows by congruence.
#### Exercise `≤-antisym-cases` (practice) {name=leq-antisym-cases} #### Exercise `≤-antisym-cases` (practice) {#leq-antisym-cases}
The above proof omits cases where one argument is `z≤n` and one The above proof omits cases where one argument is `z≤n` and one
argument is `s≤s`. Why is it ok to omit them? argument is `s≤s`. Why is it ok to omit them?
@ -559,7 +559,7 @@ Show that multiplication is monotonic with regard to inequality.
``` ```
## Strict inequality {name=strict-inequality} ## Strict inequality {#strict-inequality}
We can define strict inequality similarly to inequality: We can define strict inequality similarly to inequality:
``` ```
@ -597,7 +597,7 @@ and conversely. One can then give an alternative derivation of the
properties of strict inequality, such as transitivity, by properties of strict inequality, such as transitivity, by
exploiting the corresponding properties of inequality. exploiting the corresponding properties of inequality.
#### Exercise `<-trans` (recommended) {name=less-trans} #### Exercise `<-trans` (recommended) {#less-trans}
Show that strict inequality is transitive. Show that strict inequality is transitive.
@ -605,7 +605,7 @@ Show that strict inequality is transitive.
-- Your code goes here -- Your code goes here
``` ```
#### Exercise `trichotomy` (practice) {name=trichotomy} #### Exercise `trichotomy` (practice) {#trichotomy}
Show that strict inequality satisfies a weak version of trichotomy, in Show that strict inequality satisfies a weak version of trichotomy, in
the sense that for any `m` and `n` that one of the following holds: the sense that for any `m` and `n` that one of the following holds:
@@ -623,7 +623,7 @@ similar to that used for totality.
-- Your code goes here
```
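The hint about totality suggests one possible shape for the statement — a sketch only, mirroring the `Total` datatype used for `≤-total`; the names `Trichotomy`, `forward`, `fixed`, and `flipped` are invented for illustration and not fixed by the exercise:

```agda
-- A hedged sketch, not the official solution: one case per clause of
-- the trichotomy, with the proof itself left as the exercise.
data Trichotomy (m n : ℕ) : Set where
  forward : m < n → Trichotomy m n
  fixed   : m ≡ n → Trichotomy m n
  flipped : n < m → Trichotomy m n

postulate
  <-trichotomy : ∀ (m n : ℕ) → Trichotomy m n
```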
-#### Exercise `+-mono-<` (practice) {name=plus-mono-less}
+#### Exercise `+-mono-<` (practice) {#plus-mono-less}
Show that addition is monotonic with respect to strict inequality.
As with inequality, some additional definitions may be required.
@@ -632,7 +632,7 @@ As with inequality, some additional definitions may be required.
-- Your code goes here
```
-#### Exercise `≤-iff-<` (recommended) {name=leq-iff-less}
+#### Exercise `≤-iff-<` (recommended) {#leq-iff-less}
Show that `suc m ≤ n` implies `m < n`, and conversely.
@@ -640,7 +640,7 @@ Show that `suc m ≤ n` implies `m < n`, and conversely.
-- Your code goes here
```
-#### Exercise `<-trans-revisited` (practice) {name=less-trans-revisited}
+#### Exercise `<-trans-revisited` (practice) {#less-trans-revisited}
Give an alternative proof that strict inequality is transitive,
using the relation between strict inequality and inequality and
@@ -749,7 +749,7 @@ evidence that the first number is odd. If it is because it is the
successor of an even number, then the result is odd because it is the
successor of the sum of two even numbers, which is even.
-#### Exercise `o+o≡e` (stretch) {name=odd-plus-odd}
+#### Exercise `o+o≡e` (stretch) {#odd-plus-odd}
Show that the sum of two odd numbers is even.
@@ -757,7 +757,7 @@ Show that the sum of two odd numbers is even.
-- Your code goes here
```
-#### Exercise `Bin-predicates` (stretch) {name=Bin-predicates}
+#### Exercise `Bin-predicates` (stretch) {#Bin-predicates}
Recall that
Exercise [Bin](/Naturals/#Bin)


@@ -1388,3 +1388,12 @@ This chapter uses the following unicode:
₆ U+2086 SUBSCRIPT SIX (\_6)
₇ U+2087 SUBSCRIPT SEVEN (\_7)
≠ U+2260 NOT EQUAL TO (\=n)
```
mul : ∀ {Γ} → Γ ⊢ `ℕ ⇒ `ℕ ⇒ `ℕ
mul = μ ƛ ƛ (case (# 1) `zero (plus · # 1 · (# 3 · # 0 · # 1)))
```
_ : eval (gas 100) (mul · two · two) ≡ [code generated by ctrl+c ctrl+n]
_ = refl


@@ -32,7 +32,7 @@ Chapter [Lambda](/Lambda/),
and from it we compute an intrinsically-typed term, in the style of
Chapter [DeBruijn](/DeBruijn/).
-## Introduction: Inference rules as algorithms {name=algorithms}
+## Introduction: Inference rules as algorithms {#algorithms}
In the calculus we have considered so far, a term may have more than
one type. For example,
@@ -208,11 +208,11 @@ _decidable_:
synthesize : ∀ (Γ : Context) (M : Term⁺)
-----------------------
-  → Dec (∃[ A ](Γ ⊢ M ↑ A))
+  → Dec (∃[ A ]( Γ ⊢ M ↑ A ))
inherit : ∀ (Γ : Context) (M : Term⁻) (A : Type)
---------------
→ Dec (Γ ⊢ M ↓ A)
Given context `Γ` and synthesised term `M`, we must decide whether
there exists a type `A` such that `Γ ⊢ M ↑ A` holds, or its negation.
@@ -470,7 +470,7 @@ the equality test in the application rule in the first
[section](/Inference/#algorithms).
-#### Exercise `bidirectional-mul` (recommended) {name=bidirectional-mul}
+#### Exercise `bidirectional-mul` (recommended) {#bidirectional-mul}
Rewrite your definition of multiplication from
Chapter [Lambda](/Lambda/), decorated to support inference.
@@ -480,7 +480,7 @@ Chapter [Lambda](/Lambda/), decorated to support inference.
```
-#### Exercise `bidirectional-products` (recommended) {name=bidirectional-products}
+#### Exercise `bidirectional-products` (recommended) {#bidirectional-products}
Extend the bidirectional type rules to include products from
Chapter [More](/More/).
@@ -580,9 +580,9 @@ such that `Γ ∋ x ⦂ A` holds, then there is also no type `A` such that
```
ext∋ : ∀ {Γ B x y}
→ x ≢ y
-  → ¬ ( ∃[ A ] Γ ∋ x ⦂ A )
+  → ¬ ∃[ A ]( Γ ∋ x ⦂ A )
-----------------------------
-  → ¬ ( ∃[ A ] Γ , y ⦂ B ∋ x ⦂ A )
+  → ¬ ∃[ A ]( Γ , y ⦂ B ∋ x ⦂ A )
ext∋ x≢y _ ⟨ A , Z ⟩ = x≢y refl
ext∋ _ ¬∃ ⟨ A , S _ ∋x ⟩ = ¬∃ ⟨ A , ∋x ⟩
```
@@ -597,8 +597,8 @@ there exists a type `A` such that `Γ ∋ x ⦂ A` holds, or its
negation:
```
lookup : ∀ (Γ : Context) (x : Id)
-  -----------------------
+  -------------------------
-  → Dec (∃[ A ](Γ ∋ x ⦂ A))
+  → Dec (∃[ A ]( Γ ∋ x ⦂ A ))
lookup ∅ x = no (λ ())
lookup (Γ , y ⦂ B) x with x ≟ y
... | yes refl = yes ⟨ B , Z ⟩
@@ -639,7 +639,7 @@ there is no term `B` such that `Γ ⊢ L · M ↑ B` holds:
→ Γ ⊢ L ↑ A ⇒ B
→ ¬ Γ ⊢ M ↓ A
----------------------------
-  → ¬ ( ∃[ B ] Γ ⊢ L · M ↑ B )
+  → ¬ ∃[ B ]( Γ ⊢ L · M ↑ B )
¬arg ⊢L ¬⊢M ⟨ B , ⊢L · ⊢M ⟩ rewrite dom≡ (uniq-↑ ⊢L ⊢L) = ¬⊢M ⊢M
```
Let `⊢L` be evidence that `Γ ⊢ L ↑ A ⇒ B` holds and `¬⊢M` be evidence
@@ -686,8 +686,8 @@ and a type `A` and either returns evidence that `Γ ⊢ M ↓ A`,
or its negation:
```
synthesize : ∀ (Γ : Context) (M : Term⁺)
-  -----------------------
+  ---------------------------
-  → Dec (∃[ A ](Γ ⊢ M ↑ A))
+  → Dec (∃[ A ]( Γ ⊢ M ↑ A ))
inherit : ∀ (Γ : Context) (M : Term⁻) (A : Type)
---------------


@@ -213,7 +213,7 @@ definition may use `plusᶜ` as defined earlier (or may not
```
-#### Exercise `primed` (stretch) {name=primed}
+#### Exercise `primed` (stretch) {#primed}
Some people find it annoying to write `` ` "x" `` instead of `x`.
We can make examples with lambda terms slightly easier to write
@@ -1201,7 +1201,7 @@ the three places where a bound variable is introduced.
The rules are deterministic, in that at most one rule applies to every term.
-### Example type derivations {name=derivation}
+### Example type derivations {#derivation}
Type derivations correspond to trees. In informal notation, here
is a type derivation for the Church numeral two,


@@ -147,7 +147,7 @@ Here `M †` is the translation of term `M` from a calculus with the
construct to a calculus without the construct.
-## Products {name=products}
+## Products {#products}
### Syntax
@@ -285,7 +285,7 @@ We can also translate back the other way:
(`proj₁ L) ‡ = case× (L ‡) [⟨ x , y ⟩⇒ x ]
(`proj₂ L) ‡ = case× (L ‡) [⟨ x , y ⟩⇒ y ]
-## Sums {name=sums}
+## Sums {#sums}
### Syntax


@@ -195,7 +195,7 @@ distinct? (x ∷ xs)
... | no x∉xs
with distinct? xs
... | yes dxs = yes ⟨ x∉xs , dxs ⟩
-... | no ¬dxs = no λ x₁ → ¬dxs (proj₂ x₁)
+... | no ¬dxs = no λ z → ¬dxs (proj₂ z)
```
With this decision procedure in hand, we define the following
@@ -879,7 +879,7 @@ typed (C-suc c) = ⊢suc (typed c)
typed (C-rcd ⊢Ms dks As<:Bs) = <: (rcd Ms dks) As<:Bs
```
-## Progress <a name="subtyping-progress"></a>
+## Progress {#subtyping-progress}
The Progress theorem states that a well-typed term may either take a
reduction step or it is already a value. The proof of Progress is like
@@ -952,7 +952,7 @@ progress (⊢<: {A = A}{B} ⊢M A<:B) = progress ⊢M
* Case `⊢<:`: we invoke the induction hypothesis on the sub-derivation of `Γ ⊢ M ⦂ A`.
-## Preservation <a name="subtyping-preservation"></a>
+## Preservation {#subtyping-preservation}
In this section we prove that when a well-typed term takes a reduction
step, the result is another well-typed term with the same type.


@@ -1012,7 +1012,7 @@ all-funs∈ {⊥} f with f {⊥} refl
... | fun ()
all-funs∈ {v ↦ w} f = ⟨ v , ⟨ w , refl ⟩ ⟩
all-funs∈ {u ⊔ u} f
-    with all-funs∈ λ z → f (inj₁ z)
+    with all-funs∈ (λ z → f (inj₁ z))
... | ⟨ v , ⟨ w , m ⟩ ⟩ = ⟨ v , ⟨ w , (inj₁ m) ⟩ ⟩
```


@@ -1,45 +1,48 @@
---
parts:
-  - title: "Front matter"
+  - frontmatter: True
+    title: "Front matter"
    sections:
-      - src/plfa/frontmatter/dedication.md
+      - include: src/plfa/frontmatter/dedication.md
-      - src/plfa/frontmatter/preface.md
+      - include: src/plfa/frontmatter/preface.md
-      - README.md
+      - include: README.md
-  - title: "Part 1: Logical Foundations"
+  - mainmatter: True
+    title: "Part 1: Logical Foundations"
    sections:
-      - src/plfa/part1/Naturals.lagda.md
+      - include: src/plfa/part1/Naturals.lagda.md
-      - src/plfa/part1/Induction.lagda.md
+      - include: src/plfa/part1/Induction.lagda.md
-      - src/plfa/part1/Relations.lagda.md
+      - include: src/plfa/part1/Relations.lagda.md
-      - src/plfa/part1/Equality.lagda.md
+      - include: src/plfa/part1/Equality.lagda.md
-      - src/plfa/part1/Isomorphism.lagda.md
+      - include: src/plfa/part1/Isomorphism.lagda.md
-      - src/plfa/part1/Connectives.lagda.md
+      - include: src/plfa/part1/Connectives.lagda.md
-      - src/plfa/part1/Negation.lagda.md
+      - include: src/plfa/part1/Negation.lagda.md
-      - src/plfa/part1/Quantifiers.lagda.md
+      - include: src/plfa/part1/Quantifiers.lagda.md
-      - src/plfa/part1/Decidable.lagda.md
+      - include: src/plfa/part1/Decidable.lagda.md
-      - src/plfa/part1/Lists.lagda.md
+      - include: src/plfa/part1/Lists.lagda.md
  - title: "Part 2: Programming Language Foundations"
    sections:
-      - src/plfa/part2/Lambda.lagda.md
+      - include: src/plfa/part2/Lambda.lagda.md
-      - src/plfa/part2/Properties.lagda.md
+      - include: src/plfa/part2/Properties.lagda.md
-      - src/plfa/part2/DeBruijn.lagda.md
+      - include: src/plfa/part2/DeBruijn.lagda.md
-      - src/plfa/part2/More.lagda.md
+      - include: src/plfa/part2/More.lagda.md
-      - src/plfa/part2/Bisimulation.lagda.md
+      - include: src/plfa/part2/Bisimulation.lagda.md
-      - src/plfa/part2/Inference.lagda.md
+      - include: src/plfa/part2/Inference.lagda.md
-      - src/plfa/part2/Untyped.lagda.md
+      - include: src/plfa/part2/Untyped.lagda.md
-      - src/plfa/part2/Confluence.lagda.md
+      - include: src/plfa/part2/Confluence.lagda.md
-      - src/plfa/part2/BigStep.lagda.md
+      - include: src/plfa/part2/BigStep.lagda.md
  - title: "Part 3: Denotational Semantics"
    sections:
-      - src/plfa/part3/Denotational.lagda.md
+      - include: src/plfa/part3/Denotational.lagda.md
-      - src/plfa/part3/Compositional.lagda.md
+      - include: src/plfa/part3/Compositional.lagda.md
-      - src/plfa/part3/Soundness.lagda.md
+      - include: src/plfa/part3/Soundness.lagda.md
-      - src/plfa/part3/Adequacy.lagda.md
+      - include: src/plfa/part3/Adequacy.lagda.md
-      - src/plfa/part3/ContextualEquivalence.lagda.md
+      - include: src/plfa/part3/ContextualEquivalence.lagda.md
-  - title: "Appendix"
+  - backmatter: True
+    title: "Appendix"
    sections:
-      - src/plfa/part2/Substitution.lagda.md
+      - include: src/plfa/part2/Substitution.lagda.md
  - title: "Back matter"
    sections:
-      - src/plfa/backmatter/acknowledgements.md
+      - include: src/plfa/backmatter/acknowledgements.md
-      - src/plfa/backmatter/Fonts.lagda.md
+      - include: src/plfa/backmatter/Fonts.lagda.md
---


@@ -1,10 +1,8 @@
resolver: lts-17.2
-# Use GHC 8.10.3
-compiler: ghc-8.10.3
+compiler: ghc-8.10.5
compiler-check: match-exact
-# Allow never versions of packages
allow-newer: true
flags:
@@ -15,28 +13,16 @@ packages:
- .
extra-deps:
-# Agda dependencies:
- Agda-2.6.1.3@sha256:87769ebab4259b184c5b11d5beaff39b88bcc37902dfb3341f4fc46c5c7d3134,32945
-# Hakyll dependencies:
- hakyll-4.13.4.1
- pandoc-2.10.1@sha256:23d7ec480c7cb86740475a419d6ca4819987b6dd23bbae9b50bc3d42a7ed2f9f,36933
- pandoc-citeproc-0.17.0.2@sha256:39c5c60a5eca2c1cb50ae9a00dc7093ca1baac78ad5be4e222505de257dce456,8737
- commonmark-0.1.1.4@sha256:8717891c53c124ff64187c463619450241a41c0951cda2a43267d40f78992362,3278
- commonmark-extensions-0.2.0.4@sha256:6a437bcfa3c757af4262b71336513619990eafb5cfdc33e57a499c93ad225608,3184
- commonmark-pandoc-0.2.0.1@sha256:529c6e2c6cabf61558b66a28123eafc1d90d3324be29819f59f024e430312c1f,1105
-# Sass dependencies:
- hsass-0.8.0@sha256:05fb3d435dbdf9f66a98db4e1ee57a313170a677e52ab3a5a05ced1fc42b0834,2899
- hlibsass-0.1.10.1@sha256:08db56c633e9a83a642d8ea57dffa93112b092d05bf8f3b07491cfee9ee0dfa5,2565
-# GitHub API dependencies:
- github-0.26@sha256:a9d4046325c3eb28cdc7bef2c3f5bb213328caeae0b7dce6f51de655f0bffaa1,7162
- binary-instances-1.0.0.1@sha256:e234be994da675479a3661f050d4a1d53565c9ed7786d9a68b7a29ba8b54b5a7,2659
-# Regex dependencies:
- text-regex-replace-0.1.1.3@sha256:e7f612df671c93ced54a3d26528db37852069884e5cb67d5afbb49d3defb5eb9,1627
-# Version of text-icu compatible with icu68+
- text-icu-0.7.1.0@sha256:44e8b5966fcf61356a7356f3d2e8a5c5cc538170a15e94ff7b79b9f48ce9fd2f,3497
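An aside for anyone reproducing this setup: with `compiler: ghc-8.10.5` pinned and `compiler-check: match-exact`, the runner must provide exactly that GHC. Since the workflow caches `~/.ghcup`, one can also tell Stack to reuse the ghcup-installed compiler rather than downloading its own — a sketch using standard Stack settings; whether this repository actually enables them is an assumption:

```yaml
# Sketch (assumption, not part of the diff above): reuse the GHC already
# on PATH instead of letting Stack install a private copy. Both keys are
# documented stack.yaml settings; the pin must match `compiler:` above.
system-ghc: true
install-ghc: false
```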


@@ -1,26 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml" xmlns:epub="http://www.idpf.org/2007/ops" xml:lang="$language$">
<head>
<meta charset="utf-8" />
<title>$pagetitle$</title>
<link rel="stylesheet" type="text/css" href="css/agda.css" />
<link rel="stylesheet" type="text/css" href="css/epub.css" />
</head>
<body>
<section epub:type="titlepage" class="titlepage">
<h1 class="title">$pagetitle$</h1>
$for(authors)$
<p class="author">$name$</p>
$endfor$
$if(rights)$
<div class="rights">$rights$</div>
$endif$
</section>
<!-- NOTE: body is escaped because it is filled in by the Pandoc template engine, -->
<!-- whereas the rest is filled in by the Hakyll template engine, both of -->
<!-- which use the same syntax. Yes, this is confusing. -->
$$body$$
</body>
</html>

templates/metadata.md (new file)

@@ -0,0 +1,19 @@
---
comment: This file exists to save the correct metadata to the generated `epub.md` file.
---
---
lang : $language$
pagetitle : $pagetitle$
title : $pagetitle$
titlepage : true
description : $description$
author :
$for(authors)$
- $name$
$endfor$
date : $modified$
rights : $rights$
---
$body$


@@ -1,7 +1,7 @@
<article class="post">
$partial("templates/next.html")$
<header class="post-header">
-    <h1 class="post-title">$title$</h1>
+    <h1 class="post-title" $if(anchor)$id="$anchor$"$endif$>$title$</h1>
</header>
<div class="post-content">
$body$
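For illustration, the `$if(anchor)$` guard means a page whose metadata sets an `anchor` field gets an `id` on its title, while pages without one render exactly as before. Assuming a hypothetical page with `anchor: subtyping-progress` and title `Subtyping`, the header would come out roughly as:

```html
<!-- Sketch of rendered output; the metadata values are assumptions. -->
<header class="post-header">
  <h1 class="post-title" id="subtyping-progress">Subtyping</h1>
</header>
```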