* Inductive, Coinductive and Recursive Types in GTT
In order to apply the GTT model to non-trivial (statically typed)
programs, we need to add some facility for inductive, coinductive,
and/or recursive types.
In principle this should not be a problem: the uniqueness principles
are derivable in the presence of functoriality of connectives, and
recursive types don't disrupt that. One difficulty is in establishing
formally that the casts we derive for *recursive* types are actually
values for recursive value types and stacks for recursive stack types.
The usual presentation of recursive types, which just adds an
isomorphism between rec X. A and A[rec X. A/X], is not sufficient,
because we would have to use something like the Y combinator to
implement the functorial action of type constructors on casts. That
presentation is useful for a minimalistic *operational* semantics, but
is insufficient for a minimalistic *axiomatic* semantics.
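To make the worry concrete, here is a minimal Haskell sketch (the names
Rec, mapRec and cast are mine, not GTT syntax): with only the
roll/unroll isomorphism, lifting a one-step cast to the whole recursive
type already requires general recursion.

    {-# LANGUAGE RankNTypes #-}

    -- Iso-recursive presentation: only roll/unroll are primitive.
    newtype Rec f = Roll { unroll :: f (Rec f) }

    -- Lifting a one-step cast (modelled here as a natural transformation
    -- between the one-step functors) to the whole recursive type: `go` is
    -- defined by general recursion, the analogue of using the Y combinator,
    -- so nothing guarantees syntactically that the result is a value or a
    -- stack.
    mapRec :: Functor f => (forall a. f a -> g a) -> Rec f -> Rec g
    mapRec cast = go
      where go (Roll x) = Roll (cast (fmap go x))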
By contrast, inductive and coinductive types present no difficulty at
all. We add in an injection and the induction principle, with the
induction principle being a *value form*. This makes it easy to show
that the upcast implementation is a value. The same holds in the dual:
the coinduction principle for coinductive computations should be
a stack.
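As a rough Haskell analogy (Haskell does not separate value and
computation types, and Mu, Nu, fold, unfold are my names, not GTT
syntax): the recursor is the eliminator we would take as a primitive
value form, and the corecursor is its dual.

    -- Inductive type: an injection plus the induction principle (fold),
    -- which GTT would take as a primitive value form.
    newtype Mu f = In (f (Mu f))

    fold :: Functor f => (f a -> a) -> Mu f -> a
    fold alg (In x) = alg (fmap (fold alg) x)

    -- Dually, a coinductive type: a projection plus the coinduction
    -- principle (unfold), which should be a stack.
    newtype Nu f = MkNu { out :: f (Nu f) }

    unfold :: Functor f => (a -> f a) -> a -> Nu f
    unfold coalg x = MkNu (fmap (unfold coalg) (coalg x))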
Then the result should hold because any recursive type can be written
using inductive types and *guarded* recursive types where guarded here
means the variable occurs under a thunk.
That is, any recursive value type `rec X. A` can be written in the form
`rec X. μ X'. [A]` where every occurrence of `X'` in `A` is strictly
positive and every occurrence of `X` in `A` occurs under a thunk `U`.
For example, take a simple Schemish CBV dynamic type:
rec D. 1 + (D * D) + U(D -> F D)
The intuition about this type is that dynamically typed values are
S-Expressions whose leaves are either nil or a unary procedure.
The factorization makes this clearer:
rec D. μ T. 1 + (T * T) + U(D -> F D)
We can think of `D` as the full recursive dynamic type and `T` as
representing the inductive first-order component of the type:
S-expressions.
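A Haskell rendering of the factorization may help (the names SExpF,
SExp and Dyn are mine, and the plain function space stands in for
U(D -> F D)):

    -- One layer of the inductive part: t is the strictly positive
    -- recursion variable (T above), while d is the outer dynamic type (D),
    -- which only occurs under the function space, i.e. guarded by the thunk.
    data SExpF d t = Nil | Pair t t | Proc (d -> d)

    -- mu T. 1 + (T * T) + U(D -> F D), with D still a parameter
    newtype SExp d = InS (SExpF d (SExp d))

    -- rec D. ... : tying the outer, guarded knot
    newtype Dyn = Dyn (SExp Dyn)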
I haven't worked out the details, but I think this observation should
be sufficient to prove syntactically that upcasts are values and
downcasts are stacks.
* Precision and Equi-Recursion vs Iso-Recursion
Static typing naturally lends itself to iso-recursion whereas dynamic
typing naturally lends itself to equi-recursion. Gradual typing can
accommodate this difference quite easily: in the cast calculus, the
recursion is iso-recursive, but the *precision* is equi-recursive.
But what rules should we have? Should we just state that a recursive
type is equi-dynamic with its one-level unrolling?
rec X. A == A[rec X. A/X]
Or should we say that in fact they are *least pre-fixed points*
in the precision ordering? I haven't thought through the consequences
yet, but my gut says the least fixed point will be more useful.
The lfp rules are simple. We say first that the type mu X. A is a pre-fixed point of X.A:
-----------------------
A[mu X. A/X] <= mu X. A
and second that it is the *least* such pre-fixed point:
A[A'/X] <= A'
-------------
mu X. A <= A'
From this we can derive that lfp types are functorial:
X <= X' |- A <= A'
--------------------
mu X. A <= mu X'. A'
by using the lfp property and monotonicity of all connectives:
A[mu X'. A'/X] <= A'[mu X'. A'/X'] <= mu X'. A'
-----------------------------------------------trans
A[mu X'. A'/X] <= mu X'. A'
---------------------------lfp
mu X. A <= mu X'. A'
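The computational content of this derivation is the standard fact that
the functorial action on mu is programmable from the recursor alone. A
Haskell sketch, with the same caveats as before (Mu, fold and mapMu are
my names):

    {-# LANGUAGE RankNTypes #-}

    -- Mu and fold as in the earlier sketch
    newtype Mu f = In (f (Mu f))

    fold :: Functor f => (f a -> a) -> Mu f -> a
    fold alg (In x) = alg (fmap (fold alg) x)

    -- The premise X <= X' |- A <= A' is the one-step cast; composing it
    -- with the pre-fixed point In gives the algebra, and the least
    -- pre-fixed point rule is the fold. No general recursion is needed,
    -- in contrast to mapRec above.
    mapMu :: Functor f => (forall a. f a -> g a) -> Mu f -> Mu g
    mapMu cast = fold (In . cast)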
* Recursive Types as a Precision-Least Pre-fixed Point
Here's an interesting idea, somewhat inspired by denotational
semantics. What if the recursive type mu X. A were just *defined* to
be the least pre-fixed point of X.A in the dynamism ordering? That is,
we treat mu X. A similarly to the dynamic type ? in that we only
interact with it via casts.
Then we could define for example:
G |- V : A[mu X. A/X]
------------------------------------------------------
G |- roll(V) : mu X. A := <mu X. A <--< A[mu X. A/X]>V
and
G |- V : mu X. A
------------------------------------------------------------
G |- unroll(V) : A[mu X. A/X] := <A[mu X.A/X] <--< mu X.A>V
where the dynamism derivation of the latter comes from
A <= A and A[mu X. A/X] <= mu X.A
-------------------------------subst
A[A[mu X.A/X]/X] <= A[mu X.A/X]
----------------------least pre-fixed point
mu X. A <= A[mu X.A/X]
We can prove that roll and unroll form an isomorphism from our proof
that equi-dynamic types are isomorphic.
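In Haskell terms, the derivation above is the usual trick for
programming unroll from the recursor (again only a sketch; roll and
unroll here are my own definitions, not GTT primitives):

    -- Mu and fold as in the earlier sketch
    newtype Mu f = In (f (Mu f))

    fold :: Functor f => (f a -> a) -> Mu f -> a
    fold alg (In x) = alg (fmap (fold alg) x)

    -- roll is the pre-fixed point cast ...
    roll :: f (Mu f) -> Mu f
    roll = In

    -- ... and unroll is the upcast given by the least pre-fixed point
    -- rule at A' = A[mu X. A/X]: the algebra `fmap In` is the subst step
    -- A[A[mu X.A/X]/X] <= A[mu X.A/X] in the derivation above.
    unroll :: Functor f => Mu f -> f (Mu f)
    unroll = fold (fmap In)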
What's even cooler than that is we can derive reduction rules:
Say we have a derivation
A[A'/X] <= A'
-------------lfp
mu X. A <= A'
Then there is an upcast
<A' <--< mu X. A>V
We can derive a reduction rule for this cast when it is applied to roll:
<A' <--< mu X.A>roll(V)
= <A' <--< mu X.A><mu X. A <--< A[mu X.A/X]>V
= <A' <--< A[A'/X]><A[A'/X] <--< A[mu X.A/X]>V
By coherence of upcasts, we need only verify that appropriate type
dynamism derivations can be constructed:
1. A[A'/X] <= A'
2. A <= A and mu X.A <= A'
-----------------------------subst
A[mu X.A/X] <= A[A'/X]
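Concretely, in the Haskell reading the derived reduction is just the
defining equation of the recursor, fold alg (In v) = alg (fmap (fold
alg) v). For instance, with a hypothetical natural-number functor
(NatF, Nat and toInt are illustrative names of mine):

    {-# LANGUAGE DeriveFunctor #-}

    -- Mu and fold as in the earlier sketch
    newtype Mu f = In (f (Mu f))

    fold :: Functor f => (f a -> a) -> Mu f -> a
    fold alg (In x) = alg (fmap (fold alg) x)

    data NatF x = Zero | Succ x deriving Functor
    type Nat = Mu NatF

    -- A fold out of Nat, the analogue of an upcast derived from the lfp
    -- rule; one reduction step on a rolled value:
    --   toInt (In (Succ n)) = alg (Succ (toInt n)) = toInt n + 1
    toInt :: Nat -> Int
    toInt = fold alg
      where
        alg Zero     = 0
        alg (Succ n) = n + 1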
TODO: look at the downcasts involving mu, and dualize everything for
computation types.