\section{Focusing on an implementation}
Call-by-push-value with complex values and stacks is odd from an
operational perspective.
%
Values, rather than being simple trees built out of their
constructors, can perform pattern matching on free variables, which
would mean that they seemingly need to be reduced operationally, when
they are expected to be inert.
%
Dually, stacks, rather than being simple composites of
\emph{destructors}, can also consist of $\lambda$s and code tuples,
which are expected to \emph{delay} evaluation of their bodies in an
operational semantics, whereas a stack is expected to \emph{force} the
evaluation of the term plugged into its hole.
%
Levy resolves these seeming oddities by showing that as long as the
values and stacks occur inside a larger term, the ``complex'' portions
can be \emph{compiled away}.
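%
As a small, informal illustration (a representative instance of the
translation rather than Levy's official statement of it, and writing
the value-level match with the same $\case$ notation as the
term-level one), a complex value that matches on a free variable
under $\ret$ can be hoisted to a term-level case whose branches
return simple values:
\begin{mathpar}
\ret (\case z \{ \sigma x \mapsto V \pipe \sigma' x' \mapsto V' \})
\quad\mbox{compiles to}\quad
\case z \{ \sigma x \mapsto \ret V \pipe \sigma' x' \mapsto \ret V' \}
\end{mathpar}
where $V$ and $V'$ are already simple values.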
%
Today, many years later, with the benefit of much hindsight, we can
see Levy's proof as an application of the method of \emph{focusing}.
Here we adapt that proof to obtain an operational semantics for
\emph{Gradual} CBPV.
%
If we focus even more intensely, we can make all upcasts between
positive connectives implicit, but allowing positive variables rules
out that possibility.
\begin{figure}[H]
\mbox{Values: $\Gamma \vdash V : A$}\\
\begin{mathpar}
\inferrule
{\Gamma \vdash \hat V : A_1 \and A_1 \ltdyn A_2}
{\Gamma \vdash \upcast {A_1}{A_2} \hat V : A_2}
\end{mathpar}
\mbox{Value Constructors: $\Gamma \vdash\hat V : A$}\\
\begin{mathpar}
\inferrule
{x : A \in \Gamma}
{\Gamma \vdash x : A}
\inferrule
{\Gamma \vdash V : A \and\Gamma \vdash V' : A'}
{\Gamma \vdash ( V, V') : A \times A'}
\inferrule
{\Gamma \vdash V : A}
{\Gamma \vdash \sigma_{A,A'} V : A + A'}
\inferrule
{\Gamma \vdash V' : A'}
{\Gamma \vdash \sigma_{A,A'}' V' : A + A'}
\inferrule
{}
{\Gamma \vdash () : 1}
\inferrule
{\Gamma \vdash M : \u B}
{\Gamma \vdash \thunk M : U \u B}
\end{mathpar}
\mbox{Terms: $\Gamma \vdash M : \u B$}
\begin{mathpar}
\inferrule
{}
{\Gamma \vdash \err_{\u B} : \u B}
\inferrule
{\Gamma \vdash V : A}
{\Gamma \vdash \ret V : \u F A}
\inferrule
{\Gamma \vdash V : U \u B\and
\Gamma \pipe [ \u B ] \vdash S : \u C
}
{\Gamma \vdash \force V; S : \u C}
\inferrule
{\Gamma, x : A \vdash M : \u B}
{\Gamma \vdash \lambda x : A. M : A \to \u B}
\inferrule
{}
{\Gamma \vdash [] : \top}
\inferrule
{\Gamma \vdash M : \u B\and
\Gamma \vdash M' : \u B'}
{\Gamma \vdash [\pi \mapsto M \pipe \pi' \mapsto M'] : \u B \wedge \u B'}
\inferrule
{\Gamma \vdash V : A \times A'\and
\Gamma, x : A, x': A' \vdash M : \u B}
{\Gamma \vdash \lett (x,x') = V; M : \u B}
\inferrule
{\Gamma \vdash V : A + A'\and
\Gamma , x:A \vdash M : \u B\and
\Gamma , x:A' \vdash M' : \u B}
{\Gamma \vdash \case V \{ \sigma x \mapsto M \pipe \sigma' x' \mapsto M' \} : \u B}
\inferrule
{\Gamma \vdash \hat M : \u B_2 \and \u B_1 \ltdyn \u B_2}
{\Gamma \vdash \dncast{\u B_1}{\u B_2} \hat M : \u B_1}
\end{mathpar}
\mbox{Spines $\Gamma \pipe [ \u B ] \vdash S : \u C$}
\begin{mathpar}
\inferrule
{\Gamma \pipe [ \u B_1] \vdash S : \u C \and \u B_1 \ltdyn \u B_2}
{\Gamma \pipe [\u B_2] \vdash \dncast{\u B_1}{\u B_2}; S : \u C}
\end{mathpar}
\mbox{Computation Destructors $\Gamma\pipe [ \u B ] \vdash \hat S : \u C$}
\begin{mathpar}
\inferrule
{}
{\Gamma \pipe [\u B ] \vdash \bullet : \u B}
\inferrule
{\Gamma\pipe [\u B] \vdash S : \u C \and
\Gamma \vdash V : A}
{\Gamma\pipe [ A \to \u B ] \vdash 'V; S : \u C}
\inferrule
{\Gamma \pipe [\u B]\vdash S : \u C}
{\Gamma \pipe [\u B \wedge \u B'] \vdash \pi; S : \u C}
\inferrule
{\Gamma \pipe [\u B']\vdash S : \u C}
{\Gamma \pipe [\u B \wedge \u B'] \vdash \pi'; S : \u C}
\inferrule
{\Gamma, x : A \vdash M : \u C}
{\Gamma \pipe [\u F A] \vdash \too x. M : \u C}
\end{mathpar}
\caption{Operational Gradual Call By Push Value (Sketchy)}
\end{figure}
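To fix intuitions, the grammar of the figure can be transcribed as a
handful of datatypes. The following Haskell sketch is only
illustrative: the constructor names are ours, types are represented
as strings, neither typing nor type-precision side conditions are
enforced, and we read a spine as zero or more downcasts in front of a
destructor (the figure is deliberately sketchy on this point).
\begin{verbatim}
module FocusedCBPV where

-- Illustrative transcription of the focused grammar (names ours).
type VTy = String   -- value types A
type CTy = String   -- computation types B

-- Values: an upcast wrapped around a value constructor.
data Val  = Upcast VTy VTy ValC              -- <A1 => A2> V^

-- Value constructors.
data ValC = Var String                       -- x
          | Pair Val Val                     -- (V, V')
          | Inl Val | Inr Val                -- sigma V / sigma' V'
          | Unit                             -- ()
          | Thunk Tm                         -- thunk M

-- Terms (computations).
data Tm   = Err                              -- error
          | Ret Val                          -- ret V
          | Force Val Spine                  -- force V; S
          | Lam String VTy Tm                -- \x:A. M
          | NilTuple                         -- []
          | Tuple Tm Tm                      -- [pi -> M | pi' -> M']
          | LetPair Val String String Tm     -- let (x,x') = V; M
          | Case Val String Tm String Tm     -- case V { sigma x -> M | sigma' x' -> M' }
          | Dncast CTy CTy Tm                -- <B1 <= B2> M

-- Spines: zero or more downcasts in front of a destructor.
data Spine = SpDncast CTy CTy Spine          -- <B1 <= B2>; S
           | SpDest Dest

-- Computation destructors.
data Dest  = Done                            -- the empty spine
           | Arg Val Spine                   -- 'V; S
           | Pi Spine | Pi' Spine            -- pi; S / pi'; S
           | Bind String Tm                  -- to x. M
\end{verbatim}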
\section{The Notes we Don't Play}
From a ``completionist'' perspective, call-by-push-value is missing
some interesting connectives that are easy to define.
%
When these connectives are added to call-by-push-value, the resulting
language is called the enriched effect calculus (EEC), which has been
studied extensively (cite).
First, there are three missing multiplicative connectives: the pure
function space $A \Rightarrow A'$, the linear function space $\u B
\multimap \u B'$, and the tensor product of a value type and a
computation type $A \otimes \u B$.
%
Since they are problematic, I will only describe their sorts and
their invertible sequent calculus rules:
\begin{mathpar}
\inferrule
{A \vtype \and A' \vtype}
{A \Rightarrow A' \vtype}
\inferrule
{\Gamma, A \vdash^V A'}
{\Gamma \vdash^V A \Rightarrow A'}
\inferrule
{\u B \ctype \and \u B' \ctype}
{\u B \multimap \u B' \vtype}
\inferrule
{\Gamma \pipe \u B \vdash \u B'}
{\Gamma \vdash \u B \multimap \u B'}
\inferrule
{A \vtype \and \u B \ctype}
{A \otimes \u B \ctype}
\inferrule
{\Gamma, A \pipe \u B \vdash \u C}
{\Gamma \pipe A \otimes \u B \vdash \u C}
\end{mathpar}
These connectives are problematic for two reasons.
%
First, they are ``boundary-crossing'' connectives, in that each of
them has either a \emph{covariant} argument whose sort differs from
the sort of the constructor or a \emph{contravariant} argument whose
sort is the same as that of the constructor.
%
The pure function space has a contravariant argument of the same
sort; the linear function space has a covariant computation-type
argument while it is itself a value type; and the value--computation
tensor has a covariant value-type argument while it is itself a
computation type.
Second, from the perspective of our focusing operational semantics,
each of them violates the rule of our focusing system that the only
negative value type is $U$ and the only positive computation type is
$\u F$.
%
Note that this is similar to, but not the same as, the
boundary-crossing rule: there are some \emph{additives} that violate
the focusing restriction but not the boundary-crossing restriction,
namely the negative value product and the positive computation sum,
which we show now.
\begin{mathpar}
\inferrule
{A \vtype \and A' \vtype}
{A \& A' \vtype}
\inferrule
{\Gamma \vdash A \and \Gamma \vdash A'}
{\Gamma \vdash A \& A'}
\inferrule
{\u B \ctype \and \u B' \ctype}
{\u B \oplus \u B' \ctype}
\inferrule
{{\Gamma \pipe \u B \vdash \u C} \and
{\Gamma \pipe \u B' \vdash \u C}}
{\Gamma \pipe \u B \oplus \u B' \vdash \u C}
\end{mathpar}
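To make the sort bookkeeping concrete, the following Haskell sketch
(again with names of our own choosing, using GADTs and DataKinds)
indexes each type former by the sort it constructs; the comments
record which arguments make the EEC multiplicatives boundary-crossing
and which additives merely violate the focusing restriction.
\begin{verbatim}
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}
module Sorts where

import Data.Kind (Type)

-- The two sorts of CBPV types.
data Sort = VSort | CSort

-- Type formers, indexed by the sort they construct.
data Ty :: Sort -> Type where
  -- core CBPV connectives
  U      :: Ty 'CSort -> Ty 'VSort               -- U B   (thunks)
  F      :: Ty 'VSort -> Ty 'CSort               -- F A   (returners)
  Times  :: Ty 'VSort -> Ty 'VSort -> Ty 'VSort  -- A x A'
  Plus   :: Ty 'VSort -> Ty 'VSort -> Ty 'VSort  -- A + A'
  Arr    :: Ty 'VSort -> Ty 'CSort -> Ty 'CSort  -- A -> B
  With   :: Ty 'CSort -> Ty 'CSort -> Ty 'CSort  -- B /\ B'
  -- EEC multiplicatives (boundary-crossing)
  PureFn :: Ty 'VSort -> Ty 'VSort -> Ty 'VSort  -- A => A'  contravariant arg of the same sort
  LinFn  :: Ty 'CSort -> Ty 'CSort -> Ty 'VSort  -- B -o B'  covariant computation arg, value sort
  Tensor :: Ty 'VSort -> Ty 'CSort -> Ty 'CSort  -- A (x) B  covariant value arg, computation sort
  -- additives: violate focusing (a negative value type other than U,
  -- a positive computation type other than F) but not boundary-crossing
  VWith  :: Ty 'VSort -> Ty 'VSort -> Ty 'VSort  -- A & A'
  CPlus  :: Ty 'CSort -> Ty 'CSort -> Ty 'CSort  -- B (+) B'
\end{verbatim}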