\section{Focusing on an implementation}

Call-by-push-value with complex values and stacks is odd from an
operational perspective.
%
Values, rather than being simple trees built out of their
constructors, can perform pattern matching on free variables, which
would mean that they seemingly need to be reduced operationally, even
though values are expected to be inert.
%
Dually, stacks, rather than being simple composites of
\emph{destructors}, can also contain $\lambda$s and code tuples,
which an operational semantics expects to \emph{delay} evaluation of
their bodies, whereas a stack is expected to \emph{force} evaluation
of the term plugged into its hole.
%
Levy resolves these seeming oddities by showing that as long as the
values and stacks occur inside a larger term, the ``complex'' portions
can be \emph{compiled away}.
%
Today, many years later, with the benefit of much hindsight, we can
see Levy's proof as an application of the method of \emph{focusing}.
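%
For instance, a complex value that pattern-matches a free variable can
be hoisted to the term level; one representative case of the
translation (a sketch only, written in the notation of the figure
below) sends
\[
  \ret (\case x \{ \sigma y \mapsto V \pipe \sigma' y' \mapsto V' \})
\]
to
\[
  \case x \{ \sigma y \mapsto \ret V \pipe \sigma' y' \mapsto \ret V' \},
\]
so that pattern matching happens only in terms, where reduction is
already expected.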

Here we adapt that proof to obtain an operational semantics for
\emph{Gradual} CBPV.
%
If we focus even more intensely, we can make all upcasts between
positive connectives implicit, but allowing positive variables rules
out that possibility.
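%
To see the obstruction, consider a variable $x : A_1$ with $A_1 \ltdyn
A_2$ (a sketch of the argument, not a formal claim): an implicit
upcast would have to be pushed under the constructors of the value it
is applied to, but $x$ exposes no constructor, so the cast must remain
explicit, as in
\[
  \Gamma, x : A_1 \vdash \upcast{A_1}{A_2} x : A_2.
\]
If every variable instead had a thunk type $U \u B$, every value of
positive type would begin with a constructor and such upcasts could be
erased.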

\begin{figure}[H]
  \mbox{Values: $\Gamma \vdash V : A$}\\
  \begin{mathpar}
    \inferrule
    {\Gamma \vdash \hat V : A_1 \and A_1 \ltdyn A_2}
    {\Gamma \vdash \upcast {A_1}{A_2} \hat V : A_2}
  \end{mathpar}
  \mbox{Value Constructors: $\Gamma \vdash\hat V : A$}\\
  \begin{mathpar}
    \inferrule
    {x : A \in \Gamma}
    {\Gamma \vdash x : A}

    \inferrule
    {\Gamma \vdash V : A \and\Gamma \vdash V' : A'}
    {\Gamma \vdash ( V,  V') : A \times A'}

    \inferrule
    {\Gamma \vdash  V : A}
    {\Gamma \vdash \sigma_{A,A'}  V : A + A'}

    \inferrule
    {\Gamma \vdash  V' : A'}
    {\Gamma \vdash \sigma_{A,A'}'  V' : A + A'}

    \inferrule
    {}
    {\Gamma \vdash () : 1}

    \inferrule
    {\Gamma \vdash M : \u B}
    {\Gamma \vdash \thunk M : U \u B}
  \end{mathpar}

  \mbox{Terms: $\Gamma \vdash M : \u B$}
  \begin{mathpar}
    \inferrule
    {}
    {\Gamma \vdash \err_{\u B} : \u B}

    \inferrule
    {\Gamma \vdash V : A}
    {\Gamma \vdash \ret V : \u F A}

    \inferrule
    {\Gamma \vdash V : U \u B\and
      \Gamma \pipe [ \u B ] \vdash S : \u C
    }
    {\Gamma \vdash \force V; S : \u C}

    \inferrule
    {\Gamma, x : A \vdash M : \u B}
    {\Gamma \vdash \lambda x : A. M : A \to \u B}

    \inferrule
    {}
    {\Gamma \vdash [] : \top}

    \inferrule
    {\Gamma \vdash M : \u B\and
      \Gamma \vdash M' : \u B'}
    {\Gamma \vdash [\pi \mapsto M \pipe \pi' \mapsto M'] : \u B \wedge \u B'}

    \inferrule
    {\Gamma \vdash  V : A \times A'\and
      \Gamma, x : A, x': A' \vdash M : \u B}
    {\Gamma \vdash \lett (x,x') =  V; M : \u B}

    \inferrule
    {\Gamma \vdash  V : A + A'\and
      \Gamma , x:A \vdash M : \u B\and
      \Gamma , x:A' \vdash M' : \u B}
    {\Gamma \vdash \case  V \{ \sigma x \mapsto M \pipe \sigma' x' \mapsto M' \} : \u B}

    \inferrule
    {\Gamma \vdash \hat M : \u B_2 \and \u B_1 \ltdyn \u B_2}
    {\Gamma \vdash \dncast{\u B_1}{\u B_2} \hat M : \u B_1}
  \end{mathpar}

  \mbox{Spines: $\Gamma \pipe [ \u B ] \vdash S : \u C$}
  \begin{mathpar}
    \inferrule
    {\Gamma \pipe [ \u B_1] \vdash S : \u C \and \u B_1 \ltdyn \u B_2}
    {\Gamma \pipe [\u B_2] \vdash \dncast{\u B_1}{\u B_2}; S : \u C}
  \end{mathpar}

  \mbox{Computation Destructors: $\Gamma\pipe [ \u B ] \vdash \hat S : \u C$}
  \begin{mathpar}
    \inferrule
    {}
    {\Gamma \pipe [\u B ] \vdash \bullet : \u B}

    \inferrule
    {\Gamma\pipe [\u B] \vdash S : \u C \and
      \Gamma \vdash V : A}
    {\Gamma\pipe [ A \to \u B ] \vdash 'V; S : \u C}

    \inferrule
    {\Gamma \pipe [\u B]\vdash S : \u C}
    {\Gamma \pipe [\u B \wedge \u B'] \vdash \pi; S : \u C}

    \inferrule
    {\Gamma \pipe [\u B']\vdash S : \u C}
    {\Gamma \pipe [\u B \wedge \u B'] \vdash \pi'; S : \u C}

    \inferrule
    {\Gamma, x : A \vdash M : \u C}
    {\Gamma \pipe [\u F A] \vdash \too x. M : \u C}
  \end{mathpar}
  \caption{Operational Gradual Call-By-Push-Value (Sketchy)}
\end{figure}
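
To illustrate the spine discipline (a sketch; we elide the cast
annotations that the value and spine judgments would otherwise thread
through), a call to a thunked function $f : U(A \to \u F A)$ with
argument $x : A$ is written
\[
  \force f;\, 'x;\, \too y.\, \ret y : \u F A,
\]
where the spine supplies the argument and then names the returned
value, so the eliminations that ordinary CBPV would spread over
several evaluation-context frames are collected into a single spine.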

\section{The Notes we Don't Play}

From a ``completionist'' perspective, call-by-push-value is missing
some interesting connectives that are easy to define.
%
When these are added to call-by-push-value, the resulting language is
called the enriched effect calculus (EEC), which has been studied
extensively (cite).

First, there are three missing multiplicative connectives: the pure
function space $A \Rightarrow A'$, the linear function space $\u B
\multimap \u B'$, and the tensor product of a value type and a
computation type $A \otimes \u B$.
%
Since they are problematic, I will only describe their sorts and their
invertible sequent calculus rules:

\begin{mathpar}
  \inferrule
  {A \vtype \and A' \vtype}
  {A \Rightarrow A' \vtype}

  \inferrule
  {\Gamma, A \vdash^V A'}
  {\Gamma \vdash^V A \Rightarrow A'}

  \inferrule
  {\u B \ctype \and \u B' \ctype}
  {\u B \multimap \u B' \vtype}

  \inferrule
  {\Gamma \pipe \u B \vdash \u B'}
  {\Gamma \vdash \u B \multimap \u B'}

  \inferrule
  {A \vtype \and \u B \ctype}
  {A \otimes \u B \ctype}

  \inferrule
  {\Gamma, A \pipe \u B \vdash \u C}
  {\Gamma \pipe A \otimes \u B \vdash \u C}
\end{mathpar}

These connectives are problematic for two reasons.  First, they are
``boundary-crossing'' connectives in that each has either a
\emph{covariant} argument whose sort differs from the sort of the
connective itself or a \emph{contravariant} argument whose sort is the
same as that of the connective.
%
The pure function space has a contravariant argument of its own sort;
the linear function space is a value type with a covariant
computation-type argument; and the value--computation tensor is a
computation type with a covariant value-type argument.

Second, from the perspective of our focusing operational semantics,
each of them violates the rule of our focusing system that the only
negative value type is $U$ and the only positive computation type is
$\u F$.
%
Note that this is similar to, but not the same as, the
boundary-crossing restriction: there are some \emph{additive}
connectives that violate the focusing restriction but not the
boundary-crossing one, namely the negative value product and the
positive computation sum, which we show now.

\begin{mathpar}
  \inferrule
  {A \vtype \and A' \vtype}
  {A \& A' \vtype}

  \inferrule
  {\Gamma \vdash^V A \and \Gamma \vdash^V A'}
  {\Gamma \vdash^V A \& A'}

  \inferrule
  {\u B \ctype \and \u B' \ctype}
  {\u B \oplus \u B' \ctype}

  \inferrule
  {{\Gamma \pipe \u B \vdash \u C} \and
  {\Gamma \pipe \u B' \vdash \u C}}
  {\Gamma \pipe \u B \oplus \u B' \vdash \u C}
\end{mathpar}
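
In a fully polarized discipline these two additives would typically be
recovered through the shifts rather than added as primitives; a
plausible (but unverified here) reading is
\[
  A \& A' \;\approx\; U(\u F A \wedge \u F A'),
  \qquad
  \u B \oplus \u B' \;\approx\; \u F(U \u B + U \u B'),
\]
which routes the change of polarity through $U$ and $\u F$, in keeping
with the focusing restriction.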