Commit db994beb authored by Eric Giovannini

changes to layout and prose of paper

parent 45e0fe23
@@ -34,6 +34,10 @@
\newcommand{\up}[2]{\langle{#2}\uarrowl{#1}\rangle}
\newcommand{\dn}[2]{\langle{#1}\darrowl{#2}\rangle}
\newcommand{\upc}[2]{\text{up}\,{#1}\,{#2}}
\newcommand{\dnc}[2]{\text{dn}\,{#1}\,{#2}}
\newcommand{\ret}{\mathsf{ret}}
\newcommand{\err}{\mho}
\newcommand{\zro}{\textsf{zro}}
@@ -80,6 +84,9 @@
\newcommand{\pertdyn}[2]{\text{pert-dyn}({#1}, {#2})}
\newcommand{\delaypert}[1]{\text{delay-pert}({#1})}
\newcommand{\pertc}{\text{Pert}_{\text{C}}}
\newcommand{\pertv}{\text{Pert}_{\text{V}}}
% SGDT and Intensional Stuff
@@ -8,7 +8,7 @@ relations, then show how to give them a semantics using SGDT.
% TODO mention intensional syntax
\subsection{Term Precision for GTLC}\label{sec:gtlc-term-precision-axioms}
% ---------------------------------------------------------------------------------------
% ---------------------------------------------------------------------------------------
@@ -375,10 +375,10 @@ More concretely, consider a simplified version of the DnL rule shown below:
\begin{mathpar}
\inferrule*{M \ltdyn_i N : B}
{\dnc{c}{M} \ltdyn_i N : c}
\end{mathpar}
If $c$ is inj-arr, then when we downcast $M$ from $\dyn$ to $\dyntodyn$,
semantically this will involve a $\theta$ because the value of type $\dyn$
in the semantics will contain a \emph{later} function $\tilde{f}$.
Thus, in order for the right-hand side to be related to the downcast,
@@ -387,7 +387,7 @@ we need to insert a delay on the right.
The need for delays affects the cast rules involving upcasts as well, because
the upcast for functions involves a downcast on the domain:
\[ \up{A_i \ra A_o}{B_i \ra B_o}{M} \equiv \lambda (x : B_i). \up{A_o}{B_o}(M\, (\dn {A_i}{B_i} x)). \]
Thus, the correct versions of the cast rules involve delays on the side that was not cast.
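For concreteness, here is a sketch of how the corrected downcast rule might read, writing $\delta\, N$ informally for $N$ preceded by one delay step (this shorthand is ours, not taken from the diff):
\begin{mathpar}
\inferrule*{M \ltdyn_i N : B}
{\dnc{c}{M} \ltdyn_i \delta\, N : c}
\end{mathpar}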
@@ -418,11 +418,11 @@ for embeddings $\perte$ and for projections $\pertp$ by the following rules:
{\delta_c \ra \delta_d : \pertp (A \ra B)}
\inferrule
{\delta_\nat : \perte \nat \and \delta_f : \perte (\dyntodyn)}
{\pertdyn{\delta_\nat}{\delta_f} : \perte \dyn}
\inferrule
{\delta_\nat : \pertp \nat \and \delta_f : \pertp (\dyntodyn)}
{\pertdyn{\delta_\nat}{\delta_f} : \pertp \dyn}
\end{mathpar}
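As a small worked instance of the $\dyn$ rule (our own example; we assume, purely for illustration, that an identity perturbation $\mathrm{id}_A$ is available at every type $A$): from $\mathrm{id}_\nat : \perte \nat$ and $\mathrm{id}_{\dyntodyn} : \perte (\dyntodyn)$ the rule gives
\[ \pertdyn{\mathrm{id}_\nat}{\mathrm{id}_{\dyntodyn}} : \perte \dyn, \]
a perturbation on the dynamic type assembled from perturbations on its two summands.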
\section{Technical Background}\label{sec:technical-background}
\subsection{Gradual Typing}
% Cast calculi
In a gradually-typed language, the mixing of static and dynamic code is seamless: the dynamically
typed parts are checked at runtime rather than statically. This type checking occurs at the elimination
forms of the language (e.g., pattern matching, field reference).
Gradual languages are generally elaborated to a \emph{cast calculus}, in which the dynamic
type checking is made explicit through the insertion of \emph{type casts}.
% Up and down casts
In a cast calculus, there is a relation $\ltdyn$ on types such that $A \ltdyn B$ means that
$A$ is a \emph{more precise} type than $B$.
There is a dynamic type $\dyn$ with the property that $A \ltdyn\, \dyn$ for all $A$.
%
If $A \ltdyn B$, a term $M$ of type $A$ may be \emph{up}cast to $B$, written $\up A B M$,
and a term $N$ of type $B$ may be \emph{down}cast to $A$, written $\dn A B N$.
Upcasts always succeed, while downcasts may fail at runtime.
%
% Syntactic term precision
We also have a notion of \emph{syntactic term precision}.
If $A \ltdyn B$, and $M$ and $N$ are terms of type $A$ and $B$ respectively, we write
$M \ltdyn N : A \ltdyn B$ to mean that $M$ is more precise than $N$, i.e., $M$ and $N$
behave the same except that $M$ may error more.
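As a concrete illustration (a standard example we add here; it does not appear in the surrounding text), instantiate the casts at $\nat \ltdyn \dyn$ with a value $V$ of type $\nat$ and a value $W$ of type $\dyntodyn$: upcasting $V$ and immediately downcasting it recovers $V$, while downcasting a dynamic value that was injected from the function type fails:
\[ \dn{\nat}{\dyn}{(\up{\nat}{\dyn}{V})} \equiv V \qquad\qquad \dn{\nat}{\dyn}{(\up{\dyntodyn}{\dyn}{W})} \equiv \err \]
Here $\equiv$ is meant loosely as observational equivalence; in the graduality framework the first equation is usually stated as an order-equivalence, i.e., term precision in both directions.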
% Modeling the dynamic type as a recursive sum type?
% Observational equivalence and approximation?
% synthetic guarded domain theory, denotational semantics therein
\subsection{Operational Reduction Proofs}
\subsection{Classical Domain Models}
New and Licata \cite{new-licata18} developed an axiomatic account of the
graduality relation on cast calculus terms and gave a denotational
model of this calculus using classical domain theory based on
$\omega$-CPOs. This semantics has scaled up to an analysis of a
dependently typed gradual calculus in \cite{asdf}. This meets our
criterion of being a reusable mathematical theory, as general semantic
theorems about gradual domains can be developed independent of any
particular syntax and then reused in many different denotational
models. However, it is widely believed that such classical domain-theoretic
techniques cannot be extended to model higher-order store, a
standard feature of realistic gradually typed languages such as Typed
Racket. Thus, if we want a reusable mathematical theory of gradual
typing that can scale to realistic programming languages, we need to
look elsewhere, namely to so-called ``step-indexed'' techniques.
A series of works \cite{new-ahmed2018, new-licata-ahmed2019, new-jamner-ahmed19}
developed step-indexed logical relations models of gradually typed
languages based on operational semantics. Unlike classical domain
theory, such step-indexed techniques are capable of modeling
higher-order store and runtime-extensible dynamic types
\cite{amalsthesis,nonpmetricparamorsomething,new-jamner-ahmed19}. However,
their proof developments are highly repetitive and technical, with
each development formulating a logical relation from first principles
and proving many of the same tedious lemmas without reusable
mathematical abstractions. Our goal in the current work is to extract
these reusable mathematical principles from these tedious models to
make formalization of realistic gradual languages tractable.
\subsection{Difficulties in Prior Semantics}
% Difficulties in prior semantics
@@ -76,9 +20,8 @@ make formalization of realistic gradual languages tractable.
%
Reasoning about step-indexed logical relations
can be tedious and error-prone, and there are some very subtle aspects that must
be taken into account in the proofs.
%
In particular, the prior approach of New and Ahmed requires two separate logical
relations for terms, one in which the steps of the left-hand term are counted,
and another in which the steps of the right-hand term are counted.
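Schematically, and with notation we invent purely for illustration (the cited works use their own), the two families have the shape
\[ M \ltdyn_i^{L} N \qquad\text{and}\qquad M \ltdyn_i^{R} N, \]
where in the first relation the index $i$ bounds the steps taken by the left-hand term $M$, and in the second it bounds the steps taken by the right-hand term $N$.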
@@ -128,7 +128,7 @@ as well as a $\beta$ and $\eta$ law for bind.
The type precision rules specify what it means for a type $A$ to be more precise than $A'$.
We have reflexivity rules for $\dyn$ and $\nat$, as well as rules that $\nat$ is more precise than $\dyn$
and $\dyntodyn$ is more precise than $\dyn$.
We also have a congruence rule for function types stating that given $A_i \ltdyn A'_i$ and $A_o \ltdyn A'_o$, we can prove
$A_i \ra A_o \ltdyn A'_i \ra A'_o$. Note that precision is covariant in both the domain and codomain.
Finally, we can lift a relation on value types $A \ltdyn A'$ to a relation $\Ret A \ltdyn \Ret A'$ on
@@ -150,10 +150,10 @@ computation types.
\inferrule*[right = $\textsf{Inj}_{\ra}$]
{ }
{(\dyntodyn) \ltdyn\, \dyn}
\inferrule*[right = $\injarr{}$]
{(R \ra S) \ltdyn\, (\dyntodyn)}
{(R \ra S) \ltdyn\, \dyn}
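As a worked instance of these precision rules (our own example, not part of the diff): applying the congruence rule for functions to $\nat \ltdyn \dyn$ in both the domain and codomain gives $\nat \ra \nat \ltdyn \dyntodyn$, and the $\injarr{}$ rule then yields
\[ \nat \ra \nat \,\ltdyn\, \dyn. \]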
@@ -182,20 +182,20 @@ This notion will be used below in the statement of transitivity of the term precision
\subsection{Removing Casts as Primitives}
% We now observe that all casts, except those between $\nat$ and $\dyn$
% and between $\dyntodyn$ and $\dyn$, are admissible, in the sense that
% we can start from $\extlcm$, remove casts except the aforementioned ones,
% and in the resulting language we will be able to derive the other casts.
We now observe that all casts, except those between $\nat$ and $\dyn$
and between $\dyntodyn$ and $\dyn$, are admissible.
That is, consider a new language ($\extlcprime$) in which,
instead of having arbitrary casts, we have injections from $\nat$ and
$\dyntodyn$ into $\dyn$, and a case inspection on $\dyn$.
We claim that in $\extlcprime$, all of the casts present in $\extlc$ are derivable.
It suffices to verify that casts for function types are derivable.
This holds because function casts are constructed inductively from the casts
of their domain and codomain. The base case is one of the casts involving $\nat$
or $\dyntodyn$, which are present in $\extlcprime$ as injections and case inspections.
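To make the induction concrete, here is a sketch (our own illustration, not taken from the diff, eliding the delay bookkeeping and the value/computation distinction) of how the upcast from $\nat \ra \nat$ to $\dyn$ can be derived in $\extlcprime$: factor it through $\dyntodyn$ using the function-cast decomposition shown earlier, then apply the primitive injection:
\[ \up{\nat \ra \nat}{\dyn}{M} \;\equiv\; \injarr{\big(\lambda (x : \dyn).\, \injnat{(M\, (\dn{\nat}{\dyn}{x}))}\big)}, \]
where the domain downcast $\dn{\nat}{\dyn}{x}$ is itself implemented by case inspection on $\dyn$, erroring in the function branch.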
The resulting calculus $\extlcprime$ now lacks arbitrary casts as a primitive notion:
@@ -228,14 +228,14 @@ for case-nat (the rules for case-arrow are analogous).
% inj-arr
\inferrule*
{\hasty \Gamma M (\dyntodyn)}
{\hasty \Gamma {\injarr M} \dyn}
% Case dyn
\inferrule*
{\hasty{\Delta|_V}{V}{\dyn} \and
\hasty{\Delta , x : \nat }{M_{nat}}{B} \and
\hasty{\Delta , x : (\dyntodyn) }{M_{fun}}{B}
}
{\hasty {\Delta} {\casedyn{V}{n}{M_{nat}}{f}{M_{fun}}} {B}}
\end{mathpar}
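Using these primitives, the downcast from $\dyn$ to $\nat$ can plausibly be defined by case inspection (a sketch we add for illustration; it is not part of the diff):
\[ \dn{\nat}{\dyn}{V} \;\equiv\; \casedyn{V}{n}{\ret\, n}{f}{\err}, \]
returning the underlying number in the $\nat$ branch and signalling the error $\err$ if the dynamic value holds a function.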
@@ -252,7 +252,7 @@ for case-nat (the rules for case-arrow are analogous).
{\casedyn {\injnat {V}} {n} {M_{nat}} {f} {M_{fun}} = M_{nat}[V/n]}
\inferrule*
{\hasty \Gamma V {\dyntodyn} }
{\casedyn {\injarr {V}} {n} {M_{nat}} {f} {M_{fun}} = M_{fun}[V/f]}
% Case-dyn Eta
@@ -405,7 +405,7 @@ as a monotone function from $\sem{\Gamma}$ to $\li \sem{A}$.
Recall that $\Dyn$ is isomorphic to $\Nat\, + \later (\Dyn \to \li \Dyn)$.
Thus, the semantics of $\injnat{\cdot}$ is simply $\inl$ and the semantics
of $\injarr{\cdot}$ is simply $\inr \circ \nxt$.
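Spelled out as equations (a direct transcription of the preceding description; we write $\gamma$ for an environment in $\sem{\Gamma}$):
\[ \sem{\injnat{V}}\,\gamma = \inl\,(\sem{V}\,\gamma) \qquad\qquad \sem{\injarr{V}}\,\gamma = \inr\,(\nxt\,(\sem{V}\,\gamma)) \]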
The semantics of case inspection on dyn performs a case analysis on the sum.
The interpretation of $\lda{x}{M}$ works as follows. Recall by the typing rule for