Commit db994beb authored by Eric Giovannini's avatar Eric Giovannini

changes to layout and prose of paper

parent 45e0fe23
@@ -34,6 +34,10 @@
\newcommand{\up}[2]{\langle{#2}\uarrowl{#1}\rangle}
\newcommand{\dn}[2]{\langle{#1}\darrowl{#2}\rangle}
\newcommand{\upc}[2]{\text{up}\,{#1}\,{#2}}
\newcommand{\dnc}[2]{\text{dn}\,{#1}\,{#2}}

\newcommand{\ret}{\mathsf{ret}}
\newcommand{\err}{\mho}
\newcommand{\zro}{\textsf{zro}}
@@ -80,6 +84,9 @@
\newcommand{\pertdyn}[2]{\text{pert-dyn}({#1}, {#2})}
\newcommand{\delaypert}[1]{\text{delay-pert}({#1})}
\newcommand{\pertc}{\text{Pert}_{\text{C}}}
\newcommand{\pertv}{\text{Pert}_{\text{V}}}

% SGDT and Intensional Stuff
@@ -2,18 +2,24 @@
% gradual typing, graduality
\subsection{Gradual Typing and Graduality}

In modern programming language design, there is a tension between \emph{static} typing
and \emph{dynamic} typing disciplines.
With static typing, the code is type-checked at compile time, while with dynamic typing,
type checking is deferred to run time. Both approaches have benefits: with static
typing, the programmer is assured that if the program passes the type-checker, the
program is free of type errors, and moreover, soundness of the equational theory implies
that program optimizations are valid.
Meanwhile, dynamic typing allows the programmer to rapidly prototype
their application code without needing to commit to fixed type signatures for their
functions. Unfortunately, most languages force a choice between static and dynamic typing.
As a result, programmers who initially write their code in a dynamically typed language
in order to benefit from faster prototyping and development time must rewrite the
entire codebase in a statically typed language if they want the benefits of static
typing once their codebase has matured.

\emph{Gradually-typed languages} \cite{siek-taha06, tobin-hochstadt06} seek to resolve this tension
by allowing both static and dynamic typing disciplines to be used in the same codebase,
and by supporting smooth interoperability between statically-typed and dynamically-typed code.
This flexibility allows programmers to begin their projects in a dynamic style and
enjoy the benefits of dynamic typing related to rapid prototyping and easy modification
while the codebase ``solidifies''. Over time, as parts of the code become more mature
@@ -22,34 +28,49 @@ and the programmer is more certain of what the types should be, the code can be
rewrite the project in a completely different language.

%Gradually-typed languages should satisfy two intuitive properties.
% The following two properties have been identified as useful for gradually typed languages.

In order for this to work as expected, gradually-typed languages should allow
different parts of the codebase to be in different places along the spectrum from
dynamic to static, and should allow those different parts to interact with one another.
Moreover, gradually-typed languages should support smooth migration from
dynamic typing to static typing, in that the programmer can initially leave off the
typing annotations and provide them later without altering the meaning of the program.

% Sound gradual typing
Furthermore, the parts of the program that are written in a dynamic style should soundly
interoperate with the parts that are written in a static style.
That is, the interaction between the static and dynamic components of the codebase
should preserve the guarantees made by the static types.
In particular, while statically-typed code can error at runtime in a gradually-typed language,
such an error can always be traced back to a dynamically-typed term that
violated the typing contract imposed by statically typed code.
% In other words, in the statically-typed portions of the codebase, type soundness must be preserved.

% Moreover, gradually-typed languages should allow for
% different parts of the codebase to be in different places along the spectrum from
% dynamic to static, and allow for those different parts to interact with one another.
% In a \emph{sound} gradually-typed language,
% this interaction should respect the guarantees made by the static types.

% Graduality property
Formally speaking, gradually typed languages should satisfy the
\emph{dynamic gradual guarantee}, originally defined by Siek, Vitousek, Cimini,
and Boyland \cite{siek_et_al:LIPIcs:2015:5031}.
This property is also referred to as \emph{graduality} \cite{new-ahmed2018},
by analogy with parametricity.
Intuitively, graduality says that going from a dynamic to a static style should not
introduce changes in the meaning of the program.
More specifically, making the types more precise by adding typing annotations
% (replacing $\dyn$ with a more specific type
% such as integers)
will either result in the same behavior as the original, less precise program,
or will result in a type error.
\subsection{Current Approaches to Proving Graduality}

\subsubsection{From Static to Gradual}
Current approaches to constructing languages that satisfy the graduality property
include the methods of Abstracting Gradual Typing
\cite{garcia-clark-tanter2016} and the formal tools of the Gradualizer \cite{cimini-siek2016}.
These allow the language developer to start with a statically typed language and derive a
gradually typed language that satisfies the gradual guarantee. The main downside to
@@ -58,13 +79,15 @@ the language designer must adhere to the predefined framework.
Many gradually typed languages do not fit into either framework, e.g., Typed Racket
\cite{tobin-hochstadt06, tobin-hochstadt08}.
%
Furthermore, while these frameworks do prove graduality of the resulting languages,
they do not establish the correctness of the equational theory, which is equally important
to sound gradual typing.
For example, programmers often refactor their code, and in so doing they rely
implicitly on the validity of the laws in the equational theory.
Similarly, correctness of compiler optimizations rests on the validity of the
corresponding equations from the equational theory. It is therefore important
that languages that claim to be gradually typed have provably correct
equational theories.

% The approaches are too inflexible... the fact that the resulting semantics are too lazy
% is a consequence of that inflexibility.
@@ -76,34 +99,81 @@ have provably correct equational theories.
%in which the semantics determines the type checking. Without a clear semantic interpretation of type
%dynamism, it becomes difficult to extend these techniques to new language features such as polymorphism.

% [Eric] I moved the next two paragraphs from the technical background section
% to here in the intro.
\subsubsection{Classical Domain Models}
New and Licata \cite{new-licata18} developed an axiomatic account of the
graduality relation on cast calculus terms and gave a denotational
model of this calculus using classical domain theory based on
$\omega$-CPOs. This semantics has scaled up to an analysis of a
dependently typed gradual calculus in \cite{asdf}. This meets our
criterion of being a reusable mathematical theory, as general semantic
theorems about gradual domains can be developed independently of any
particular syntax and then reused in many different denotational
models. However, it is widely believed that such classical domain-theoretic
techniques cannot be extended to model higher-order store, a
standard feature of realistic gradually typed languages such as Typed
Racket. Thus, if we want a reusable mathematical theory of gradual
typing that can scale to realistic programming languages, we need to
look elsewhere, to so-called ``step-indexed'' techniques.

\subsubsection{Operational Reduction Proofs}
A series of works \cite{new-ahmed2018, new-licata-ahmed2019, new-jamner-ahmed19}
developed step-indexed logical relations models of gradually typed
languages based on operational semantics. Unlike classical domain
theory, such step-indexed techniques are capable of modeling
higher-order store and runtime-extensible dynamic types
\cite{amalsthesis,nonpmetricparamorsomething,new-jamner-ahmed19}. However,
their proof developments are highly repetitive and technical, with
each development formulating a logical relation from first principles
and proving many of the same tedious lemmas without reusable
mathematical abstractions. Our goal in the current work is to extract
reusable mathematical principles from these tedious models to
make formalization of realistic gradual languages tractable.

% Alternative phrasing:
\begin{comment}
\subsubsection{Embedding-Projection Pairs}
The series of works by New, Licata, and Ahmed \cite{new-licata18,new-ahmed2018,new-licata-ahmed2019}
develop an axiomatic account of gradual typing involving \emph{embedding-projection pairs}.
This allows for a particularly elegant formulation of the gradual guarantee.
Moreover, their axiomatic account of program equivalence allows for type-based reasoning about program equivalences.

In \cite{new-licata18}, New and Licata construct a denotational model of the axioms of
gradual typing using tools from classical domain theory. The benefit of this approach is that it is a reusable mathematical theory:
general semantic theorems about gradual domains can be developed independently of any particular syntax and then reused in many different denotational models.
Unfortunately, it is unclear how to extend these domain-theoretic techniques to incorporate more advanced language features such as higher-order store, a standard feature of realistic gradually typed languages such as Typed Racket.
Thus, if we want a reusable mathematical theory of gradual typing that can scale to realistic programming languages, we need to look to different techniques.

In \cite{new-ahmed2018,new-licata-ahmed2019}, a more traditional operational semantics for gradual typing is derived from the axioms.
% The axioms are then proven to be sound with respect to this operational semantics by constructing a logical relation.
A logical relations model is constructed and used to prove that the axioms are sound with respect to the operational semantics.
The downside of this approach is that each new language requires a different logical relation
to prove graduality, even though the logical relations for different languages end up sharing many commonalities.
Furthermore, the logical relations tend to be quite complicated due to a technical requirement known as \emph{step-indexing},
where the stepping behavior of terms must be accounted for in the logical relation.
As a result, developments using this approach tend to require vast effort, with the
corresponding technical reports having 50+ pages of proofs.

% In this approach, a logical relation is constructed and used to show that the equational theory
% is sound with respect to the operational semantics.

% Additionally, while the axioms of gradual type theory are compositional at a ``global'' level,
% they do not compose in the step-indexed setting. One of the main goals of the present work
% is to formulate a composable theory of gradual typing in a setting where the stepping behavior
% is tracked.
\end{comment}

An alternative approach, which we investigate in this paper, is provided by
\emph{synthetic guarded domain theory} (SGDT).
The techniques of synthetic guarded domain theory allow us to internalize the
step-indexed reasoning normally required in logical relations proofs of graduality,
ultimately allowing us to specify the logical relation in a manner that looks nearly
identical to a typical, non-step-indexed logical relation.
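For readers unfamiliar with SGDT, the two ingredients we rely on are the \emph{later} modality and guarded fixed points; the following is only a minimal sketch of the interface (we write $\mathsf{fix}$ for the guarded fixed-point combinator purely for this sketch; the precise formulation we use is given in Section \ref{sec:technical-background}):
\[
  \later : \mathsf{Type} \to \mathsf{Type}, \qquad
  \nxt : A \to \later A, \qquad
  \mathsf{fix} : (\later A \to A) \to A.
\]
Taking guarded fixed points at the level of types is what lets one solve recursive type equations such as $\Dyn \cong \Nat + \later(\Dyn \to \li \Dyn)$, which we use later for the dynamic type, without any explicit step-indexing.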
@@ -112,12 +182,11 @@ and soundness of the equational theory of cast calculi using the techniques of S
We take a step toward mechanization in the Agda proof assistant by describing a compositional
denotational semantics for gradual typing in a setting where the steps taken by terms are tracked.
Our longer-term goal is to mechanize these proofs in a reusable way, thereby providing a framework
that makes it easier to prove that existing languages satisfy graduality and have
sound equational theories. Moreover, the aim is for designers of new languages to use the
framework to facilitate the design of new provably-correct gradually-typed languages with more
complex features.
\subsection{Contributions}
@@ -131,17 +200,16 @@ to prove graduality for the simply-typed gradual lambda calculus.
% independent interest.

\subsection{Outline of Remainder of Paper}

% In Section \ref{sec:overview}, we give an overview of the gradually-typed lambda
% calculus and the graduality theorem.
%
In Section \ref{sec:technical-background}, we provide technical background on synthetic guarded domain theory.
%
In Section \ref{sec:gtlc-terms}, we formally introduce the gradually-typed cast calculus
for which we will prove graduality. We give a semantics to the terms using
synthetic guarded domain theory.
% Important here are the notions of syntactic
% type precision and term precision. For reasons we describe below, we
% introduce two related calculi, one in which the stepping behavior of terms is
@@ -149,8 +217,9 @@ using the tools of guarded type theory.
% is made explicit (an \emph{intensional} calculus, $\intlc$).
%
In Section \ref{sec:gtlc-precision}, we define the term precision ordering for
the gradually-typed lambda calculus and describe our approach to assigning a
denotational semantics to this ordering.
This approach builds on the semantics constructed in the previous section,
but extending it to the term ordering presents new challenges.
@@ -165,15 +234,12 @@ but extending it to the term ordering presents new challenges.
% constructions defined in the previous section.
%
In Section \ref{sec:graduality}, we outline in more detail the proof of graduality for the
gradual lambda calculus.
%
In Section \ref{sec:discussion}, we discuss the benefits and drawbacks of our approach in comparison
to the traditional step-indexing approach, as well as possibilities for future work.

\subsection{Overview of Results}\label{sec:overview}

% This section used to be part of the intro.
% \subsection{Proving Graduality in SGDT}
@@ -183,12 +249,47 @@ to the traditional step-indexing approach, as well as possibilities for future w
In this paper, we will utilize SGDT techniques to prove graduality for a particularly
simple gradually-typed cast calculus, the gradually-typed lambda calculus.
Before stating the graduality theorem, we introduce some definitions related to gradual typing.

% Cast calculi
% TODO move this to earlier?
Gradually typed languages are generally elaborated to a \emph{cast calculus}, in which run-time type
checking is made explicit. Elaboration inserts \emph{type casts} at the boundaries between static and dynamic code.
In particular, cast insertion happens at the elimination forms of the language (e.g., pattern matching, field reference, etc.).
The original gradually typed language that is elaborated to a cast calculus is called the \emph{surface language}.

% Up and down casts
In a cast calculus, there is a relation $\ltdyn$ on types such that $A \ltdyn B$ means that
$A$ is a \emph{more precise} type than $B$.
There is a distinguished type $\dyn$, the \emph{dynamic type}, with the property that $A \ltdyn\, \dyn$ for all $A$.
The parts of the code that are dynamically typed are given type $\dyn$ in the cast calculus.
%
If $A \ltdyn B$, a term $M$ of type $A$ may be \emph{up}cast to $B$, written $\up A B M$,
and a term $N$ of type $B$ may be \emph{down}cast to $A$, written $\dn A B N$.
Upcasts always succeed, while downcasts may fail at runtime, resulting in a type error.
Cast calculi have a term $\mho$ representing a run-time type error at any type $A$.
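As a small, purely illustrative example of cast insertion (the surface program below is hypothetical and not part of the calculus defined later in the paper): if $f$ has the static type $\nat \ra \nat$ and $x$ has type $\dyn$, then the surface application $f\,x$ elaborates to an application with a downcast inserted on the argument,
\[
  f\,x \quad\text{elaborates to}\quad f\,(\dn{\nat}{\dyn} x).
\]
At run time, the downcast $\dn{\nat}{\dyn} x$ succeeds if $x$ was built from a number and fails with $\mho$ if, say, $x$ holds a function.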
% Syntactic term precision
We also have a notion of \emph{syntactic term precision}.
If $A \ltdyn B$, and $M$ and $N$ are terms of type $A$ and $B$ respectively, we write
$M \ltdyn N : A \ltdyn B$ to mean that
$M$ is ``syntactically more precise'' than $N$, or equivalently, $N$ is
``more dynamic'' than $M$. Intuitively, this models the situation where $M$ and $N$
behave the same except that $M$ may error more.
Term precision is defined by a set of axioms that capture this intuitive notion.
The precise rules for term precision depend on the language at hand, but
many of them can be derived from the typing rules in a straightforward way.
% Cast calculi also have a term $\mho$ representing a run-time type error for any type $A$,
% and since
Since $\mho$ always errors, there is a term precision rule stating that $\mho \ltdyn M$ for all $M$.
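For instance, annotating the argument of the identity function with $\nat$ instead of $\dyn$ yields a more precise term (an illustrative example rather than a rule of the calculus below):
\[
  \lambda (x : \nat).\, x \;\ltdyn\; \lambda (x : \dyn).\, x \;:\; (\nat \ra \nat) \ltdyn (\dyn \ra \dyn),
\]
which one would expect to follow from a congruence rule for $\lambda$ together with a variable rule giving $x \ltdyn x : \nat \ltdyn \dyn$.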
% GTLC
The gradually-typed lambda calculus is the usual simply-typed lambda calculus with a dynamic
type $\dyn$ such that $A \ltdyn\, \dyn$ for all types $A$, as well as upcasts and downcasts
between any types $A$ and $B$ such that $A \ltdyn B$. The complete definition will be provided in
Section \ref{sec:gtlc-precision}.
With these definitions, we can state the graduality theorem for the gradually-typed lambda calculus.
% \begin{theorem}[Graduality]
% If $M \ltdyn N : \nat$, then either:
@@ -200,34 +301,16 @@ The graduality theorem is shown below.
% \end{enumerate}
% \end{theorem}

\begin{theorem}[Graduality]
Let $\cdot \vdash M \ltdyn N : \nat$.
If $M \Downarrow v_?$ and $N \Downarrow v'_?$, then either $v_? = \mho$ or $v_? = v'_?$.
\end{theorem}
A note on the terminology and notation used in the theorem:
\begin{itemize}
\item $\cdot \Downarrow$ denotes termination: $M \Downarrow$ means
that $M$ terminates, either with a run-time error or with a value $n$ of type $\nat$,
and $M \Downarrow v_?$ means that $M$ terminates with result $v_?$.

\item $v_?$ is shorthand for the syntactic representation of a term that is either equal to
$\mho$, or equal to the syntactic representation of a value $n$ of type $\nat$.
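\item As a hypothetical illustration of the statement: the axiom $\mho \ltdyn M$ gives, for example,
$\cdot \vdash \mho \ltdyn \zro : \nat$; here $\mho \Downarrow \mho$ and $\zro \Downarrow \zro$, so the
conclusion of the theorem holds via the first disjunct ($v_? = \mho$). If instead $M$ and $N$ are the
same error-free program, both sides terminate with the same value and the second disjunct applies.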
@@ -8,7 +8,7 @@ relations, then show how to give them a semantics using SGDT.
% TODO mention intensional syntax
\subsection{Term Precision for GTLC}\label{sec:gtlc-term-precision-axioms}
% ---------------------------------------------------------------------------------------
% ---------------------------------------------------------------------------------------
@@ -375,10 +375,10 @@ More concretely, consider a simplified version of the DnL rule shown below:
\begin{mathpar}
\inferrule*{M \ltdyn_i N : B}
{\dnc{c}{M} \ltdyn_i N : c}
\end{mathpar}

If $c$ is $\injarr{}$, then when we downcast $M$ from $\dyn$ to $\dyntodyn$,
semantically this will involve a $\theta$, because the value of type $\dyn$
in the semantics will contain a \emph{later} function $\tilde{f}$.
Thus, in order for the right-hand side to be related to the downcast,
@@ -387,7 +387,7 @@ we need to insert a delay on the right.

The need for delays affects the cast rules involving upcasts as well, because
the upcast for functions involves a downcast on the domain:
\[ \up{A_i \ra A_o}{B_i \ra B_o}{M} \equiv \lambda (x : B_i). \up{A_o}{B_o}(M\, (\dn {A_i}{B_i} x)). \]
Thus, the correct versions of the cast rules involve delays on the side that was not cast.
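A sketch of the corrected DnL rule, writing $\delta\,N$ for $N$ preceded by one syntactic delay (a hypothetical notation for this sketch; the precise formulation of the delayed rules may differ), is:
\begin{mathpar}
\inferrule*{M \ltdyn_i N : B}
{\dnc{c}{M} \ltdyn_i \delta\, N : c}
\end{mathpar}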
@@ -418,11 +418,11 @@ for embeddings $\perte$ and for projections $\pertp$ by the following rules:
{\delta_c \ra \delta_d : \pertp (A \ra B)}

\inferrule
{\delta_\nat : \perte \nat \and \delta_f : \perte (\dyntodyn)}
{\pertdyn{\delta_\nat}{\delta_f} : \perte \dyn}

\inferrule
{\delta_\nat : \pertp \nat \and \delta_f : \pertp (\dyntodyn)}
{\pertdyn{\delta_\nat}{\delta_f} : \pertp \dyn}
\end{mathpar}
\section{Technical Background}\label{sec:technical-background}

% Modeling the dynamic type as a recursive sum type?
% Observational equivalence and approximation?

% synthetic guarded domain theory, denotational semantics therein

\subsection{Difficulties in Prior Semantics}
% Difficulties in prior semantics
@@ -76,9 +20,8 @@ make formalization of realistic gradual languages tractible.
%
Reasoning about step-indexed logical relations
can be tedious and error-prone, and there are subtle aspects that must
be taken into account in the proofs.
%
In particular, the prior approach of New and Ahmed \cite{new-ahmed2018} requires two separate logical
relations for terms: one in which the steps of the left-hand term are counted,
and another in which the steps of the right-hand term are counted.
@@ -128,7 +128,7 @@ as well a $\beta$ and $\eta$ law for bind.
The type precision rules specify what it means for a type $A$ to be more precise than $A'$.
We have reflexivity rules for $\dyn$ and $\nat$, as well as rules stating that $\nat$ is more precise than $\dyn$
and $\dyntodyn$ is more precise than $\dyn$.
We also have a congruence rule for function types stating that given $A_i \ltdyn A'_i$ and $A_o \ltdyn A'_o$, we can prove
$A_i \ra A_o \ltdyn A'_i \ra A'_o$. Note that precision is covariant in both the domain and codomain.
Finally, we can lift a relation on value types $A \ltdyn A'$ to a relation $\Ret A \ltdyn \Ret A'$ on
@@ -150,10 +150,10 @@ computation types.
\inferrule*[right = $\textsf{Inj}_{\ra}$]
{ }
{(\dyntodyn) \ltdyn\, \dyn}

\inferrule*[right = $\injarr{}$]
{(R \ra S) \ltdyn\, (\dyntodyn)}
{(R \ra S) \ltdyn\, \dyn}
@@ -182,20 +182,20 @@ This notion will be used below in the statement of transitivity of the term prec
\subsection{Removing Casts as Primitives}

% We now observe that all casts, except those between $\nat$ and $\dyn$
% and between $\dyntodyn$ and $\dyn$, are admissible, in the sense that
% we can start from $\extlcm$, remove casts except the aforementioned ones,
% and in the resulting language we will be able to derive the other casts.
We now observe that all casts, except those between $\nat$ and $\dyn$
and between $\dyntodyn$ and $\dyn$, are admissible.
That is, consider a new language ($\extlcprime$) in which,
instead of having arbitrary casts, we have injections from $\nat$ and
$\dyntodyn$ into $\dyn$, and a case inspection on $\dyn$.
We claim that in $\extlcprime$, all of the casts present in $\extlc$ are derivable.
It suffices to verify that the casts for function types are derivable.
This holds because function casts are constructed inductively from the casts
for their domain and codomain; the base case is one of the casts involving $\nat$
or $\dyntodyn$, which are present in $\extlcprime$ as injections and case inspections.
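As a sketch of how such a derivation goes (the definition below is illustrative and elides the precise value/computation bookkeeping of the calculus), the downcast from $\dyn$ to $\nat$ can be recovered by case inspection on $\dyn$:
\[ \dn{\nat}{\dyn} V \;:=\; \casedyn{V}{n}{\ret\, n}{f}{\mho}, \]
and dually, the downcast from $\dyn$ to $\dyntodyn$ returns the function in the second branch and errors in the first.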
The resulting calculus $\extlcprime$ now lacks arbitrary casts as a primitive notion:
@@ -228,14 +228,14 @@ for case-nat (the rules for case-arrow are analogous).
% inj-arr
\inferrule*
{\hasty \Gamma M (\dyntodyn)}
{\hasty \Gamma {\injarr M} \dyn}

% Case dyn
\inferrule*
{\hasty{\Delta|_V}{V}{\dyn} \and
\hasty{\Delta , x : \nat }{M_{nat}}{B} \and
\hasty{\Delta , x : (\dyntodyn) }{M_{fun}}{B}
}
{\hasty {\Delta} {\casedyn{V}{n}{M_{nat}}{f}{M_{fun}}} {B}}
\end{mathpar}
@@ -252,7 +252,7 @@ for case-nat (the rules for case-arrow are analogous).
{\casedyn {\injnat {V}} {n} {M_{nat}} {f} {M_{fun}} = M_{nat}[V/n]}

\inferrule*
{\hasty \Gamma V {\dyntodyn} }
{\casedyn {\injarr {V}} {n} {M_{nat}} {f} {M_{fun}} = M_{fun}[V/f]}

% Case-dyn Eta
@@ -405,7 +405,7 @@ as a monotone function from $\sem{\Gamma}$ to $\li \sem{A}$.
Recall that $\Dyn$ is isomorphic to $\Nat\, + \later (\Dyn \to \li \Dyn)$.
Thus, the semantics of $\injnat{\cdot}$ is simply $\inl$, and the semantics
of $\injarr{\cdot}$ is simply $\inr \circ \nxt$.
The semantics of case inspection on $\dyn$ performs a case analysis on the sum.
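Restated as equations (a sketch, assuming the interpretation of a value $V$ in context $\Gamma$ is a function of an environment $\gamma \in \sem{\Gamma}$):
\[
\sem{\injnat{V}}\,\gamma = \inl\,(\sem{V}\,\gamma),
\qquad
\sem{\injarr{V}}\,\gamma = \inr\,(\nxt\,(\sem{V}\,\gamma)).
\]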
The interpretation of $\lda{x}{M}$ works as follows. Recall by the typing rule for