Commit cd198c98 authored by Max New

mostly writing updates

parent f6f401cd
......@@ -77,18 +77,37 @@ gradual guarantee rules are as follows:
}
{\Gamma^\ltdyn \vdash (M :: A_2) \ltdyn M' : c'}
\end{mathpar}
Where the cast $M :: A_2$ is defined to be
\[ \dnc {dyn(A_2)}{\upc {dyn(A)} M} \]
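For instance, when $A_2 = A$ this cast computes to the identity by the retraction property:
\[ M :: A = \dnc {\textrm{dyn}(A)}\upc {\textrm{dyn}(A)} M = M \]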
These two rules are admissible from the following principles:
%% \begin{mathpar}
%% \inferrule
%% {}
%% {\dnc {\injarr{}} \upc {\injarr{}} M \equidyn M}
%% %% \inferrule
%% %% {}
%% %% {}
%% \end{mathpar}
For the first rule, we first prove that $\dnc {dyn(A_2)}\upc {dyn(A)} M = \dnc {c'}\upc{c} M$:
\begin{align*}
\dnc {dyn(A_2)}\upc {dyn(A)} M
&= \dnc {c' \,\textrm{dyn}(A')}\upc{c \,\textrm{dyn}(A')} M \tag{all precision derivations are equal}\\
&= \dnc {c'}\dnc {\textrm{dyn}(A')}\upc{\textrm{dyn}(A')}\upc{c} M\tag{functoriality}\\
&= \dnc {c'}\upc{c} M\tag{retraction}
\end{align*}
Then the rest follows by the up/dn rules above and the fact that precision derivations are all equal.
\begin{mathpar}
\inferrule*
{c' = \textrm{dyn}(A_2)\max{todo}}
{\dnc {c'}\upc {c} M \ltdyn M' : c'}
\end{mathpar}
Thus the following properties are sufficient to provide an extensional
model of gradual typing without requiring transitivity of term
precision:
\begin{enumerate}
\item Every precision is representable in the above sense,
\item the association of casts to precisions is functorial, and
\item type constructors are covariant functorial with respect to
relational identity and composition.
\end{enumerate}
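To make representability and functoriality concrete, here is a small Python sketch (all names are ours; a toy model, not the paper's construction) in which a precision is modeled as an embedding-projection pair: a total upcast together with a partial downcast whose failure, returned as \texttt{None}, plays the role of the runtime type error.

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

# A precision c : A ⊑ B as an embedding-projection pair.
# None from `dn` models the runtime type error. (Names are hypothetical.)
@dataclass
class Prec:
    up: Callable[[Any], Any]
    dn: Callable[[Any], Optional[Any]]

# Property (2), functoriality: identity and composition of precisions.
ID = Prec(up=lambda x: x, dn=lambda x: x)

def compose(c: Prec, d: Prec) -> Prec:
    def dn(z):
        y = d.dn(z)
        return None if y is None else c.dn(y)
    return Prec(up=lambda x: d.up(c.up(x)), dn=dn)

# Property (1), representability: a toy dynamic type with one injection
# and the precision nat ⊑ dyn it represents.
@dataclass
class InjNat:
    val: int

nat_dyn = Prec(
    up=lambda n: InjNat(n),
    dn=lambda d: d.val if isinstance(d, InjNat) else None,
)

# The retraction law used in the derivation above: dn (up x) = x.
assert nat_dyn.dn(nat_dyn.up(5)) == 5
assert compose(ID, nat_dyn).dn(InjNat(7)) == 7
```

The retraction assertion is exactly the equation that discharges the middle step of the cast calculation above; a failing downcast (e.g.\ on a non-\texttt{InjNat} value) returns \texttt{None}.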
\section{Call-by-push-value}
......
......@@ -312,11 +312,11 @@ function morphisms as follows:
In summary, an extensional model consists of:
\begin{enumerate}
\item A reflexive graph internal to CBPV models.
\item Composition of value and computation relations that form a category with the reflexive relations as identity.
\item An identity-on-objects functor $\upf : \mathcal V_e \to \mathcal V_f$ taking each value relation to a morphism that left-represents it.
\item An identity-on-objects functor $\dnf : \mathcal E_e^{op} \to \mathcal E_f$ taking each computation relation to a morphism that right-represents it.
\item The CBPV connectives $U,F,\times,\to$ are all \emph{covariant} functorial on relations\footnote{the reflexive graph structure already requires that these functors preserve identity relations}
\begin{itemize}
\item $U(dd') = U(d)U(d')$
\item $F(cc') = F(c)F(c')$
......@@ -327,6 +327,9 @@ In summary, an extensional model consists of:
$\injarr{}: U(D \to F D) \rel D$ satisfying $\dnc {\injarr{}}F(\upc{\injarr{}}) = \id$
\end{enumerate}
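As a sanity check on the covariant functoriality requirement, here is a small Python sketch (all names ours; relations modeled concretely as upcast/downcast pairs, not the paper's semantics) of the action of $\to$ and $\times$ on relations, with a raised exception standing in for the dynamic type error.

```python
# Hypothetical names throughout; CastError models the dynamic type error.
class CastError(Exception):
    pass

class Prec:
    def __init__(self, up, dn):
        self.up, self.dn = up, dn

def arrow(c: Prec, d: Prec) -> Prec:
    # c -> d : (A -> B) ⊑ (A' -> B'). Upcasting a function downcasts the
    # argument and upcasts the result; downcasting does the dual. Both
    # casts are built covariantly from the casts of c and d.
    return Prec(
        up=lambda f: lambda x2: d.up(f(c.dn(x2))),
        dn=lambda g: lambda x: d.dn(g(c.up(x))),
    )

def times(c: Prec, d: Prec) -> Prec:
    # c × d acts componentwise on pairs.
    return Prec(
        up=lambda p: (c.up(p[0]), d.up(p[1])),
        dn=lambda p: (c.dn(p[0]), d.dn(p[1])),
    )

def dn_nat(v):
    if not isinstance(v, int):
        raise CastError("not a nat")
    return v

# nat ⊑ dyn, with arbitrary Python values standing in for the dynamic type.
nat_dyn = Prec(up=lambda n: n, dn=dn_nat)

# Upcast the typed successor function to a dynamically typed one.
suc_dyn = arrow(nat_dyn, nat_dyn).up(lambda n: n + 1)
assert suc_dyn(3) == 4
```

Applying \texttt{suc\_dyn} to a non-number raises \texttt{CastError}, illustrating how the function-type action wraps argument and result casts around the underlying function.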
\subsection{Stuff about Intensional Models}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
......
......@@ -36,8 +36,8 @@
\newcommand{\up}[2]{\langle{#2}\uarrowl{#1}\rangle}
\newcommand{\dn}[2]{\langle{#1}\darrowl{#2}\rangle}
\newcommand{\upc}[1]{\text{up}\,{#1}\,}
\newcommand{\dnc}[1]{\text{dn}\,{#1}\,}
% \newcommand{\ret}{\mathsf{ret}}
......@@ -45,6 +45,11 @@
\newcommand{\zro}{\textsf{zro}}
\newcommand{\suc}{\textsf{suc}}
\newcommand{\lda}[2]{\lambda {#1} . {#2}}
\newcommand{\Injarr}{\textsf{Inj}_\ra}
\newcommand{\Injnat}{\textsf{Inj}_\nat}
\newcommand{\Injtimes}{\textsf{Inj}_\times}
\newcommand{\injarr}[1]{\textsf{Inj}_\ra ({#1})}
\newcommand{\injnat}[1]{\textsf{Inj}_\text{nat} ({#1})}
\newcommand{\casenat}[4]{\text{Case}_\text{nat} ({#1}) \{ \text{no} \to {#2} \alt \text{nat}({#3}) \to {#4} \}}
......@@ -249,4 +254,3 @@
\newcommand{\tnat}{\text{nat}}
\newcommand{\ttimes}{\text{times}}
\newcommand{\tfun}{\text{fun}}
......@@ -3,6 +3,8 @@
\subsection{Related Work}
% Discuss Joey Eremondi's thesis on gradual dependent types
% Discuss Jeremy Siek's work on graduality in Agda
\subsection{Mechanization}
% Discuss Guarded Cubical Agda and mechanization efforts
......@@ -10,17 +12,25 @@
\subsection{Synthetic Ordering}
A key to managing the complexity of our concrete construction is in
using a \emph{synthetic} approach to step-indexing rather than working
analytically with presheaves. This has helped immensely in our ongoing
mechanization in cubical Agda as it sidesteps the need to formalize
these constructions internally.
%
However, there are other aspects of the model, the bisimilarity and
the monotonicity, which are treated analytically.
%
It may be possible to utilize further synthetic techniques to reduce
this burden as well, and have all types intrinsically carry a notion
of bisimilarity and an ordering relation, with all constructions
automatically preserving them.
%
A synthetic approach to ordering is common in (non-guarded) synthetic
domain theory and has also been used for synthetic reasoning about
cost models \cite{synthetic-domain-theory,decalf}.
\subsection{Future Work}
% Cite GrEff paper
......@@ -78,30 +78,37 @@ Our goal in this work is to provide a reusable semantic framework for
gradually typed languages that can be used to prove the aforementioned
properties: graduality and type-based reasoning. \max{TODO: say more here}
\subsection{Current Approaches to Proving Graduality}
We give an overview of current approaches to proving graduality of
languages and why they do not meet our criteria of a reusable semantic
framework.
\subsubsection{From Static to Gradual}
Current approaches to constructing languages that satisfy the
graduality property include the methods of Abstracting Gradual Typing
\cite{garcia-clark-tanter2016} and the formal tools of the Gradualizer
\cite{cimini-siek2016}. These allow the language developer to start
with a statically typed language and derive a gradually typed language
that satisfies the gradual guarantee. The main downside to these
approaches lies in their inflexibility: since the process is entirely
mechanical, the language designer must adhere to the predefined
framework. Many gradually typed languages do not fit into either
framework, e.g., Typed Racket \cite{tobin-hochstadt06,
tobin-hochstadt08}, and the semantics produced is not always the
desired one.
%
Furthermore, while these frameworks do prove graduality of the
resulting languages, they do not show the correctness of the
equational theory, which is equally important to sound gradual typing.
%% For example, programmers often refactor their code, and in so doing they rely
%% implicitly on the validity of the laws in the equational theory.
%% Similarly, correctness of compiler optimizations rests on the validity of the
%% corresponding equations from the equational theory. It is therefore important
%% that the languages that claim to be gradually typed have provably correct
%% equational theories.
% The approaches are too inflexible... the fact that the resulting semantics are too lazy
% is a consequence of that inflexibility.
......@@ -116,25 +123,51 @@ equational theories.
% [Eric] I moved the next two paragraphs from the technical background section
% to here in the intro.
\subsubsection{Double Categorical Semantics}
New and Licata \cite{new-licata18} developed an axiomatic account of
the graduality relation on call-by-name cast calculus terms and
showed that the graduality proof could be modeled using semantics in
certain kinds of \emph{double categories}, categories internal to the
category of categories. A double category extends a category with a
second notion of morphism, often a notion of ``relation'' to be paired
with the notion of functional morphism, as well as a notion of
functional morphisms preserving relations. In gradual typing the
notion of relation models type precision and the squares model the
term precision relation. This approach was influenced by Ma and
Reynolds' semantics of parametricity using reflexive graph categories:
reflexive graph categories are essentially double categories without a
notion of relational composition. In addition to capturing the notions
of type and term precision, the double categorical approach allows for
a \emph{universal property} for casts: upcasts are the
\emph{universal} way to turn a relation arrow into a function in a
forward direction and downcasts are the dual universal arrow. Later,
New, Licata and Ahmed \cite{new-licata-ahmed19} extended this
axiomatic treatment from call-by-name to call-by-value as well by
giving an axiomatic theory of type and term precision in
call-by-push-value. This left implicit any connection to a ``double
call-by-push-value'', which we make explicit in
Section~\ref{sec:cbpv}.
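The squares mentioned above can be drawn schematically (our own rendering, in the paper's notation): a square fills a frame of two functional morphisms and two relations,
\[
\begin{array}{ccc}
A & \overset{c}{\,\rel\,} & A' \\
{\scriptstyle f}\,\big\downarrow & \ltdyn & \big\downarrow\,{\scriptstyle g} \\
B & \underset{d}{\,\rel\,} & B'
\end{array}
\]
and asserts that $f$ and $g$ preserve relatedness: inputs related by $c$ are sent to outputs related by $d$.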
With this notion of abstract categorical model in hand, denotational
semantics is then the work of constructing concrete models that
exhibit the categorical construction. New and Licata
\cite{new-licata18} present such a model using categories of
$\omega$-CPOs, and this model was extended by Lennon-Bertrand,
Maillard, Tabareau and Tanter to prove graduality of a gradual
dependently typed calculus $\textrm{CastCIC}^{\mathcal{G}}$. This
domain-theoretic approach meets our criteria of being a semantic
framework for proving graduality, but suffers from the limitations of
classical domain theory: the inability to model viciously
self-referential structures such as higher-order extensible state and
similar features such as runtime-extensible dynamic types. Since these
features are quite common in dynamically typed languages, we seek a
new denotational framework that can model these type system features.
The standard alternative to domain theory that scales to essentially
arbitrary self-referential definitions is \emph{step-indexing} or its
synthetic form of \emph{guarded recursion}.
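As a minimal illustration of what guarded recursion buys (our own toy, with hypothetical names, not the paper's development): in a Capretta-style delay datatype, a definition that is not structurally well-founded is nevertheless productive because each recursive call sits under a constructor, the role played by the later modality in guarded type theory and by the numeric index in step-indexing.

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

# A computation is either a value now, or the rest of the computation
# one "step" later. (Hypothetical names.)
@dataclass
class Now:
    value: Any

@dataclass
class Later:
    rest: Callable[[], Any]  # () -> Now | Later

def run(comp, fuel: int) -> Optional[Any]:
    # Observe a computation for at most `fuel` steps; None models
    # "no answer yet", the cutoff that a step index makes explicit.
    for _ in range(fuel):
        if isinstance(comp, Now):
            return comp.value
        comp = comp.rest()
    return comp.value if isinstance(comp, Now) else None

# Not structurally decreasing in any datatype, but fine here because
# the recursive call is guarded under the Later constructor:
def count_down(n):
    return Now(0) if n == 0 else Later(lambda: count_down(n - 1))

assert run(count_down(3), 10) == 0
assert run(count_down(3), 2) is None
```

The same guardedness discipline is what lets viciously self-referential definitions, such as a dynamic type containing functions on itself, be solved as guarded fixpoints.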
\max{todo: adapt the next paragraph to fit in with this one}
A series of works \cite{new-ahmed2018, new-licata-ahmed2019, new-jamner-ahmed19}
developed step-indexed logical relations models of gradually typed
languages based on operational semantics. Unlike classical domain
......@@ -149,6 +182,26 @@ these reusable mathematical principles from these tedious models to
make formalization of realistic gradual languages tractable.
%% \subsubsection{Classical Domain Models}
%% and gave a denotational
%% model of this calculus using classical domain theory based on
%% $\omega$-CPOs. This semantics has scaled up to an analysis of a
%% dependently typed gradual calculus in \cite{asdf}. This meets our
%% criterion of being a reusable mathematical theory, as general semantic
%% theorems about gradual domains can be developed independent of any
%% particular syntax and then reused in many different denotational
%% models. However, it is widely believed that such classical domain
%% theoretic techniques cannot be extended to model higher-order store, a
%% standard feature of realistic gradually typed languages such as Typed
%% Racket. Thus if we want a reusable mathematical theory of gradual
%% typing that can scale to realistic programming languages, we need to
%% look elsewhere to so-called ``step-indexed'' techniques.
%% \subsubsection{Operational Reduction Proofs}
% Alternative phrasing:
\begin{comment}
\subsubsection{Embedding-Projection Pairs}
......@@ -204,7 +257,16 @@ complex features.
\subsection{Contributions}
Our main contribution is a compositional denotational semantics for step-aware gradual typing,
analogous to Gradual Type Theory but now in the ``intensional'' setting.
In parallel, we are developing a reusable framework in
Guarded Cubical Agda for developing machine-checked proofs of graduality of a cast calculus.
......
......@@ -112,7 +112,7 @@
%% domain used to model the dynamic type, and type precision is modeled
%% as simply a subset relation.
%% %
\end{abstract}
\maketitle
......@@ -142,15 +142,17 @@
\input{technical-background}
\input{syntax}
\input{extensional-model}
% \input{semantics}
\input{terms}
%% \input{ordering}
%% \input{graduality}
\input{categorical-model}
......