Commit 3ca81af2, authored 6 years ago by Max New (parent f40838b6)

some thoughts on polymorphism

Showing 1 changed file: notes/polymorphic.org (new file, +92 additions, 0 deletions)
* Polymorphic GTT
** Summary
It seems like it's possibly *impossible* to combine the following
properties:
1. Graduality
2. Polymorphic casts to/from the dynamic type
3. A safe type case for the dynamic type (i.e., intro/elim, beta, eta
   for the dynamic type)
4. Curry-style forall/exists
5. Parametricity
For each of these, the corresponding item below seems to be a valid
workaround, but each violates a principle of gradual/dynamic typing:
1. Weaken graduality so we only have retraction, not projection
2. Don't allow casting of polymorphic type variables, i.e., unknown
   types are not "gradual"
3. Disallow type case as a construct
4. Use Church-style forall/exists, i.e., have the use of polymorphism
   be *manifest* in the dynamic language, so that instantiation is not
   less dynamic than substitution
5. Parametricity theorems only apply to totally typed programs
** Details
First, we work in a GTT-like setting.
Next, we consider the precision rules for incorporating polymorphism.
To do so, we need to generalize type dynamism to open type dynamism.
The following principles seem sensible for now:
1. Previous rules for GTT dynamism are *stable/fibred*, i.e., they
   apply in any context. In particular, the dynamic type is the most
   dynamic type *in any context*, meaning for instance that
   X |- X <= ?
2. Monotonicity: if X |- A <= B then for any A' <= B', A[A'/X] <= B[B'/X]
3. Functoriality: if X |- A <= B then forall X. A <= forall X. B
4. Curry-style: if X |- A, then for any B, forall X. A <= A[B/X]
Contrast this with the Church-style rule, which would instead be
4'. Church-style: (forall X. ?) <= ?
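For instance, here is a worked instance of these rules for the type of
the example used below (assuming the GTT congruence rule for -> carries
over to open types via stability, which is how I read principle 1):
  forall X. X -> Num <= Num -> Num           (Curry-style, B = Num)
  X |- X -> Num <= ? -> ?                    (stability: X <= ?, Num <= ?, congruence for ->)
  forall X. X -> Num <= forall X. (? -> ?)   (functoriality applied to the previous line)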
The argument that something goes wrong comes down to what happens when
we run a type case after we have abstracted over some type variables.
In particular, what does
/\X. \x -> tycase (in_X x) { number? -> 1 | else. 0 } : forall X. X -> Num
do?
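As a point of reference, here is a small Haskell sketch of a
*type-passing* reading, where the dynamic type is a tagged sum, tycase
inspects the tag, and in_X simply injects the value at whatever type X
was instantiated to. This is my own illustrative model, not the note's
formal semantics; the names Dyn, DNum, DStr, tycase, example are mine.
#+begin_src haskell
-- A toy dynamic type: a tagged sum that tycase can inspect.
data Dyn = DNum Int | DStr String
  deriving Show

-- tycase d { number? -> k | else -> e }
tycase :: Dyn -> (Int -> a) -> a -> a
tycase d k e = case d of
  DNum n -> k n
  _      -> e

-- /\X. \x -> tycase (in_X x) { number? -> 1 | else. 0 }
-- Under type-passing, "instantiating X" just picks the injection in_X.
example :: (a -> Dyn) -> a -> Int
example inX x = tycase (inX x) (\_ -> 1) 0

main :: IO ()
main = do
  print (example DNum 5)       -- X = Num:    prints 1
  print (example DStr "five")  -- X = String: prints 0
#+end_src
The two instantiations already give different answers here, which is
exactly the tension with parametricity spelled out next.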
If instantiation is less dynamic than substitution, then choosing X = Num should give us:
(/\X. \x -> tycase (in_X x) { number? -> 1 | else. 0 }) [Num]
<=
\x -> tycase (in_Num x) { number? -> 1 | else. 0 }
Calling both sides with a number tells us that the LHS must be less
dynamic than `ret 1`, so it is either equal to it or raises an error.
On the other hand, instantiating X = String (really anything that
doesn't satisfy number?) tells us that the LHS must be less dynamic
than `ret 0`, but the parametricity theorem says that the two
instantiations need to produce the same answer, so they must both be error.
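That conclusion is what a *runtime-sealing* reading delivers. A minimal
sketch in the same illustrative Haskell model as above (the Sealed tag
and its error behaviour are assumptions of the sketch, not definitions
from the note):
#+begin_src haskell
-- Sealing sketch: in_X ignores the instantiation and wraps its argument
-- in an opaque seal; tycase refuses to inspect sealed values.
data Dyn = DNum Int | DStr String | Sealed

tycase :: Dyn -> (Int -> a) -> a -> a
tycase d k e = case d of
  DNum n -> k n
  Sealed -> error "tycase: sealed value"  -- dynamic type error
  _      -> e

-- /\X. \x -> tycase (in_X x) { number? -> 1 | else. 0 }, under sealing.
exampleSealed :: a -> Int
exampleSealed x = tycase (sealX x) (\_ -> 1) 0
  where sealX _ = Sealed

-- Both instantiations now error, consistent with the argument above:
--   exampleSealed (5 :: Int)  ~~>  error
--   exampleSealed "five"      ~~>  error
#+end_src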
Now consider the following slightly modified version:
(/\ X. \x -> tycase (in_X x) { number? -> 0 | else. 0 })
Should this raise an error?
The runtime-sealing interpretation certainly would raise an error, but
an eta principle for the dynamic type would say that this should be equivalent to
(/\ X. \x -> 0)
i.e., that type-case is a *safe* operation, which it certainly is in
a typical dynamically typed language.
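A tiny self-contained sketch of the discrepancy (again assuming the
sealing behaviour above; sealedTycase, modifiedSealed, etaCollapsed are
my illustrative names): under sealing the type case errors before either
branch runs, even though both branches return 0, whereas the
eta-collapsed form is just a constant function.
#+begin_src haskell
-- Under sealing, the type case is stuck on the seal regardless of branches.
sealedTycase :: (Int -> a) -> a -> a
sealedTycase _ _ = error "tycase: sealed value"

-- /\ X. \x -> tycase (in_X x) { number? -> 0 | else. 0 }, under sealing:
modifiedSealed :: a -> Int
modifiedSealed _ = sealedTycase (\_ -> 0) 0   -- ~~> error

-- The eta principle for the dynamic type would instead identify it with:
etaCollapsed :: a -> Int
etaCollapsed _ = 0                            -- always 0, never errors
#+end_src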
But how could we accommodate these semantics?
It would seem to require that we run *both* sides of the computation
and dynamically monitor them for equivalence, and that we would have
to do this for every polymorphic type-case, possibly causing a huge
runtime overhead.
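One way to picture what that monitoring could look like, as a hedged
sketch only (the note does not commit to this strategy, and the names
here are mine): a monitored type-case that evaluates both the selected
branch and the else branch and errors only when they disagree.
#+begin_src haskell
-- Sketch of a "monitoring" type-case: run both the branch selected by the
-- actual tag and the else branch, and check that they agree.  Comparing
-- results needs decidable equality, which already hints at the cost and
-- the limitations of this strategy for higher-order results.
data Dyn = DNum Int | DStr String

tycaseMonitored :: Eq a => Dyn -> (Int -> a) -> a -> a
tycaseMonitored d k e = case d of
  DNum n
    | k n == e  -> e        -- branches agree: behave parametrically
    | otherwise -> error "parametricity violation detected at runtime"
  _             -> e

-- { number? -> 0 | else. 0 } is accepted on every input ...
okTerm :: Dyn -> Int
okTerm d = tycaseMonitored d (\_ -> 0) 0

-- ... while { number? -> 1 | else. 0 } errors exactly on numbers.
badTerm :: Dyn -> Int
badTerm d = tycaseMonitored d (\_ -> 1) 0
#+end_src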
This is possibly of interest as some kind of testing tool, but it is
certainly *not* what sealing *does*, so we are left to ponder *why*
sealing works the way it does, since it seems *unsound*: it rejects
programs at runtime that are *semantically* polymorphic but do not
satisfy the weirdly rigid notion of *syntactic* polymorphism, namely
never interacting with a value in any way.