polymorphic.org 3.47 KiB
    Max New committed
    * Polymorphic GTT
    
    ** Summary
    
    It seems that it may be *impossible* to combine the following
    properties:
    
    1. Graduality
    2. Polymorphic casts to/from the dynamic type
    3. A safe type case for the dynamic type (i.e., intro/elim, beta, eta
       for the dynamic type)
    4. Curry-style forall/exists
    5. Parametricity
    
    For each of these, the corresponding workaround below seems viable,
    but each violates a principle of gradual/dynamic typing:
    1. Weaken graduality so we only have retraction, not projection
    2. Don't allow casting of polymorphic type variables, i.e., unknown
       types are not "gradual"
    3. Disallow type case as a construct
    4. Use Church-style forall/exists, i.e., have the use of polymorphism
       be *manifest* in the dynamic language, so that instantiation is
       not less dynamic than substitution.
    5. Parametricity theorems only apply to totally typed programs
    
    ** Details
    First, we work in a GTT-like setting.
    Next, we consider the precision rules for incorporating polymorphism.
    
    First, we need to generalize type dynamism to open type dynamism.
    The following principles seem sensible for now:
    1. Previous rules for GTT dynamism are *stable/fibred*: i.e., they
       apply in any context. In particular, the dynamic type is the most
       dynamic type *in any context*, meaning for instance that
       X |- X <= ?
    2. Monotonicity: if X |- A <= B then for any A' <= B', A[A'/X] <=
       B[B'/X]
    3. Functoriality: if X |- A <= B then forall X. A <= forall X. B
    4. Curry-style: if X |- A, then for any B, forall X. A <= A[B/X]
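    As a sanity check, these rules can be sketched as a small precision
    checker over a toy type grammar. Everything here (the grammar, the
    function names) is a hypothetical illustration, not any real
    implementation of GTT:

```python
# Sketch of open type dynamism ("precision") for a toy grammar, assuming
# the rules above: ? is the most dynamic type in any context, precision
# is covariant in both arrow positions, and forall is functorial.
from dataclasses import dataclass

@dataclass(frozen=True)
class Dyn: pass                      # the dynamic type ?
@dataclass(frozen=True)
class Num: pass
@dataclass(frozen=True)
class Var:
    name: str                        # a type variable X
@dataclass(frozen=True)
class Arrow:
    dom: object; cod: object
@dataclass(frozen=True)
class Forall:
    var: str; body: object

def precedes(a, b):
    """A <= B: A is less dynamic (more precise) than B."""
    if isinstance(b, Dyn):                       # rule 1: X |- A <= ?
        return True
    if isinstance(a, Var) and isinstance(b, Var):
        return a.name == b.name
    if isinstance(a, Num) and isinstance(b, Num):
        return True
    if isinstance(a, Arrow) and isinstance(b, Arrow):
        return precedes(a.dom, b.dom) and precedes(a.cod, b.cod)
    if isinstance(a, Forall) and isinstance(b, Forall):  # rule 3
        return a.var == b.var and precedes(a.body, b.body)
    return False

def subst(t, x, s):
    """t[s/x]; capture avoidance is ignored in this tiny example."""
    if isinstance(t, Var) and t.name == x:
        return s
    if isinstance(t, Arrow):
        return Arrow(subst(t.dom, x, s), subst(t.cod, x, s))
    if isinstance(t, Forall) and t.var != x:
        return Forall(t.var, subst(t.body, x, s))
    return t

# Rule 2 (monotonicity) on an example: from X <= ? we should get
# (Num -> X)[Num/X] <= (? -> ?)[?/X].
a, b = Arrow(Num(), Var("X")), Arrow(Dyn(), Dyn())
assert precedes(subst(a, "X", Num()), subst(b, "X", Dyn()))
```

    The Curry-style rule is the one this sketch deliberately omits:
    checking forall X. A <= A[B/X] requires guessing the instantiation B,
    which is exactly where the trouble below begins.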
    
    Contrast with the Church-style rule, which would be:
    
        Church-style: (forall X. ?) <= ?
    
    The argument that something goes wrong comes down to the question of
    what happens when we run a type case after abstracting over some
    type variables.
    
    In particular, what does
    
        /\X. \x -> tycase (in_X x) { number? -> 1 | else. 0 } : forall X. X -> Num
    
    do?
    If instantiation is less dynamic than substitution, then we know that choosing X = Num should give us:
    
       (/\X. \x -> tycase (in_X x) { number? -> 1 | else. 0 }) [Num]
       <=
       \x -> tycase (in_Num x) { number? -> 1 | else. 0 }
    
    Calling both sides with a number tells us that the LHS must be less
    dynamic than `ret 1`, so either it is equal to it or it raises an
    error. On the other hand, instantiating X = String (really anything
    that doesn't satisfy number?) tells us that the LHS must be less
    dynamic than `ret 0`. But the parametricity theorem says that the two
    instantiations need to produce the same answer, so they must both
    error.
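    The tension can be seen by simulating the instantiate-as-substitute
    semantics directly. This is a toy model: `inject` and
    `tycase_number` are hypothetical Python stand-ins for in_X and
    tycase, using tags rather than any real cast machinery:

```python
# Toy model: under Curry-style instantiation, [Num] just substitutes,
# so tycase sees the underlying tag of the injected value.
def inject(v):
    # in_A v : inject a typed value into the dynamic type, tagged
    if isinstance(v, (int, float)):
        return ("number", v)
    return ("other", v)

def tycase_number(d, if_num, if_else):
    tag, _ = d
    return if_num if tag == "number" else if_else

# /\X. \x -> tycase (in_X x) { number? -> 1 | else. 0 },
# instantiated by substitution at two different types:
f_at_num = lambda x: tycase_number(inject(x), 1, 0)
f_at_str = lambda x: tycase_number(inject(x), 1, 0)

print(f_at_num(5))    # 1: the number? branch fires
print(f_at_str("a"))  # 0: the else branch fires
# Parametricity demands the same answer at every instantiation, so
# under graduality the polymorphic term itself can only be the error.
```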
    
    On the other hand, consider the following slightly modified version:
    
       (/\ X. \x -> tycase (in_X x) { number? -> 0 | else. 0 })
    
    Should this raise an error?
    
    The runtime-sealing interpretation certainly would raise an error, but
    an eta principle for the dynamic type would say that this should be equivalent to
    
       (/\ X. \x -> 0)
    
    i.e., that type-case is a *safe* operation, which it certainly is in
    a typical dynamically typed language.
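    The contrast between the sealing behavior and the eta principle can
    be made concrete in the same toy model (again, `seal` and
    `tycase_number` are hypothetical stand-ins, not a real sealing
    implementation):

```python
# Toy model of runtime sealing: instantiating at an abstract X wraps
# values in an opaque seal, and tycase on a sealed value is an error.
class SealError(Exception):
    pass

def seal(v):
    return ("sealed", v)

def tycase_number(d, if_num, if_else):
    tag, _ = d
    if tag == "sealed":
        # Sealing semantics: inspecting an abstract value errors.
        raise SealError("type-cased a sealed value")
    return if_num if tag == "number" else if_else

# (/\X. \x -> tycase (in_X x) { number? -> 0 | else. 0 })
g = lambda x: tycase_number(seal(x), 0, 0)

# Eta for the dynamic type says g should behave like \x -> 0, since
# both branches agree -- but the sealing semantics errors instead:
try:
    g(5)
except SealError as e:
    print("error:", e)
```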
    
    But how could we accommodate these semantics?
    It would seem to require that we run *both* sides of the computation
    and dynamically monitor them for equivalence, and that we would have
    to do this for every polymorphic type-case, possibly causing huge
    runtime overhead.
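    One purely hypothetical sketch of such a monitoring semantics (this
    is *not* what any sealing implementation does): for a type-case on a
    value of abstract type, run every branch and insist that they agree,
    erroring otherwise.

```python
# Hypothetical monitoring semantics for a polymorphic tycase: since we
# may not inspect a value of abstract type, evaluate all branches and
# require that they coincide.
def monitored_tycase(d, if_num, if_else):
    """Run both branches of a polymorphic tycase; they must agree.

    This respects eta for the dynamic type (a tycase whose branches
    coincide behaves like their common result), at the cost of
    evaluating every branch -- the runtime overhead discussed above.
    """
    num_result = if_num()
    else_result = if_else()
    if num_result != else_result:
        raise RuntimeError("branches disagree: term is not parametric")
    return num_result

# The benign example from above now succeeds instead of erroring:
assert monitored_tycase(("number", 5), lambda: 0, lambda: 0) == 0
```

    The original example, whose branches return 1 and 0, would still
    error under this semantics, as parametricity requires.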
    
    This is possibly of interest as some kind of testing tool, but it's
    certainly *not* what sealing *does*, so we are left to ponder *why*
    sealing works the way it does, since it seems *unsound*: it rejects
    programs at runtime that are *semantically* polymorphic but do not
    satisfy the weirdly rigid notion of *syntactic* polymorphism: never
    interacting with a value in any way.