Notes on universe polymorphism and primitive projections, M. Sozeau - WIP
=========================================================================

The new implementation of universe polymorphism and primitive
projections introduces a few changes to the API of Coq. First and
foremost, the term language changes, as global references now carry a
universe level substitution:

type 'a puniverses = 'a * Univ.Instance.t
type pconstant = constant puniverses
type pinductive = inductive puniverses
type pconstructor = constructor puniverses

type constr = ...
 | Const of pconstant
 | Ind of pinductive
 | Construct of pconstructor
 | Proj of constant * constr


Universes
=========

  Universe instances (arrays of levels) get substituted when unfolding
definitions, are used for typechecking, and are unified according to
the rules of the ITP'14 paper on universe polymorphism in Coq.

type Level.t = Set | Prop | Level of int * dirpath (* hashconsed *)
type Instance.t = Level.t array
type Universe.t = Level.t list (* hashconsed *)

The Univ module defines modules and abstract types for levels,
universes, etc. Structures are hashconsed (with a hack to take care of
the fact that deserialization breaks sharing).

  Definitions (constants, inductives) now carry around not only their
constraints but also the universes they introduce (a Univ.UContext.t).
There is another kind of context, [Univ.ContextSet.t]: the latter
carries a set of universes, while the former has the levels serialized
in an array and is used for polymorphic objects. Both have "reified"
constraints depending on global and local universes.

  A polymorphic definition is abstract w.r.t.
the variables in this
context, while a monomorphic (or template polymorphic) one just adds
its universes and constraints to the global universe context when it is
put in the environment. No universes other than the global ones and the
declared local ones are needed to check a declaration, hence the kernel
does not produce any constraints anymore, apart from module subtyping.
There are hence two conversion functions now: check_conv and
infer_conv. The former just checks the definition in the current env
(in which we usually push_universe_context the associated context),
while infer_conv produces the constraints that were not implied by the
ambient ones. Ideally, the latter could be moved out of the kernel,
but again, module subtyping needs it.

  Inference of universes is now done during refinement, and the
evar_map carries the incrementally built universe context.
[Evd.conversion] is a wrapper around [infer_conv] that does the
bookkeeping for you; it uses [evar_conv_x]. A universe substitution is
built incrementally from the constraints, so one should normalize with
that substitution at the end of a proof (or during it), just as we
normalize evars. There are some nf_* functions in
library/universes.ml to do that. Additionally, there is a minimization
algorithm in there that can be applied at the end of a proof to
simplify the universe constraints used in the term. It is heuristic
but validity-preserving. No user-introduced universe (i.e., one coming
from a user-written anonymous Type) gets touched by this, only the
fresh universes generated for each global application. Using

val pf_constr_of_global : Globnames.global_reference -> (constr -> tactic) -> tactic

is the way to make a constr out of a global reference in the new API.
If the constr is polymorphic, it will add the necessary constraints to
the evar_map.
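As a usage sketch, one threads the reference through a continuation;
here [id_ref] is a hypothetical global_reference to a polymorphic
identity function, and [k] stands for whatever tactic consumes the
resulting term (an assumption-level illustration, not a fixed recipe):

```ocaml
(* Sketch: pf_constr_of_global gives [id_ref] a fresh universe
   instance, records the resulting constraints in the goal's
   evar_map, and passes the instantiated constr to the
   continuation. *)
let apply_id_to (a : constr) (k : constr -> tactic) : tactic =
  pf_constr_of_global id_ref
    (fun id_fun -> k (mkApp (id_fun, [| a |])))
```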
Even if a constr is not polymorphic, we have to take care
of keeping track of its universes. Typically, using:

  mkApp (coq_id_function, [| A; a |])

and putting it in a proof term is not enough now. One has to somehow
show that A's type is in the cumulativity relation with id's type
argument, incurring a universe constraint. To do this, one can simply
call Typing.resolve_evars env evdref c, which will do some infer_conv
to produce the right constraints and put them in the evar_map. Of
course, in some cases you might know from an invariant that no new
constraint would be produced and get rid of the call. Anyway, the
kernel will tell you if you forgot some. As a temporary way out,
[Universes.constr_of_global] allows you to make a constr from any
non-polymorphic constant, but it might forget constraints.

Other than that, unification (w_unify and evarconv) now takes
universes into account and produces only well-typed evar_maps.

Some syntactic comparisons like the one used in [change] have to be
adapted to allow identification up to universes (when dealing with
polymorphic references); [make_eq_univs_test] is there to help.
In constr, there are actually many new comparison functions to deal
with that:

(** [equal a b] is true if [a] equals [b] modulo alpha, casts,
    and application grouping *)
val equal : constr -> constr -> bool

(** [eq_constr_univs u a b] is [true] if [a] equals [b] modulo alpha, casts,
    application grouping and the universe equalities in [u]. *)
val eq_constr_univs : constr Univ.check_function

(** [leq_constr_univs u a b] is [true] if [a] is convertible to [b] modulo
    alpha, casts, application grouping and the universe inequalities in [u]. *)
val leq_constr_univs : constr Univ.check_function

(** [eq_constr_universes a b] is [true, c] if [a] equals [b] modulo alpha,
    casts, application grouping and the universe equalities in [c].
*)
val eq_constr_universes : constr -> constr -> bool Univ.universe_constrained

(** [leq_constr_universes a b] is [true, c] if [a] is convertible to [b]
    modulo alpha, casts, application grouping and the universe
    inequalities in [c]. *)
val leq_constr_universes : constr -> constr -> bool Univ.universe_constrained

(** [eq_constr_nounivs a b] is [true] if [a] equals [b] modulo alpha,
    casts and application grouping, ignoring universe instances. *)
val eq_constr_nounivs : constr -> constr -> bool

The [_univs] versions check universe constraints against a graph, while
the [_universes] versions produce (non-atomic) universe constraints.
The non-atomic constraints include the [ULub] constructor: when
comparing [f (* u1 u2 *) c] and [f (* u1' u2' *) c] we add ULub
constraints on [u1, u1'] and [u2, u2']. These are treated specially:
as unfolding [f] might not result in these unifications, we need to
keep track of the fact that failure to satisfy them does not mean the
terms are unequal. This is used in unification but is probably not
relevant to the average programmer.

Another issue for ML programmers is that tables of constrs now usually
need to store a [constr Univ.in_universe_context_set] instead, and to
properly refresh the universe context when using the constr, e.g. using
Clenv.refresh_undefined_univs clenv or:

(** Get fresh variables for the universe context.
    Useful to make tactics that manipulate constrs in universe contexts
    polymorphic. *)
val fresh_universe_context_set_instance : universe_context_set ->
  universe_level_subst * universe_context_set

The substitution should be applied to the constr(s) under
consideration, and the context_set merged into the current evar_map
with:

val merge_context_set : rigid -> evar_map -> Univ.universe_context_set -> evar_map

The [rigid] flag here should be [Evd.univ_flexible] most of the time.
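Putting the two values above together, fetching a stored constr could
look like the following sketch. The table lookup is hypothetical, and
I assume [fresh_universe_context_set_instance] lives in [Universes]
and that [Vars.subst_univs_level_constr] applies a level substitution
to a constr:

```ocaml
(* Sketch: take a constr stored together with its universe context
   set, refresh its local universes, and merge the refreshed context
   into the current evar_map so the constr can be used in this
   proof. [lookup_my_table] is a hypothetical accessor returning a
   [constr Univ.in_universe_context_set]. *)
let use_stored_constr (sigma : evar_map) key : evar_map * constr =
  let (c, ctx) = lookup_my_table key in
  let subst, ctx' =
    Universes.fresh_universe_context_set_instance ctx in
  let c' = Vars.subst_univs_level_constr subst c in
  let sigma = Evd.merge_context_set Evd.univ_flexible sigma ctx' in
  (sigma, c')
```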
This means the universe levels of polymorphic objects in the constr
might get instantiated instead of generating equality constraints
(Evd.univ_rigid does the latter).

On this issue, I recommend forcing commands to take
[global_reference]s only; the user can declare their specialized terms
used as hints as constants, and this is cleaner. Alas,
backward-compatibility-wise, this is the only solution I found. In the
case of global_references only, it is just a matter of using
[Evd.fresh_global] / [pf_constr_of_global] to let the system take care
of universes.

Projections
===========

  | Proj of constant * constr

Projections are always applied to a term, which must be of a record
type (i.e., reducible to an inductive type [I params]). Type-checking,
reduction and conversion are fast (though not yet as fast as they could
be) because we don't keep parameters around. As you can see, it is
currently a [constant] that is used here to refer to the projection;
that will change to an abstract [projection] type in the future.
Basically, a projection constant records which inductive it is a
projection for, the number of params, and the actual position in the
constructor that must be projected. For compatibility reasons, we also
define an eta-expanded form (accessible from user syntax @f). The
constant_entry of a projection has both pieces of information.
Declaring a record (under [Set Primitive Projections]) will generate
such definitions. The API to declare them is not stable at the moment,
but the inductive type declaration also knows about the projections,
i.e., a record inductive type declaration contains an array of terms
representing the projections. This is used to implement eta-conversion
for record types (with at least one field and having all projections
definable), the canonical value being [Build_R (p1 x) ... (pn x)].
Unification and conversion work up to this eta rule.
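For ML code traversing terms, the new [Proj] node shows up in
[kind_of_term]; a small illustrative traversal (not from the API; it
treats [Proj (p, arg)] like the constant [p] applied to [arg]):

```ocaml
(* Sketch: count the uses of projection constant [p] in a term,
   descending through all subterms with fold_constr and handling
   the Proj node explicitly. *)
let rec count_proj_uses (p : constant) (c : constr) : int =
  match kind_of_term c with
  | Proj (p', arg) ->
      (if eq_constant p p' then 1 else 0) + count_proj_uses p arg
  | _ ->
      fold_constr (fun acc sub -> acc + count_proj_uses p sub) 0 c
```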
The records can also be universe polymorphic, of course, and we do not
need to keep track of the universe instance for the projections either.
Projections are reduced _eagerly_ everywhere, and introduce a new Zproj
constructor in the abstract machines that obeys both the delta law (for
the constant's opacity) and the iota law (for the actual reduction).
Refolding works as well (afaict), but there is a slight hack there
related to universes (not projections).

For the ML programmer, the biggest change is that pattern-matchings on
kind_of_term require an additional case, which is usually handled
exactly like an [App (Const p) arg].

There are slight hacks related to hints as well, to use the primitive
projection form of f when one does [Hint Resolve f]. Usually Hint
Resolve will typecheck the term, resulting in a partially applied
projection (disallowed), so we allow it to take
[constr_or_global_reference] arguments instead and special-case on
projections. Other tactic extensions might need similar treatment.

WIP
===

- [vm_compute] does not deal with universes and projections correctly,
  except when it goes to a normal form with no projections or
  polymorphic constants left (the most common case). E.g., Ring with
  Set Universe Polymorphism and Set Primitive Projections works (at
  least it did at some point; I have not rechecked yet).

- [native_compute] is untested: it should deal with primitive
  projections correctly, but not with universes.


Incompatibilities
=================

Old-style universe polymorphic definitions were implemented by taking
advantage of the fact that elaboration (i.e., pretyping and
unification) was _not_ universe-aware, so some of the constraints
generated during pretyping would be forgotten. In the current setting
this is not possible, as unification ensures that the substitution
being built is entirely well-typed, even w.r.t. universes.
This means that some terms that type-checked before no longer do,
especially projections of the pair:

@fst ?x ?y : prod ?x ?y : Type (max(Datatypes.i, Datatypes.j))

The "template universe polymorphic" variables i and j appear during
typing without being refreshed, meaning that they can be lowered (get
upper constraints) by user-introduced universes. In most cases this
won't work, so ?x and ?y have to be instantiated earlier, either from
the type of the actual projected pair term (some t : prod A B) or from
the typing constraint. Adding the correct type annotations will always
fix this.


Unification semantics
=====================

In Ltac, matching with:

- a universe polymorphic constant [c] matches any instance of the
  constant.
- a variable ?x already bound to a term [t] (non-linear pattern) uses
  strict equality of universes (e.g., Type@{i} and Type@{j} are not
  equal).

In tactics:

- [change foo with bar] and [pattern foo] will unify all instances of
  [foo] (and convert them with [bar]). This might incur unification of
  universes. [change] uses conversion, while [pattern] only does
  syntactic matching up to unification of universes.
- [apply] and [refine] use unification up to universes.