| Commit message | Author | Age |
|
|
|
|
| |
New module, introduced in OCaml 4.05 I think; it can create problems when
linking with the OCaml toplevel for `Drop`.
|
|
|
|
| |
This will allow merging `Names` back with `API.Names`.
|
|
|
|
| |
This is a first step towards some of the solutions proposed in #6008.
|
| | |
Reminder of (some of) the reasons for removal:
- Despite the claim in sigma.mli, it does *not* prevent evar leaks;
  something like:
    fun env evd ->
      let (evd', ev) = new_evar env evd in
      (evd, ev)
  will typecheck even with Sigma-like type annotations (with a proof of
  reflexivity); a self-contained sketch of this leak is given at the end
  of this message.
- The API stayed embryonic. Even typing functions were not ported to
  Sigma.
- Some unsafe combinators (Unsafe.tclEVARS) were replaced with slightly
  less unsafe ones (e.g. s_enter), but these were not marked unsafe at
  all (despite still being so).
- There was no good story for higher-order functions manipulating evar
  maps. Without higher-order functions, one can most of the time get
  away with reusing the same name for the updated evar map.
- Most of the code doing complex things with evar maps was using unsafe
  casts to sigma. This code should be fixed, but this is an orthogonal
  issue.
Of course, this was a nice and elegant use of GADTs, but in practice the
cost/benefit ratio did not seem good.
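To make the first point concrete, here is a minimal, hedged OCaml sketch of a
Sigma-like phantom-indexed evar map. The module and the names (EvarMap, le,
Refl, new_evar) are illustrative stand-ins, not the actual Sigma API; the only
point is that returning the stale map with a reflexivity witness typechecks.
```ocaml
(* Illustrative sketch only: names and types are stand-ins, not Coq's Sigma. *)
module EvarMap : sig
  type 'r t                                  (* evar map with a phantom index *)
  type ('r, 's) le = Refl : ('r, 'r) le      (* "the map only grew" witness *)
  type 'r sigma = Sigma : int * 's t * ('r, 's) le -> 'r sigma
  val empty : unit t
  val new_evar : 'r t -> 'r sigma            (* fresh evar + updated map *)
end = struct
  type 'r t = int list                       (* the phantom index is erased *)
  type ('r, 's) le = Refl : ('r, 'r) le
  type 'r sigma = Sigma : int * 's t * ('r, 's) le -> 'r sigma
  let empty = []
  let new_evar evd = let ev = List.length evd in Sigma (ev, ev :: evd, Refl)
end

(* The leak: allocate an evar, drop the updated map, and return the stale one,
   using Refl as the monotonicity proof.  Nothing in the types prevents it. *)
let leak (evd : 'r EvarMap.t) : 'r EvarMap.sigma =
  match EvarMap.new_evar evd with
  | EvarMap.Sigma (ev, _evd', _) -> EvarMap.Sigma (ev, evd, EvarMap.Refl)
```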
|
| | |
This exports two functions:
- declare_reduction_effect: to declare a hook to be applied when some
  constants are visited during the execution of some reduction functions
  (primarily cbv, but also cbn, simpl, hnf, ...).
- set_reduction_effect: to declare a constant on which a given effect
  hook should be called.
Developed jointly by Thomas Sibut-Pinote and Hugo Herbelin.
Added support for printing effect in functions of tacred.ml.
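As a hedged sketch of the registration pattern only (the real signatures live
in pretyping/tacred.mli and operate on Coq constants and environments; plain
strings stand in for them here):
```ocaml
(* Sketch of the two-table pattern; strings stand in for Coq constants. *)
let effects : (string, string -> unit) Hashtbl.t = Hashtbl.create 17
let constants : (string, string) Hashtbl.t = Hashtbl.create 17

(* declare_reduction_effect: register a named hook to run when reduction
   visits a constant associated with that hook. *)
let declare_reduction_effect name hook = Hashtbl.replace effects name hook

(* set_reduction_effect: associate a constant with a declared hook. *)
let set_reduction_effect constant name = Hashtbl.replace constants constant name

(* Called by the reduction functions (cbv, cbn, simpl, hnf, ...) whenever a
   constant is unfolded. *)
let apply_reduction_effect constant =
  match Hashtbl.find_opt constants constant with
  | None -> ()
  | Some name ->
    (match Hashtbl.find_opt effects name with
     | Some hook -> hook constant
     | None -> ())
```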
|
| |
As per https://github.com/coq/coq/pull/716#issuecomment-305140839
Partially using
```bash
git grep --name-only 'anomaly\s*\(~label:"[^"]*"\s*\)\?\(Pp.\)\?(\(\(Pp.\)\?str\)\?\s*".*[^\.!]")' | xargs sed s'/\(anomaly\s*\(~label:"[^"]*"\s*\)\?\(Pp.\)\?(\(\(Pp.\)\?str\)\?\s*".*\s*[^\.! ]\)\s*")/\1.")/g' -i
```
and
```bash
git grep --name-only ' !"' | xargs sed s'/ !"/!"/g' -i
```
The rest were manually edited by looking at the results of
```bash
git grep anomaly | grep '\.ml' | grep -v 'anomaly\s*\(~label:"[^"]*"\s*\)\?\(Pp\.\)\?(\(\(Pp.\)\?str\)\?\s*".*\(\.\|!\)")' | grep 'anomaly\($\|[^_]\)' | less
```
|
| |
This is the continuation of #244: we now deprecate `CErrors.error`;
the single entry point in Coq is `user_err`.
The rationale is to allow for easier grepping, and to ease a future
cleanup of error messages. In particular, we would like to
systematically classify all error messages raised by Coq and be sure
they are properly documented.
We restore the two functions removed in #244 to improve compatibility,
but mark them deprecated.
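As a hedged sketch of the compatibility pattern (not the actual CErrors code,
which works with Pp.t rather than plain strings, and places the attribute in
the .mli), the old entry point becomes a thin wrapper over user_err and is
marked deprecated:
```ocaml
(* Sketch only: strings stand in for Pp.t documents. *)
exception UserError of string option * string

(* The single entry point for user-facing errors. *)
let user_err ?hdr msg = raise (UserError (hdr, msg))

(* Kept for compatibility, but deprecated; callers get a warning. *)
let error msg = user_err msg
  [@@ocaml.deprecated "Use user_err instead."]
```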
|
|
|
|
|
|
| |
The transition has been done a bit brutally. I think we can still avoid a
lot of useless normalizations here and there by providing the right API
in EConstr. Nonetheless, this is a first step.
|
|
|
|
|
|
| |
Incidentally, this fixes a printing bug in output/inference.v where the
displayed name of an evar was the wrong one because its type was not
evar-expanded enough.
|
|
|
|
|
|
| |
This removes quite a few unsafe casts. Unfortunately, I had to reintroduce
the old non-module-based names for these data structures, because I could
not easily reproduce the same hierarchy in EConstr.
|
|
|
|
|
| |
We make mli files look as they did before the move to EConstr
by opening this module.
|
| | |
mainly concerning references to the "Context.{Rel,Named}.get_{id,value,type}" functions.
If multiple modules define a function with the same name, e.g.:
  Context.{Rel,Named}.get_type
those calls were qualified with the corresponding module prefix
to make it obvious which function is being called.
|
| |
| |
| |
| | |
Suggested by @ppedrot
|
|/
|
|
|
|
|
| |
As noted by @ppedrot, the first is redundant. The patch is basically a renaming.
We didn't make the component optional yet, but this could happen in a
future patch.
|
|
|
|
| |
For the moment, there is a Closure module in compiler-libs/ocamloptcomp.cm(x)a
|
|
|
|
|
|
| |
For the moment, there is an Error module in compiler-libs/ocamlbytecomp.cm(x)a
|
| |
This is a reimplementation of Hugo's PR#117.
We are trying to address the problem that the names of some reduction
functions did not say what they were doing (e.g. whd_betadeltaiota was also
performing let-in reduction). Like PR#117, we are careful that no function
changed semantics without changing its name. Porting existing ML code should
be a matter of renaming a few function calls.
Also, we introduce more precise reduction flags fMATCH, fFIX, fCOFIX,
collectively denominated iota (an illustrative sketch follows the rename
list below).
We renamed the following functions:
Closure.betadeltaiota -> Closure.all
Closure.betadeltaiotanolet -> Closure.allnolet
Reductionops.beta -> Closure.beta
Reductionops.zeta -> Closure.zeta
Reductionops.betaiota -> Closure.betaiota
Reductionops.betaiotazeta -> Closure.betaiotazeta
Reductionops.delta -> Closure.delta
Reductionops.betalet -> Closure.betazeta
Reductionops.betadelta -> Closure.betadeltazeta
Reductionops.betadeltaiota -> Closure.all
Reductionops.betadeltaiotanolet -> Closure.allnolet
Closure.no_red -> Closure.nored
Reductionops.nored -> Closure.nored
Reductionops.nf_betadeltaiota -> Reductionops.nf_all
Reductionops.whd_betadelta -> Reductionops.whd_betadeltazeta
Reductionops.whd_betadeltaiota -> Reductionops.whd_all
Reductionops.whd_betadeltaiota_nolet -> Reductionops.whd_allnolet
Reductionops.whd_betadelta_stack -> Reductionops.whd_betadeltazeta_stack
Reductionops.whd_betadeltaiota_stack -> Reductionops.whd_all_stack
Reductionops.whd_betadeltaiota_nolet_stack -> Reductionops.whd_allnolet_stack
Reductionops.whd_betadelta_state -> Reductionops.whd_betadeltazeta_state
Reductionops.whd_betadeltaiota_state -> Reductionops.whd_all_state
Reductionops.whd_betadeltaiota_nolet_state -> Reductionops.whd_allnolet_state
Reductionops.whd_eta -> Reductionops.shrink_eta
Tacmach.pf_whd_betadeltaiota -> Tacmach.pf_whd_all
Tacmach.New.pf_whd_betadeltaiota -> Tacmach.New.pf_whd_all
And removed the following ones:
Reductionops.whd_betaetalet
Reductionops.whd_betaetalet_stack
Reductionops.whd_betaetalet_state
Reductionops.whd_betadeltaeta_stack
Reductionops.whd_betadeltaeta_state
Reductionops.whd_betadeltaeta
Reductionops.whd_betadeltaiotaeta_stack
Reductionops.whd_betadeltaiotaeta_state
Reductionops.whd_betadeltaiotaeta
They were unused, and having some reduction functions perform eta is
confusing, as whd_all and nf_all do not do it.
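Regarding the finer iota flags mentioned above, here is a hedged, toy
illustration (not the actual Closure code, whose flags are values such as
fMATCH/fFIX/fCOFIX rather than constructors): iota is simply the union of the
match, fix and cofix flags, and all/allnolet differ only in zeta.
```ocaml
(* Sketch only: a toy flag set showing how iota decomposes. *)
type red_flag = FBeta | FDelta | FZeta | FMatch | FFix | FCoFix

module Flags = Set.Make (struct
  type t = red_flag
  let compare = Stdlib.compare
end)

(* iota is the union of the three new, finer flags. *)
let iota = Flags.of_list [FMatch; FFix; FCoFix]

(* all enables everything; allnolet is all minus zeta (let-in reduction). *)
let all = Flags.of_list [FBeta; FDelta; FZeta; FMatch; FFix; FCoFix]
let allnolet = Flags.remove FZeta all
```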
|
|\ |
|
| | |
|
| |
| |
| |
| |
| | |
Some functions were left in the old paradigm because they are only used by the
unification algorithms, so they are not worth changing for now.
|
| |
Originally, rel-context was represented as:
  Context.rel_context = Names.Name.t * Constr.t option * Constr.t
Now it is represented as:
  Context.Rel.t = LocalAssum of Names.Name.t * Constr.t
                | LocalDef of Names.Name.t * Constr.t * Constr.t
Originally, named-context was represented as:
  Context.named_context = Names.Id.t * Constr.t option * Constr.t
Now it is represented as:
  Context.Named.t = LocalAssum of Names.Id.t * Constr.t
                  | LocalDef of Names.Id.t * Constr.t * Constr.t
Motivation:
(1) In the "tactics/hipattern.ml4" file we define the "test_strict_disjunction"
function, which looked like this:
    let test_strict_disjunction n lc =
      Array.for_all_i (fun i c ->
          match (prod_assum (snd (decompose_prod_n_assum n c))) with
          | [_,None,c] -> isRel c && Int.equal (destRel c) (n - i)
          | _ -> false) 0 lc
Suppose that you do not know about rel-context and named-context
(as is the case for people who have just started reading the source code).
Merlin would tell you that the type of the value you are destructing
by "match" is:
    'a * 'b option * Constr.t                    (* worst-case scenario *)
or
    Names.Name.t * Constr.t option * Constr.t    (* best-case scenario (?) *)
To me, this is akin to wearing an opaque veil.
It is hard to figure out the meaning of the values you are looking at.
In particular, it is hard to discover the connection between the value
we are destructing above and the datatypes and functions defined
in the "kernel/context.ml" file.
In this case, the connection is there, but it is not visible
(between the function above and the "Context" module).
------------------------------------------------------------------------
Now consider what happens when the reader sees the same function
presented in the following form:
    let test_strict_disjunction n lc =
      Array.for_all_i (fun i c ->
          match (prod_assum (snd (decompose_prod_n_assum n c))) with
          | [LocalAssum (_,c)] -> isRel c && Int.equal (destRel c) (n - i)
          | _ -> false) 0 lc
If the reader hasn't seen "LocalAssum" before, (s)he can use Merlin
to jump to the corresponding definition and learn more.
In this case, the connection is there, and it is directly visible
(between the function above and the "Context" module).
(2) Also, if we already have concepts such as:
    - local declaration
    - local assumption
    - local definition
and we describe these notions meticulously in the Reference Manual,
then it is a real pity not to reinforce the connection
between the actual code and the abstract description we published.
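To restate the new shape in a compilable form, here is a minimal,
self-contained sketch; strings stand in for Names.Name.t/Names.Id.t and
Constr.t, and the real declarations live in kernel/context.ml.
```ocaml
(* Sketch only: simplified stand-ins for the kernel types. *)
type name = string
type constr = string

type declaration =
  | LocalAssum of name * constr            (* name : type         *)
  | LocalDef   of name * constr * constr   (* name := body : type *)

(* With named constructors, context-inspecting code documents itself. *)
let get_type = function
  | LocalAssum (_, ty)  -> ty
  | LocalDef (_, _, ty) -> ty

let is_local_def = function LocalDef _ -> true | LocalAssum _ -> false
```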
|
|\ |
|
| | |
|
|/
|
|
|
| |
Note: they do not even seem to have a debugging purpose, so it is better to
remove them before they bitrot.
|
| |
Some functions from pretyping/typing.ml and their derivatives were a potential
source of evar map leaks, as they dropped their resulting evar map. This commit
clarifies the situation by renaming them according to an unsafe_* scheme. Their
sound variants are likewise renamed to the old names. The following renamings
were made.
- Typing.type_of -> unsafe_type_of
- Typing.e_type_of -> type_of
- A new e_type_of function that matches the e_ prefix policy
- Tacmach.pf_type_of -> pf_unsafe_type_of
- A new safe pf_type_of function.
All uses of the unsafe_* functions should eventually be eliminated.
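As a hedged sketch of the distinction (the toy types below stand in for
Environ.env, Evd.evar_map and constrs), the safe variant threads the updated
evar map back to the caller, while the unsafe_* variant silently drops it:
```ocaml
(* Sketch only: toy stand-ins for the Coq types. *)
type env = unit
type evar_map = int           (* counts allocated evars *)
type constr = string
type types = string

(* Pretend type inference may allocate evars, so the map can change. *)
let infer (_ : env) (sigma : evar_map) (c : constr) : evar_map * types =
  (sigma + 1, "type of " ^ c)

(* Safe: the caller receives the updated evar map and must thread it on. *)
let type_of env sigma c : evar_map * types = infer env sigma c

(* Unsafe: the updated map is discarded, which can leak unsolved evars. *)
let unsafe_type_of env sigma c : types = snd (infer env sigma c)
```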
|
| |
|
| |
|