- maxHeartbeats : Nat
Timeout for Lean code (Lean-auto + native provers)
- timeout : Nat
Timeout for external provers, i.e. TPTP solvers and SMT solvers
- solverConfig : SolverConfig
Solver configuration
Optional file for saving the log of the evaluation
Optional file for saving the result of the evaluation
On some problems, certain tactics may go into infinite loops that are not guarded by `Core.checkMaxHeartbeats`. Such instances should be recorded here so that they are skipped (an error is thrown immediately) during the evaluation.
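For orientation, here is a minimal sketch of how such a configuration might be filled in. The structure name `EvalAuto.EvalAutoConfig`, the `.native` constructor of `SolverConfig`, and the assumption that the remaining (optional) fields have defaults are guesses made for this illustration and are not stated on this page.

```lean
import Auto.EvaluateAuto.TestAuto

-- Hypothetical configuration value; every name below other than the three
-- documented fields (`maxHeartbeats`, `timeout`, `solverConfig`) is an
-- assumption and should be checked against the lean-auto source.
def exampleCfg : EvalAuto.EvalAutoConfig := {
  maxHeartbeats := 65536    -- budget for Lean code (Lean-auto + native provers)
  timeout       := 10       -- timeout for external TPTP/SMT solvers
  solverConfig  := .native  -- assumed constructor of `SolverConfig`
  -- the optional log/result-file fields and the non-terminating-instances
  -- field are left at their (assumed) defaults here
}
```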
Run `prover` on the type of `name`, using premises collected from the proof of `name`. Premises which only contain logic constants are filtered out, because they are assumed to be known by the prover.
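To make the premise-collection step concrete, here is a small self-contained sketch (not lean-auto's actual code) that gathers the constants used in the proof of a given theorem and drops those whose types mention nothing but a toy, hard-coded set of "logic constants"; the real notion of logic constant used by lean-auto may be defined differently.

```lean
import Lean
open Lean

/-- Illustrative sketch only: collect the constants used in the proof of `name`
    and drop those whose types mention nothing but a small, hard-coded set of
    "logic constants". The actual filter in lean-auto may differ. -/
def collectFilteredPremises (name : Name) : CoreM (Array Name) := do
  let env ← getEnv
  let some ci := env.find? name
    | throwError "unknown constant {name}"
  let some proof := ci.value?
    | throwError "{name} has no proof term"
  -- premises: constants occurring in the proof term
  let used := proof.getUsedConstants
  -- toy approximation of "logic constants"
  let logicConsts : Array Name := #[``Eq, ``And, ``Or, ``Not, ``Iff, ``True, ``False]
  let typeIsLogicOnly (premise : Name) : Bool :=
    match env.find? premise with
    | some info => info.type.getUsedConstants.all (fun c => logicConsts.contains c)
    | none      => false
  return used.filter (fun c => !typeIsLogicOnly c)
```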
Run lean-auto on all the constants listed in `names`. For the `i`-th theorem `name` in `names`, its entry in the result file has the following form:
`<i> <result> <time> <heartbeats> <name>`
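A minimal sketch of what parsing one such entry could look like, assuming the whitespace-separated layout shown above. The field types are guesses (the time column is kept as a raw string, since its unit and format are not stated here), and lean-auto's own result representation is certainly richer.

```lean
import Lean
open Lean

/-- One entry of the `.result` file, following the assumed layout
    `<i> <result> <time> <heartbeats> <name>`. -/
structure ResultLine where
  idx        : Nat
  result     : String
  time       : String   -- kept raw; the unit/format is not stated on this page
  heartbeats : Nat
  name       : Name

/-- Parse one whitespace-separated line; returns `none` on malformed input. -/
def parseResultLine (line : String) : Option ResultLine := do
  match (line.splitOn " ").filter (· ≠ "") with
  | [i, res, t, hb, nm] =>
    return {
      idx        := (← i.toNat?)
      result     := res
      time       := t
      heartbeats := (← hb.toNat?)
      name       := nm.toName
    }
  | _ => none
```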
Read the `.result` file generated by `runAutoOnConsts`.
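Continuing the sketch above (again hypothetical, not lean-auto's actual reader), the file could then be read line by line:

```lean
/-- Hypothetical reader for a `.result` file, reusing the `parseResultLine`
    sketch above; malformed lines are silently dropped. -/
def readResultFile (path : System.FilePath) : IO (Array ResultLine) := do
  let lines ← IO.FS.lines path
  return lines.filterMap parseResultLine
```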
Equations
- EvalAuto.runAutoOnNamesFile cfg fname = do
    let names ← liftM (EvalAuto.NameArray.load fname)
    EvalAuto.runAutoOnConsts cfg names
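A hypothetical invocation, assuming a configuration value such as the `exampleCfg` sketched earlier and a file of theorem names in whatever format `EvalAuto.NameArray.load` expects (which is not documented on this page):

```lean
-- Both the configuration and the file name here are placeholders.
#eval EvalAuto.runAutoOnNamesFile exampleCfg "batch0.names"
```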
- maxHeartbeats : Nat
Timeout for Lean code (Lean-auto + native provers)
- timeout : Nat
Timeout for external provers, i.e. TPTP solvers and SMT solvers
- solverConfig : SolverConfig
Solver configuration
- resultFolder : String
Folder for saving the result of the evaluation
- nprocs : Nat
Number of processes to use
- batchSize : Nat
Number of theorems in each evaluation batch
Memory limit for each evaluation process, in kb
Total time limit for each evaluation process, in seconds
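As with the first configuration structure, a hypothetical instance might look as follows. The structure name, the `.native` constructor, and the names of the memory- and time-limit fields are assumptions, so consult the lean-auto source for the real identifiers.

```lean
-- Illustrative only; field names not listed above (and the structure name
-- itself) are guesses.
def mathlibEvalCfg : EvalAuto.EvalAutoOnMathlibConfig := {
  maxHeartbeats := 65536
  timeout       := 10
  solverConfig  := .native
  resultFolder  := "./EvalResults"
  nprocs        := 8          -- worker processes
  batchSize     := 512        -- theorems per batch
  -- the memory limit (in kb) and the total time limit (in seconds) for each
  -- process must also be set; their field names are not shown on this page
}
```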
This should be run after `import Mathlib` and `import Auto.EvaluateAuto.TestAuto`, and from a working directory (cwd) in which `lake env` creates an environment where Mathlib, lean-auto and duper are available.
The evaluation splits all theorems in Mathlib into batches of size `config.batchSize` and uses `config.nprocs` parallel processes to run lean-auto on these theorems (a minimal sketch of the batching step is given after the list below). For each process, three files are created:
· `path.log`: detailed log
· `path.result`: concise result of the evaluation
· `path.names`: names of all the theorems in this batch
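As referenced above, here is a minimal, self-contained sketch of splitting an array of theorem names into batches of `config.batchSize`. It only illustrates the scheme and is not lean-auto's implementation.

```lean
import Lean

/-- Illustration only: split an array of theorem names into chunks of
    `batchSize`, which could then be distributed over `nprocs` workers. -/
def splitIntoBatches (names : Array Lean.Name) (batchSize : Nat) :
    Array (Array Lean.Name) := Id.run do
  if batchSize == 0 then return #[names]   -- avoid looping on empty chunks
  let mut batches : Array (Array Lean.Name) := #[]
  let mut cur : Array Lean.Name := #[]
  for n in names do
    cur := cur.push n
    if cur.size == batchSize then
      batches := batches.push cur
      cur := #[]
  if cur.size > 0 then
    batches := batches.push cur
  return batches
```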
Read the results generated by `evalAutoAtTheoremsAsync`.
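In the spirit of the earlier sketches (and again hypothetical rather than lean-auto's actual reader), the per-batch `.result` files in the result folder could be collected like this:

```lean
/-- Hypothetical aggregation of all `.result` files in `folder`, reusing the
    `parseResultLine` sketch from above. -/
def readAllResults (folder : System.FilePath) : IO (Array ResultLine) := do
  let mut acc : Array ResultLine := #[]
  for entry in (← folder.readDir) do
    if entry.path.extension == some "result" then
      let lines ← IO.FS.lines entry.path
      acc := acc ++ lines.filterMap parseResultLine
  return acc
```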
Read the `evaluateFiles.txt` generated by `evalAutoAtTheoremsAsync`.