Benchmark

non-incremental/QF_NIA/AProVE/aproveSMT4471552918718089052.smt2

AProVE team, see http://aprove.informatik.rwth-aachen.de/, submitted for SMT-COMP 2014
Benchmark
Size: 24253
Compressed Size: 3026
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1

Status: unsat
Inferred Status: unsat
Size: 24212
Compressed Size: 3011
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 93
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols (with occurrence counts)

not: 1
and: 2
=: 153
+: 300
-: 414
*: 1014
>=: 248
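Per-symbol counts like the ones above can be reproduced with a rough tokenizer over the benchmark's SMT-LIB text. The sketch below is an illustrative approximation, not the tool that generated these statistics, and the embedded .smt2 fragment is a made-up example rather than an excerpt from this benchmark.

```python
import re
from collections import Counter

def count_symbols(smt2_text):
    """Count occurrences of selected SMT-LIB operators in raw text.

    Illustrative approximation: tokenizes on whitespace and parentheses
    instead of doing a full SMT-LIB parse, so it can miscount symbols
    appearing inside string literals or comments.
    """
    tokens = re.split(r"[\s()]+", smt2_text)
    ops = {"not", "and", "=", "+", "-", "*", ">="}
    return Counter(t for t in tokens if t in ops)

# Made-up QF_NIA-style assertion, for demonstration only.
example = "(assert (and (>= x 0) (not (= (+ x (* 2 y)) (- z 1)))))"
print(count_symbols(example))
```

Running the real counter over the whole benchmark file would yield the table above.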

Evaluations

Evaluation     Rating      Solver       Variant                                                Result      Wallclock (s)  CPU Time (s)
SMT-COMP 2017  0.60 (2/5)  AProVE       AProVE NIA 2014 default                                unknown ❌   600.05400      610.18000
SMT-COMP 2017  0.60 (2/5)  CVC4         CVC4-smtcomp2017-main default                          unsat ✅       5.04224        5.04129
SMT-COMP 2017  0.60 (2/5)  SMT-RAT      SMTRAT-comp2017_2 default                              unknown ❌   600.08500      600.05000
SMT-COMP 2017  0.60 (2/5)  Yices2       Yices2-Main default                                    unsat ✅       1.36603        1.36472
SMT-COMP 2017  0.60 (2/5)  Z3           z3-4.5.0 default                                       unknown ❌   600.06400      599.96000
SMT-COMP 2018  0.40 (3/5)  AProVE       AProVE NIA 2014_default                                unknown ❌  1200.08000     1211.71000
SMT-COMP 2018  0.40 (3/5)  CVC4         master-2018-06-10-b19c840-competition-default_default  unsat ✅      19.67160       19.67010
SMT-COMP 2018  0.40 (3/5)  SMT-RAT      SMTRAT-Rat-final_default                               unknown ❌  1200.06000     1199.80000
SMT-COMP 2018  0.40 (3/5)  Yices2       Yices 2.6.0_default                                    unsat ✅       0.96376        0.96373
SMT-COMP 2018  0.40 (3/5)  Z3           z3-4.7.1_default                                       unsat ✅       8.42996        8.42956
SMT-COMP 2024  0.25 (3/4)  cvc5         cvc5                                                   unsat ✅       7.17760        7.07828
SMT-COMP 2024  0.25 (3/4)  SMTInterpol  SMTInterpol                                            unknown ❌     0.71978        1.52957
SMT-COMP 2024  0.25 (3/4)  Yices2       Yices2                                                 unsat ✅       0.48249        0.38107
SMT-COMP 2024  0.25 (3/4)  Z3alpha      Z3-alpha                                               unsat ✅      14.58520       14.47716
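The Rating column is consistent with the fraction of participating solvers that failed on this query: 2 of 5 solved in 2017 gives 1 - 2/5 = 0.60, 3 of 5 in 2018 gives 0.40, and 3 of 4 in 2024 gives 0.25. A minimal sketch of that inferred formula (not an official SMT-COMP definition):

```python
def rating(solved, participants):
    # Fraction of solvers that did NOT solve the benchmark,
    # inferred from the three evaluations listed above.
    return 1 - solved / participants

# (solved, participants) pairs from the evaluations table.
for solved, total in [(2, 5), (3, 5), (3, 4)]:
    print(f"{solved}/{total} solved -> rating {rating(solved, total):.2f}")
```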