Benchmark

non-incremental/QF_NIA/AProVE/aproveSMT6115051464051862300.smt2

AProVE team, see http://aprove.informatik.rwth-aachen.de/, submitted for SMT-COMP 2014
Benchmark
Size: 1092
Compressed Size: 418
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 1051
Compressed Size: 394
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 6
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

not: 1, and: 2, =: 3, +: 8, -: 11, *: 29, >=: 14
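
For context, the sketch below shows the general shape of an SMT-LIB 2 benchmark matching the statistics above: a QF_NIA problem that declares a few integer constants and issues one check-sat call on a single assertion built from not, and, =, +, -, * and >=. It is a hypothetical illustration; the constant names and the formula are invented and are not the contents of aproveSMT6115051464051862300.smt2.

    ; Hypothetical sketch only -- not the actual AProVE benchmark.
    (set-logic QF_NIA)
    (set-info :status unsat)   ; the sketch is unsat: x*x + y*y + 1 > x*y for all integers
    (declare-fun x () Int)     ; "declared constants" are 0-ary functions like these
    (declare-fun y () Int)
    (declare-fun z () Int)
    (assert
      (and (>= x 0)
           (>= y 0)
           (>= (- x y) 0)
           (= z (+ (* x x) (* y y)))
           (not (>= (+ z 1) (* x y)))))
    (check-sat)
    (exit)

Nonlinear integer arithmetic is undecidable in general, which is consistent with the unknown results some solvers report in the evaluations below.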

Evaluations

Evaluation      Rating      Solver       Variant                                                  Result      Wallclock (s)  CPU Time (s)

SMT-COMP 2017   0.40 (3/5)  AProVE       AProVE NIA 2014 default                                  unknown ❌     600.02100      606.84000
                            CVC4         CVC4-smtcomp2017-main default                            unsat ✅         0.06623        0.06561
                            SMT-RAT      SMTRAT-comp2017_2 default                                unsat ✅         0.15798        0.15751
                            Yices2       Yices2-Main default                                      unsat ✅         0.00849        0.00755
                            Z3           z3-4.5.0 default                                         unknown ❌      23.47020       23.46790

SMT-COMP 2018   0.20 (4/5)  AProVE       AProVE NIA 2014_default                                  unknown ❌    1200.11000     1208.22000
                            CVC4         master-2018-06-10-b19c840-competition-default_default    unsat ✅         0.06653        0.06675
                            SMT-RAT      SMTRAT-Rat-final_default                                 unsat ✅         0.08376        0.08370
                            Yices2       Yices 2.6.0_default                                      unsat ✅         0.00995        0.00985
                            Z3           z3-4.7.1_default                                         unsat ✅         2.11945        2.11934

SMT-COMP 2025   0.20 (4/5)  cvc5         cvc5                                                     unsat ✅         0.33812        0.21261
                            SMTInterpol  SMTInterpol                                              unknown ❌       0.43533        0.43043
                            Yices2       Yices2                                                   unsat ✅         0.27887        0.15845
                            Z3alpha      Z3-alpha                                                 unsat ✅         0.40487        0.28376
                            Z3           Z3-alpha-base                                            unsat ✅         0.34003        0.22238
                                         z3siri-base                                              unsat ✅         0.35970        0.23761