Benchmark

non-incremental/QF_NIA/AProVE/aproveSMT7940703262999032810.smt2

AProVE team, see http://aprove.informatik.rwth-aachen.de/; submitted for SMT-COMP 2014.
Benchmark
Size: 9346
Compressed Size: 1447
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK (1)
Strict Dolmen: OK (1)
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 9305
Compressed Size: 1451
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 31
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol occurrence counts:

not: 1
and: 2
=: 27
+: 90
-: 147
*: 387
>=: 106
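For orientation, a QF_NIA benchmark with this operator profile (a single assert combining `and`, `not`, `>=`, and arithmetic over integer constants) typically has the following shape. This is an illustrative sketch only, not the contents of the actual file; the constant names and the concrete constraint are invented:

```smt2
; Illustrative sketch, NOT the actual aproveSMT7940703262999032810.smt2.
; AProVE benchmarks encode termination proof obligations as nonlinear
; integer constraints built from the operators counted above.
(set-logic QF_NIA)
(declare-const x0 Int)
(declare-const x1 Int)
; one assert, as in the real benchmark; unsatisfiable because
; x0*x0 + x1*x1 + 1 >= 0 holds for all integers
(assert (and (>= x0 0) (>= x1 0)
             (not (>= (+ (* x0 x0) (* x1 x1) 1) 0))))
(check-sat)
; a solver answers: unsat
```

The real benchmark is much larger (31 declared constants, term depth 7) but follows the same pattern: a single conjunction of (in)equalities over nonlinear integer polynomials.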

Evaluations

Columns: Solver | Variant | Result | Wallclock (s) | CPU Time (s)

SMT-COMP 2017 — rating 0.40 (3/5)
  AProVE      | AProVE NIA 2014 default       | unknown ❌ | 600.10600 | 606.45000
  CVC4        | CVC4-smtcomp2017-main default | unsat ✅   | 3.31995   | 3.32015
  SMT-RAT     | SMTRAT-comp2017_2 default     | unknown ❌ | 600.03300 | 599.96000
  Yices2      | Yices2-Main default           | unsat ✅   | 0.09999   | 0.09938
  Z3          | z3-4.5.0 default              | unsat ✅   | 22.01020  | 22.00850

SMT-COMP 2018 — rating 0.40 (3/5)
  AProVE      | AProVE NIA 2014_default       | unknown ❌ | 1200.06000 | 1210.95000
  CVC4        | master-2018-06-10-b19c840-competition-default_default | unsat ✅ | 9.14425 | 9.14366
  SMT-RAT     | SMTRAT-Rat-final_default      | unknown ❌ | 1200.03000 | 1199.89000
  Yices2      | Yices 2.6.0_default           | unsat ✅   | 0.07007    | 0.06999
  Z3          | z3-4.7.1_default              | unsat ✅   | 15.57300   | 15.57150

SMT-COMP 2024 — rating 0.25 (3/4)
  cvc5        | cvc5                          | unsat ✅   | 22.40248  | 22.29010
  SMTInterpol | SMTInterpol                   | unknown ❌ | 0.55442   | 0.90080
  Yices2      | Yices2                        | unsat ✅   | 0.26192   | 0.16202
  Z3alpha     | Z3-alpha                      | unsat ✅   | 11.07493  | 10.97010

SMT-COMP 2025 — rating 0.20 (4/5)
  cvc5        | cvc5                          | unsat ✅   | 17.08805  | 16.95684
  SMTInterpol | SMTInterpol                   | unknown ❌ | 0.55275   | 0.79471
  Yices2      | Yices2                        | unsat ✅   | 0.34143   | 0.22210
  Z3alpha     | Z3-alpha                      | unsat ✅   | 0.62318   | 1.15961
  Z3          | Z3-alpha-base                 | unsat ✅   | 0.86193   | 0.73938
  —           | z3siri-base                   | unsat ✅   | 0.86925   | 0.74948