Benchmark

non-incremental/QF_NIA/AProVE/aproveSMT4635890893149525450.smt2

AProVE team, see http://aprove.informatik.rwth-aachen.de/; submitted for SMT-COMP 2014
Benchmark

Size: 1553
Compressed Size: 462
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1

Status: unsat
Inferred Status: unsat
Size: 1512
Compressed Size: 441
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 7
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

not: 1    and: 2    =: 3    +: 10    -: 22    *: 52    >=: 15
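For context, the query statistics above (one assert over seven declared constants, using only the listed symbols, with one check-sat call) correspond to an SMT-LIB 2 script of roughly the following shape. This is a hypothetical sketch for illustration only; the constant names and the formula are invented, not the benchmark's actual content:

```smt2
; Hypothetical QF_NIA script with the same overall shape as the benchmark:
; 7 declared integer constants (zero-argument functions), a single assert
; built from not/and/=/+/-/*/>=, and one check-sat call.
(set-logic QF_NIA)
(declare-fun x1 () Int)
(declare-fun x2 () Int)
(declare-fun x3 () Int)
(declare-fun x4 () Int)
(declare-fun x5 () Int)
(declare-fun x6 () Int)
(declare-fun x7 () Int)
(assert
  (and (>= x1 0)
       (not (= (+ (* x1 x2) x3)
               (- (* x4 x5) (* x6 x7))))))
(check-sat)
(exit)
```

The recorded status "unsat" means the real benchmark's asserted formula has no satisfying integer assignment; since QF_NIA (quantifier-free nonlinear integer arithmetic) is undecidable in general, some solvers may legitimately answer "unknown" on such queries.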

Evaluations

Evaluation      Rating      Solver       Variant                                                 Result      Wallclock (s)  CPU Time (s)
SMT-COMP 2017   0.40 (3/5)  AProVE       AProVE NIA 2014 default                                 unknown ❌  600.02200      607.40000
                            CVC4         CVC4-smtcomp2017-main default                           unsat ✅    0.15081        0.14977
                            SMT-RAT      SMTRAT-comp2017_2 default                               unknown ❌  600.04600      600.03800
                            Yices2       Yices2-Main default                                     unsat ✅    0.02139        0.02074
                            Z3           z3-4.5.0 default                                        unsat ✅    14.16450       14.16200
SMT-COMP 2018   0.40 (3/5)  AProVE       AProVE NIA 2014_default                                 unknown ❌  1200.07000     1209.12000
                            CVC4         master-2018-06-10-b19c840-competition-default_default   unsat ✅    0.14695        0.14711
                            SMT-RAT      SMTRAT-Rat-final_default                                unknown ❌  1200.04000     1200.00000
                            Yices2       Yices 2.6.0_default                                     unsat ✅    0.02464        0.02454
                            Z3           z3-4.7.1_default                                        unsat ✅    1.62644        1.62624
SMT-COMP 2024   0.25 (3/4)  cvc5         cvc5                                                    unsat ✅    0.31526        0.21552
                            SMTInterpol  SMTInterpol                                             unknown ❌  0.43808        0.47878
                            Yices2       Yices2                                                  unsat ✅    0.23416        0.13449
                            Z3alpha      Z3-alpha                                                unsat ✅    0.59738        0.49783
SMT-COMP 2025   0.20 (4/5)  cvc5         cvc5                                                    unsat ✅    0.39377        0.26737
                            SMTInterpol  SMTInterpol                                             unknown ❌  0.42984        0.43016
                            Yices2       Yices2                                                  unsat ✅    0.29651        0.16867
                            Z3alpha      Z3-alpha                                                unsat ✅    0.42829        0.39646
                            Z3           Z3-alpha-base                                           unsat ✅    0.45600        0.33712
                                         z3siri-base                                             unsat ✅    0.46987        0.34405