Benchmark

non-incremental/QF_NIA/AProVE/aproveSMT3461177408214304445.smt2

AProVE team, see http://aprove.informatik.rwth-aachen.de/, submitted for SMT-COMP 2014
Benchmark
Size: 4438
Compressed Size: 898
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen OK: 1
Strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 4397
Compressed Size: 868
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 32
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

not: 1, and: 2, =: 6, +: 34, -: 56, *: 135, >=: 64
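The symbol counts above describe a single large nonlinear integer-arithmetic assertion. As an illustration only (this is not the actual benchmark; the constants and the assertion are invented), a tiny QF_NIA script with the same ingredients and an unsat status looks like:

```smt2
(set-logic QF_NIA)
(set-info :status unsat)
(declare-const x Int)
(declare-const y Int)
; nonlinear integer arithmetic: a sum of squares is never negative,
; so negating (>= (+ x*x y*y) 0) makes the conjunction unsatisfiable
(assert (and (>= x 0)
             (>= y 0)
             (not (>= (+ (* x x) (* y y)) 0))))
(check-sat)
```

Any SMT-LIB 2 solver supporting QF_NIA should report unsat on this sketch, mirroring the status recorded for the benchmark.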

Evaluations

Evaluation      Rating      Solver       Variant                                                 Result      Wallclock (s)  CPU Time (s)
SMT-COMP 2017   0.60 (2/5)  AProVE       AProVE NIA 2014 default                                 unknown ❌  600.11200      607.53000
                            CVC4         CVC4-smtcomp2017-main default                           unsat ✅    0.17924        0.17802
                            SMT-RAT      SMTRAT-comp2017_2 default                               unknown ❌  600.06800      600.08100
                            Yices2       Yices2-Main default                                     unsat ✅    0.06982        0.06920
                            Z3           z3-4.5.0 default                                        unknown ❌  57.52180       57.52140
SMT-COMP 2018   0.40 (3/5)  AProVE       AProVE NIA 2014_default                                 unknown ❌  1200.12000     1208.79000
                            CVC4         master-2018-06-10-b19c840-competition-default_default   unsat ✅    0.16379        0.16398
                            SMT-RAT      SMTRAT-Rat-final_default                                unknown ❌  1200.01000     1199.74000
                            Yices2       Yices 2.6.0_default                                     unsat ✅    0.09620        0.09614
                            Z3           z3-4.7.1_default                                        unsat ✅    3.55779        3.55698
SMT-COMP 2024   0.25 (3/4)  cvc5         cvc5                                                    unsat ✅    0.35100        0.25131
                            SMTInterpol  SMTInterpol                                             unknown ❌  0.46896        0.58773
                            Yices2       Yices2                                                  unsat ✅    0.24098        0.14103
                            Z3alpha      Z3-alpha                                                unsat ✅    1.29050        1.19003
SMT-COMP 2025   0.20 (4/5)  cvc5         cvc5                                                    unsat ✅    0.39780        0.27387
                            SMTInterpol  SMTInterpol                                             unknown ❌  0.46746        0.53567
                            Yices2       Yices2                                                  unsat ✅    0.30440        0.18663
                            Z3alpha      Z3-alpha                                                unsat ✅    0.56654        0.98655
                            Z3           Z3-alpha-base                                           unsat ✅    0.80474        0.68099
                                         z3siri-base                                             unsat ✅    0.80825        0.69033