Benchmark

non-incremental/QF_NIA/AProVE/aproveSMT2104748456448998709.smt2

AProVE team, see http://aprove.informatik.rwth-aachen.de/; submitted for SMT-COMP 2014
Benchmark
Size 7801
Compressed Size 1325
License Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category industrial
First Occurrence 2014-07-21
Dolmen OK
strict Dolmen OK
check-sat calls 1
Query 1
Status unknown
Inferred Status unsat
Size 7760
Compressed Size 1299
Max. Term Depth 7
Asserts 1
Declared Functions 0
Declared Constants 51
Declared Sorts 0
Defined Functions 0
Defined Recursive Functions 0
Defined Sorts 0
Constants 0
Declared Datatypes 0

Symbols

Symbol  Occurrences
not     1
and     2
=       18
+       71
-       102
*       279
>=      107
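The counts above tally how often each operator occurs in the benchmark's single assertion; together with the logic QF_NIA they indicate a nonlinear integer arithmetic problem built from Boolean combinations of (in)equalities over integer constants. A minimal QF_NIA script using the same operators might look like the following sketch (purely illustrative; the declared constants `x` and `y` are invented here, and this is not an excerpt of the benchmark itself):

```smt2
(set-logic QF_NIA)
; two of the benchmark's 51 declared integer constants, by analogy
(declare-const x Int)
(declare-const y Int)
; a single assertion combining not, and, =, +, -, *, and >=
(assert (not (and (>= (* x x) 0)
                  (= (+ x y) (- x y)))))
(check-sat)
```

The real benchmark has max. term depth 7 and 279 multiplications inside its one assertion, which is what makes it hard for most solvers despite its small size.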

Evaluations

Evaluation     Rating      Solver       Variant                                                Result      Wallclock (s)  CPU Time (s)
SMT-COMP 2017  0.80 (1/5)  AProVE       AProVE NIA 2014 default                                unknown ❌  600.02000      607.35000
                           CVC4         CVC4-smtcomp2017-main default                          unknown ❌  600.02900      598.40000
                           SMT-RAT      SMTRAT-comp2017_2 default                              unknown ❌  600.01900      599.97600
                           Yices2       Yices2-Main default                                    unsat ✅    13.04570       13.04500
                           Z3           z3-4.5.0 default                                       unknown ❌  600.01000      599.92100
SMT-COMP 2018  0.60 (2/5)  AProVE       AProVE NIA 2014_default                                unknown ❌  1200.03000     1208.99000
                           CVC4         master-2018-06-10-b19c840-competition-default_default  unknown ❌  1200.01000     1193.71000
                           SMT-RAT      SMTRAT-Rat-final_default                               unknown ❌  1200.04000     1199.88000
                           Yices2       Yices 2.6.0_default                                    unsat ✅    0.26778        0.26623
                           Z3           z3-4.7.1_default                                       unsat ✅    12.47180       12.47000
SMT-COMP 2024  0.50 (2/4)  cvc5         cvc5                                                   unknown ❌  1201.73672     1201.02023
                           SMTInterpol  SMTInterpol                                            unknown ❌  0.52597        0.82969
                           Yices2       Yices2                                                 unsat ✅    0.24105        0.14110
                           Z3alpha      Z3-alpha                                               unsat ✅    10.49871       10.39868