Benchmark

non-incremental/QF_NIA/AProVE/aproveSMT6771738228131904044.smt2

AProVE team, see http://aprove.informatik.rwth-aachen.de/, submitted for SMT-COMP 2014

Benchmark
Size: 2874
Compressed Size: 621
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
Status: unknown
Inferred Status: unsat
Size: 2833
Compressed Size: 595
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 18
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol  Count
not     1
and     2
=       3
+       20
-       35
*       102
>=      36
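The counts above appear to tally how often each operator occurs in the benchmark's single assert. A minimal sketch of how such a tally could be computed from SMT-LIB text (the inline formula here is a small illustrative stand-in, not the actual benchmark, and the tokenizer is a simplification that ignores strings and quoted symbols):

```python
import re
from collections import Counter

# Illustrative stand-in assert; the real benchmark's term is far larger.
smt2 = "(assert (and (>= (+ x (* 2 y)) 0) (not (= x (- y 1)))))"

# Split the S-expression into atoms, then tally the operators of interest.
tokens = re.findall(r"[^\s()]+", smt2)
ops = {"not", "and", "=", "+", "-", "*", ">="}
counts = Counter(t for t in tokens if t in ops)
print(dict(counts))
```

Running the same tally over the benchmark file itself would reproduce the table, modulo the tokenizer caveats above.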

Evaluations

Evaluation     Rating      Solver       Variant                                                Result      Wallclock   CPU Time
SMT-COMP 2017  0.80 (1/5)  AProVE       AProVE NIA 2014 default                                unknown ❌  600.06300   605.80500
                           CVC4         CVC4-smtcomp2017-main default                          unknown ❌  600.10900   599.04000
                           SMT-RAT      SMTRAT-comp2017_2 default                              unknown ❌  600.01600   599.90000
                           Yices2       Yices2-Main default                                    unsat ✅    0.22071     0.21977
                           Z3           z3-4.5.0 default                                       unknown ❌  217.59900   217.57600
SMT-COMP 2018  0.60 (2/5)  AProVE       AProVE NIA 2014_default                                unknown ❌  1200.03000  1205.98000
                           CVC4         master-2018-06-10-b19c840-competition-default_default  unknown ❌  1200.07000  1197.68000
                           SMT-RAT      SMTRAT-Rat-final_default                               unknown ❌  1200.06000  1199.98000
                           Yices2       Yices 2.6.0_default                                    unsat ✅    0.08384     0.08379
                           Z3           z3-4.7.1_default                                       unsat ✅    11.09360    11.09360
SMT-COMP 2024  0.50 (2/4)  cvc5         cvc5                                                   unknown ❌  1201.71389  1200.64457
                           SMTInterpol  SMTInterpol                                            unknown ❌  0.48581     0.56595
                           Yices2       Yices2                                                 unsat ✅    0.28338     0.18325
                           Z3alpha      Z3-alpha                                               unsat ✅    7.08340     6.98331
SMT-COMP 2025  1.00 (0/5)  cvc5         cvc5                                                   unknown ❌  1201.74665  1201.05650
                           SMTInterpol  SMTInterpol                                            unknown ❌  0.49882     0.53246
                           Yices2       Yices2                                                 unknown ❌  1201.28921  1201.03367
                           Z3alpha      Z3-alpha                                               unknown ❌  1201.78991  3626.35805
                           Z3           Z3-alpha-base                                          unknown ❌  1201.28484  1201.01786
                                        z3siri-base                                            unknown ❌  1201.29073  1200.92269