Benchmark

non-incremental/QF_NRA/meti-tarski/Chua/2/VC2/U/Chua-2-VC2-U-chunk-0099.smt2

These benchmarks were used in the paper:

  Dejan Jovanovic and Leonardo de Moura.  Solving Non-Linear Arithmetic.
  In IJCAR 2012, published as LNCS volume 7364, pp. 339--354.

The meti-tarski benchmarks are proof obligations extracted from the
Meti-Tarski project, see:

  B. Akbarpour and L. C. Paulson. MetiTarski: An automatic theorem prover
  for real-valued special functions. Journal of Automated Reasoning,
  44(3):175-205, 2010.

Submitted by Dejan Jovanovic for SMT-LIB.
Benchmark
Size: 4011
Compressed Size: 1378
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK (1)
strict Dolmen: OK (1)
check-sat calls: 1

Query 1
Status: unsat
Inferred Status: unsat
Size: 4003
Compressed Size: 1374
Max. Term Depth: 32
Asserts: 1
Declared Functions: 0
Declared Constants: 3
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0
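Statistics such as Asserts and Max. Term Depth are structural properties of the benchmark's s-expressions. As an illustration only (this is not the tool that produced the numbers above), the maximum term depth of an SMT-LIB file can be approximated by tracking parenthesis nesting:

```python
# Hypothetical sketch: estimate the maximum nesting depth of an
# SMT-LIB s-expression by scanning parentheses. It skips string
# literals but not ";" comments; a real tool would parse properly.

def max_term_depth(smt2_text: str) -> int:
    depth = best = 0
    in_string = False
    for ch in smt2_text:
        if ch == '"':                  # toggle on string literals
            in_string = not in_string
        elif not in_string:
            if ch == '(':
                depth += 1
                best = max(best, depth)
            elif ch == ')':
                depth -= 1
    return best

example = "(assert (<= (+ (* x x) (* y y)) 1))"
print(max_term_depth(example))  # → 4
```

Run over the whole .smt2 file this yields an upper bound on the reported term depth, since it also counts command-level parentheses.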

Symbols

not: 2, and: 6, =: 1, /: 60, +: 60, -: 34, *: 91, <=: 6
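Counts like these can in principle be reproduced with a small tokenizer over the .smt2 text. The sketch below is a simplified stand-in for whatever tool generated the table above (it ignores quoted symbols and string literals):

```python
# Hypothetical sketch: count occurrences of selected SMT-LIB symbols
# by tokenizing on parentheses and whitespace. Not the official
# statistics tool; string literals and |quoted symbols| are not handled.
from collections import Counter
import re

def count_symbols(smt2_text, symbols):
    # Parentheses are their own tokens; other tokens are maximal
    # runs of non-space, non-parenthesis characters.
    tokens = re.findall(r"[()]|[^\s()]+", smt2_text)
    return Counter(t for t in tokens if t in symbols)

src = "(assert (<= (+ (* x x) (* y y) (- z)) 1))"
counts = count_symbols(src, {"not", "and", "=", "/", "+", "-", "*", "<="})
print(counts["*"], counts["<="])  # → 2 1
```

Applied to the benchmark file itself, this would tally the arithmetic and Boolean operators listed above.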

Evaluations

Times are given as wallclock / CPU, in seconds.

SMT-COMP 2014, rating 0.75 (1/4):
  CVC3 (CVC3 default): unknown ❌, 2239.43000 / 2239.68000
  CVC4 (CVC4 f7118b2 default): unknown ❌, 0.02819 / 0.01800
  raSAT (raSAT-main-track-final default.sh): unknown ❌, 0.00894 / 0.00400
  Z3 (Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default): unsat ✅, 0.02446 / 0.01500

SMT-COMP 2015, rating 0.33 (4/6):
  CVC3 (CVC3 default): unknown ❌, 1792.95000 / 1792.96000
  CVC4 (CVC4-master-2015-06-15-9b32405-main default): unknown ❌, 0.02038 / 0.01800
  CVC4 (CVC4-experimental-2015-06-15-ff5745a-main default): unknown ❌, 0.02085 / 0.01900
  raSAT (raSAT default.sh): unsat ✅, 0.05251 / 0.04999
  SMT-RAT (SMT-RAT-final default): unsat ✅, 50.29810 / 50.30840
  Yices2 (Yices2-NL default): unsat ✅, 0.00818 / 0.00300
  Z3 (z3 4.4.0 default): unsat ✅, 0.03920 / 0.03899

SMT-COMP 2016, rating 0.20 (4/5):
  CVC4 (CVC4-master-2016-05-27-cfef263-main default): unknown ❌, 0.02087 / 0.02105
  raSAT (raSAT 0.3 default.sh): unsat ✅, 0.02192 / 0.02209
  raSAT (raSAT 0.4 exp - final default.py): unsat ✅, 0.05371 / 0.04636
  SMT-RAT (SMT-RAT default): unsat ✅, 0.06583 / 0.06581
  Yices2 (Yices-2.4.2 default): unsat ✅, 0.01171 / 0.00383
  Z3 (z3-4.4.1 default): unsat ✅, 0.03769 / 0.03889

SMT-COMP 2017, rating 0.20 (4/5):
  CVC4 (CVC4-smtcomp2017-main default): unknown ❌, 600.01600 / 599.79000
  SMT-RAT (SMTRAT-comp2017_2 default): unsat ✅, 0.07658 / 0.07548
  veriT (veriT+raSAT+Redlog default): unsat ✅, 0.18120 / 0.18543
  Yices2 (Yices2-Main default): unsat ✅, 0.01049 / 0.00453
  Z3 (z3-4.5.0 default): unsat ✅, 0.04736 / 0.04008

SMT-COMP 2018, rating 0.20 (4/5):
  CVC4 (master-2018-06-10-b19c840-competition-default_default): unknown ❌, 1200.02000 / 1198.25000
  SMT-RAT (SMTRAT-Rat-final_default): unsat ✅, 0.05987 / 0.05980
  SMT-RAT (SMTRAT-MCSAT-final_default): unsat ✅, 0.04457 / 0.04452
  veriT (veriT+raSAT+Reduce_default): unsat ✅, 0.01436 / 0.01431
  Yices2 (Yices 2.6.0_default): unsat ✅, 0.00722 / 0.00653
  Z3 (z3-4.7.1_default): unsat ✅, 0.04364 / 0.04363

SMT-COMP 2023 (no rating listed):
  cvc5 (cvc5-default-2023-05-16-ea045f305_sq): unsat ✅, 0.14173 / 0.14228
  NRA-LS (cvc5-NRA-LS-sq_default): unsat ✅, 0.41080 / 0.41072
  Par4 (Par4-wrapped-sq_default): unsat ✅, 0.01769 / 0.00627
  SMT-RAT (SMT-RAT-MCSAT_default): unsat ✅, 0.02424 / 0.02418
  Yices2 (Yices 2 for SMTCOMP 2023_default): unsat ✅, 0.01058 / 0.01011
  Z3alpha (z3alpha_default): unsat ✅, 0.03159 / 0.03179
  Z3++ (z3++0715_default): unsat ✅, 0.29526 / 0.29535
  Z3++ (Z3++_sq_0526_default): unsat ✅, 0.29842 / 0.29852