Benchmark

non-incremental/QF_NRA/meti-tarski/sin/problem/7/weak2/sin-problem-7-weak2-chunk-0115.smt2

These benchmarks were used in the paper:

  Dejan Jovanovic and Leonardo de Moura.  Solving Non-Linear Arithmetic.
  In IJCAR 2012, published as LNCS volume 7364, pp. 339--354.

The meti-tarski benchmarks are proof obligations extracted from the
Meti-Tarski project, see:

  B. Akbarpour and L. C. Paulson. MetiTarski: An automatic theorem prover
  for real-valued special functions. Journal of Automated Reasoning,
  44(3):175-205, 2010.

Submitted by Dejan Jovanovic for SMT-LIB.
Benchmark
  Size: 1483
  Compressed Size: 724
  License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
  Category: industrial
  First Occurrence: 2014-07-21
  Generated By:
  Generated On:
  Generator:
  Dolmen: OK 1
  strict Dolmen: OK 1
  check-sat calls: 1
Query 1
  Status: sat
  Inferred Status: sat
  Size: 1475
  Compressed Size: 726
  Max. Term Depth: 16
  Asserts: 1
  Declared Functions: 0
  Declared Constants: 3
  Declared Sorts: 0
  Defined Functions: 0
  Defined Recursive Functions: 0
  Defined Sorts: 0
  Constants: 0
  Declared Datatypes: 0

Symbols

  Symbol  Count
  not     10
  or       2
  and      9
  let      2
  /       12
  +        6
  -        3
  *       28
  <=      10
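For orientation, a QF_NRA benchmark with this shape (a single assertion over three declared real-valued constants, built only from the symbols counted above) looks roughly like the following sketch. This is a hypothetical illustration of the format, not the actual contents of sin-problem-7-weak2-chunk-0115.smt2:

```
; Hypothetical sketch of a QF_NRA benchmark of this shape;
; NOT the actual contents of sin-problem-7-weak2-chunk-0115.smt2.
(set-info :smt-lib-version 2.6)
(set-logic QF_NRA)
(set-info :status sat)
; three declared constants, as in the metadata above
(declare-fun x () Real)
(declare-fun y () Real)
(declare-fun z () Real)
; one assert, using let/and/not/<= and the arithmetic operators
(assert
  (let ((t (* x y)))
    (and (<= 0 t)
         (not (<= (+ t (* z z)) (- (/ x 2) y))))))
(check-sat)
(exit)
```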

Evaluations

Columns: Solver | Variant | Result | Wallclock (s) | CPU Time (s)

SMT-COMP 2014, rating 0.75 (1/4)
  CVC3    | CVC3 default | unknown ❌ | 0.09705 | 0.08899
  CVC4    | CVC4 f7118b2 default | unknown ❌ | 0.01946 | 0.00900
  raSAT   | raSAT-main-track-final default.sh | unknown ❌ | 0.01697 | 0.00400
  Z3      | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | sat ✅ | 0.00897 | 0.00800

SMT-COMP 2015, rating 0.50 (3/6)
  CVC3    | CVC3 default | unknown ❌ | 0.09913 | 0.09898
  CVC4    | CVC4-master-2015-06-15-9b32405-main default | unknown ❌ | 0.01088 | 0.00900
  CVC4    | CVC4-experimental-2015-06-15-ff5745a-main default | unknown ❌ | 0.01105 | 0.00900
  raSAT   | raSAT default.sh | sat ✅ | 0.51225 | 0.49992
  SMT-RAT | SMT-RAT-final default | unknown ❌ | 2400.01000 | 2401.13000
  Yices2  | Yices2-NL default | sat ✅ | 0.00832 | 0.00300
  Z3      | z3 4.4.0 default | sat ✅ | 0.03204 | 0.03199

SMT-COMP 2016, rating 0.40 (3/5)
  CVC4    | CVC4-master-2016-05-27-cfef263-main default | unknown ❌ | 0.01276 | 0.01202
  raSAT   | raSAT 0.3 default.sh | sat ✅ | 0.01596 | 0.01597
  raSAT   | raSAT 0.4 exp - final default.py | sat ✅ | 0.06402 | 0.08656
  SMT-RAT | SMT-RAT default | unknown ❌ | 2400.06000 | 2401.61000
  Yices2  | Yices-2.4.2 default | sat ✅ | 0.01230 | 0.00442
  Z3      | z3-4.4.1 default | sat ✅ | 0.03090 | 0.03218

SMT-COMP 2017
  CVC4    | CVC4-smtcomp2017-main default | sat ✅ | 0.01866 | 0.01817
  SMT-RAT | SMTRAT-comp2017_2 default | sat ✅ | 0.03465 | 0.03385
  veriT   | veriT+raSAT+Redlog default | sat ✅ | 0.17793 | 0.18083
  Yices2  | Yices2-Main default | sat ✅ | 0.00887 | 0.00456
  Z3      | z3-4.5.0 default | sat ✅ | 0.03322 | 0.03316

SMT-COMP 2018
  CVC4    | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 0.02850 | 0.02862
  SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.02000 | 1200.03000
  SMT-RAT | SMTRAT-MCSAT-final_default | sat ✅ | 0.05088 | 0.05079
  veriT   | veriT+raSAT+Reduce_default | sat ✅ | 0.01545 | 0.01543
  Yices2  | Yices 2.6.0_default | sat ✅ | 0.02917 | 0.00779
  Z3      | z3-4.7.1_default | sat ✅ | 0.03653 | 0.03650

SMT-COMP 2022, rating 0.11 (8/9)
  cvc5    | cvc5-default-2022-07-02-b15e116-wrapped_sq | sat ✅ | 0.04967 | 0.04887
  MathSAT | MathSAT-5.6.8_default | unknown ❌ | 1200.07000 | 1199.88000
  NRA-LS  | NRA-LS-FINAL_default | sat ✅ | 0.03663 | 0.03668
  Par4    | Par4-wrapped-sq_default | sat ✅ | 0.01551 | 0.00622
  SMT-RAT | SMT-RAT-MCSAT_default | sat ✅ | 0.01545 | 0.01540
  veriT   | veriT+raSAT+Redlog_default | sat ✅ | 0.02122 | 0.02123
  Yices2  | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 0.01490 | 0.01213
  Z3      | z3-4.8.17_default | sat ✅ | 0.02013 | 0.02231
  Z3++    | z3++0715_default | sat ✅ | 0.28093 | 0.28098

SMT-COMP 2023
  cvc5    | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 0.02652 | 0.02699
  NRA-LS  | cvc5-NRA-LS-sq_default | sat ✅ | 0.03935 | 0.03940
  Par4    | Par4-wrapped-sq_default | sat ✅ | 0.01666 | 0.00636
  SMT-RAT | SMT-RAT-MCSAT_default | sat ✅ | 0.02038 | 0.02037
  Yices2  | Yices 2 for SMTCOMP 2023_default | sat ✅ | 0.01113 | 0.01107
  Z3alpha | z3alpha_default | sat ✅ | 0.02427 | 0.02447
  Z3++    | z3++0715_default | sat ✅ | 0.28472 | 0.28481
  Z3++    | Z3++_sq_0526_default | sat ✅ | 0.29469 | 0.29461

SMT-COMP 2024, rating 0.20 (4/5)
  cvc5        | cvc5 | sat ✅ | 0.22457 | 0.12493
  SMTInterpol | SMTInterpol | unknown ❌ | 0.41712 | 0.44165
  SMT-RAT     | SMT-RAT | sat ✅ | 0.21736 | 0.11771
  Yices2      | Yices2 | sat ✅ | 0.21754 | 0.11772
  Z3alpha     | Z3-alpha | sat ✅ | 0.26840 | 0.16892