Benchmark

non-incremental/QF_BV/20190311-bv-term-small-rw-Noetzli/bv-term-small-rw_955.smt2

Publications: "Syntax-Guided Rewrite Rule Enumeration for SMT Solvers" by A. Noetzli, A. Reynolds, H. Barbosa, A. Niemetz, M. Preiner, C. Barrett, and C. Tinelli, SAT 2019.
Benchmark
Size: 749
Compressed Size: 439
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2019-07-07
Generated By: Andres Noetzli, Andrew Reynolds, Haniel Barbosa, Aina Niemetz, Mathias Preiner, Clark Barrett, Cesare Tinelli
Generated On: 2018-11-08 00:00:00
Generator: CVC4
Dolmen OK: 1
Strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 741
Compressed Size: 433
Max. Term Depth: 5
Asserts: 1
Declared Functions: 0
Declared Constants: 2
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

not (×1), = (×1), bvmul (×1), bvlshr (×2)
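For context, a QF_BV benchmark with this shape (a single assertion over two declared bit-vector constants, built from not, =, bvmul, and bvlshr, with expected status unsat) looks roughly like the following. This is an illustrative sketch only, not the actual contents of bv-term-small-rw_955.smt2:

```smt2
; Illustrative only -- NOT the real bv-term-small-rw_955.smt2.
; Shape matches the metadata above: QF_BV logic, two declared
; constants, a single assert, expected status unsat.
(set-logic QF_BV)
(set-info :status unsat)
(declare-const x (_ BitVec 32))
(declare-const y (_ BitVec 32))
; Refuting a candidate rewrite identity: if the identity is valid,
; its negation is unsatisfiable.
(assert (not (= (bvlshr (bvmul x y) #x00000001)
                (bvlshr (bvmul y x) #x00000001))))
(check-sat)
(exit)
```

A solver reports unsat on this sketch because bvmul is commutative, so the two shifted terms are always equal; the rewrite-rule-enumeration benchmarks from the cited paper encode validity checks of this general form.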

Evaluations

Evaluation      Solver              Variant                                          Result    Wallclock (s)  CPU Time (s)
SMT-COMP 2023   Bitwuzla            Bitwuzla-fixed_default                           unsat ✅  0.01035        0.00501
SMT-COMP 2023   cvc5                cvc5-default-2023-05-16-ea045f305_sq             unsat ✅  0.01645        0.01700
SMT-COMP 2023   STP                 STP 2022.4_default                               unsat ✅  0.01068        0.00995
SMT-COMP 2023   STP                 STP 2022.4_default                               unsat ✅  0.01058        0.01063
SMT-COMP 2023   UltimateEliminator  UltimateIntBlastingWrapper+SMTInterpol_default   unsat ✅  4.46022        11.29770
SMT-COMP 2023   Yices2              Yices 2 for SMTCOMP 2023_default                 unsat ✅  0.01063        0.00341
SMT-COMP 2023   Z3-Owl              z3-Owl-Final_default                             unsat ✅  0.69039        0.69026
SMT-COMP 2023   Z3-Owl              z3-Owl-Final_default                             unsat ✅  0.68869        0.68899
SMT-COMP 2025   Bitwuzla            Bitwuzla                                         unsat ✅  0.28622        0.15807
SMT-COMP 2025   Bitwuzla            Bitwuzla-MachBV-base                             unsat ✅  0.26048        0.14252
SMT-COMP 2025   Bitwuzla-MachBV     Bitwuzla-MachBV                                  unsat ✅  0.32045        0.20495
SMT-COMP 2025   BVDecide            bv_decide                                        unsat ✅  0.60296        0.42721
SMT-COMP 2025   BVDecide            bv_decide-nokernel                               unsat ✅  0.57484        0.39937
SMT-COMP 2025   cvc5                cvc5                                             unsat ✅  0.29315        0.16515
SMT-COMP 2025   SMTInterpol         SMTInterpol                                      unsat ✅  0.45176        0.43712
SMT-COMP 2025   Yices2              Yices2                                           unsat ✅  0.28437        0.15797
SMT-COMP 2025   Z3alpha             Z3-alpha                                         unsat ✅  0.39365        0.27610
SMT-COMP 2025   Z3                  Z3-alpha-base                                    unsat ✅  0.30988        0.18279
SMT-COMP 2025   Z3                  Z3-Owl-base                                      unsat ✅  0.30679        0.17660
SMT-COMP 2025   Z3                  z3siri-base                                      unsat ✅  0.29724        0.16763
SMT-COMP 2025   Z3-Owl              Z3-Owl                                           unsat ✅  0.81516        0.69167