Benchmark

non-incremental/QF_BV/20190311-bv-term-small-rw-Noetzli/bv-term-small-rw_536.smt2

Publications: "Syntax-Guided Rewrite Rule Enumeration for SMT Solvers" by A. Noetzli, A. Reynolds, H. Barbosa, A. Niemetz, M. Preiner, C. Barrett, and C. Tinelli, SAT 2019.
Benchmark
Size 780
Compressed Size 433
License Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category industrial
First Occurrence 2019-07-07
Generated By Andres Noetzli, Andrew Reynolds, Haniel Barbosa, Aina Niemetz, Mathias Preiner, Clark Barrett, Cesare Tinelli
Generated On 2018-11-08 00:00:00
Generator CVC4
Dolmen OK 1
strict Dolmen OK 1
check-sat calls 1
check-sat calls1
Query 1
Status unsat
Inferred Status unsat
Size 772
Compressed Size 429
Max. Term Depth 5
Asserts 1
Declared Functions 0
Declared Constants 2
Declared Sorts 0
Defined Functions 0
Defined Recursive Functions 0
Defined Sorts 0
Constants 0
Declared Datatypes 0

Symbols

not 1
= 1
bvmul 3
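The benchmark file itself is not reproduced in this listing. As a purely illustrative sketch of what a QF_BV instance of this shape looks like (one assert, one check-sat call, two declared bit-vector constants, and a negated equality over nested bvmul terms), consider the hypothetical script below; the constant names, the 32-bit width, and the exact term are assumptions, and its operator counts do not exactly match the symbol table above:

```smt2
(set-info :status unsat)
(set-logic QF_BV)
; two declared constants; the bit-width is an assumption, not taken from the listing
(declare-const x (_ BitVec 32))
(declare-const y (_ BitVec 32))
; a single assert: the negation of an equality between a term and a
; rearranged form, unsatisfiable because bvmul is associative and commutative
(assert (not (= (bvmul x (bvmul y y))
                (bvmul y (bvmul y x)))))
(check-sat)
```

Any of the solvers in the evaluations below should answer unsat on an input of this kind.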

Evaluations

Evaluation | Solver | Variant | Result | Rating | Wallclock (s) | CPU Time (s)
SMT-COMP 2023 | Bitwuzla | Bitwuzla-fixed_default | unsat | ✅ | 0.01129 | 0.00513
SMT-COMP 2023 | cvc5 | cvc5-default-2023-05-16-ea045f305_sq | unsat | ✅ | 0.01662 | 0.01720
SMT-COMP 2023 | STP | STP 2022.4_default | unsat | ✅ | 0.01214 | 0.01108
SMT-COMP 2023 | STP | STP 2022.4_default | unsat | ✅ | 0.01483 | 0.01043
SMT-COMP 2023 | UltimateEliminator | UltimateIntBlastingWrapper+SMTInterpol_default | unsat | ✅ | 4.44206 | 11.34950
SMT-COMP 2023 | Yices2 | Yices 2 for SMTCOMP 2023_default | unsat | ✅ | 0.01060 | 0.00332
SMT-COMP 2023 | Z3-Owl | z3-Owl-Final_default | unsat | ✅ | 0.68716 | 0.68724
SMT-COMP 2023 | Z3-Owl | z3-Owl-Final_default | unsat | ✅ | 0.71579 | 0.71598
SMT-COMP 2025 | Bitwuzla | Bitwuzla | unsat | ✅ | 0.28747 | 0.15870
SMT-COMP 2025 | Bitwuzla | Bitwuzla-MachBV-base | unsat | ✅ | 0.28805 | 0.15999
SMT-COMP 2025 | Bitwuzla-MachBV | Bitwuzla-MachBV | unsat | ✅ | 0.33449 | 0.21360
SMT-COMP 2025 | BVDecide | bv_decide | unsat | ✅ | 0.53760 | 0.41752
SMT-COMP 2025 | BVDecide | bv_decide-nokernel | unsat | ✅ | 0.52299 | 0.40216
SMT-COMP 2025 | cvc5 | cvc5 | unsat | ✅ | 0.28582 | 0.16162
SMT-COMP 2025 | SMTInterpol | SMTInterpol | unsat | ✅ | 0.45320 | 0.40507
SMT-COMP 2025 | Yices2 | Yices2 | unsat | ✅ | 0.24676 | 0.12620
SMT-COMP 2025 | Z3alpha | Z3-alpha | unsat | ✅ | 0.39676 | 0.26913
SMT-COMP 2025 | Z3 | Z3-alpha-base | unsat | ✅ | 0.28110 | 0.15923
SMT-COMP 2025 | Z3 | Z3-Owl-base | unsat | ✅ | 0.29560 | 0.17498
SMT-COMP 2025 | Z3 | z3siri-base | unsat | ✅ | 0.30415 | 0.17029
SMT-COMP 2025 | Z3-Owl | Z3-Owl | unsat | ✅ | 0.81490 | 0.67643