Benchmark

non-incremental/QF_BV/20190311-bv-term-small-rw-Noetzli/bv-term-small-rw_318.smt2

Publications: "Syntax-Guided Rewrite Rule Enumeration for SMT Solvers" by A. Noetzli, A. Reynolds, H. Barbosa, A. Niemetz, M. Preiner, C. Barrett, and C. Tinelli, SAT 2019.
Benchmark
Size 734
Compressed Size 432
License Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category industrial
First Occurrence 2019-07-07
Generated By Andres Noetzli, Andrew Reynolds, Haniel Barbosa, Aina Niemetz, Mathias Preiner, Clark Barrett, Cesare Tinelli
Generated On 2018-11-08 00:00:00
Generator CVC4
Dolmen OK 1
strict Dolmen OK 1
check-sat calls 1
Query 1
Status sat
Inferred Status sat
Size 726
Compressed Size 430
Max. Term Depth 5
Asserts 1
Declared Functions 0
Declared Constants 2
Declared Sorts 0
Defined Functions 0
Defined Recursive Functions 0
Defined Sorts 0
Constants 0
Declared Datatypes 0

Symbols

not 1
= 1
bvand 1
bvshl 2
bvlshr 2
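
For context, a QF_BV benchmark with this symbol profile — a single assertion over two declared bit-vector constants, combining bvand, bvshl, bvlshr, equality, and negation — could take the following shape. This is an illustrative sketch only, not the contents of bv-term-small-rw_318.smt2; the constant names and bit-width are invented:

```smt2
(set-logic QF_BV)
(set-info :status sat)
; two declared constants, as in the query statistics above (names hypothetical)
(declare-const x (_ BitVec 32))
(declare-const y (_ BitVec 32))
; one assertion mirroring the symbol counts above
; (1x not, 1x =, 1x bvand, 2x bvshl, 2x bvlshr)
(assert (not (= (bvand (bvshl x y) (bvlshr x y))
                (bvshl (bvlshr x y) y))))
; one check-sat call, matching the benchmark metadata
(check-sat)
```

Benchmarks in this family were produced by CVC4's syntax-guided rewrite rule enumeration, which emits small candidate terms of this kind to test whether rewrite rules hold.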

Evaluations

Evaluation     Rating      Solver              Variant                                         Result      Wallclock   CPU Time
SMT-COMP 2023  0.17 (5/6)  Bitwuzla            Bitwuzla-fixed_default                          sat ✅      0.01255     0.01249
SMT-COMP 2023  0.17 (5/6)  cvc5                cvc5-default-2023-05-16-ea045f305_sq            sat ✅      0.02957     0.03012
SMT-COMP 2023  0.17 (5/6)  STP                 STP 2022.4_default                              sat ✅      0.01737     0.01741
SMT-COMP 2023  0.17 (5/6)  STP                 STP 2022.4_default                              sat ✅      0.01694     0.01695
SMT-COMP 2023  0.17 (5/6)  UltimateEliminator  UltimateIntBlastingWrapper+SMTInterpol_default  unknown ❌  1200.03000  1243.87000
SMT-COMP 2023  0.17 (5/6)  Yices2              Yices 2 for SMTCOMP 2023_default                sat ✅      0.01237     0.00460
SMT-COMP 2023  0.17 (5/6)  Z3-Owl              z3-Owl-Final_default                            sat ✅      0.85676     0.85683
SMT-COMP 2023  0.17 (5/6)  Z3-Owl              z3-Owl-Final_default                            sat ✅      0.86251     0.86275
SMT-COMP 2025              Bitwuzla            Bitwuzla                                        sat ✅      0.24261     0.12724
SMT-COMP 2025              Bitwuzla            Bitwuzla-MachBV-base                            sat ✅      0.24859     0.12931
SMT-COMP 2025              Bitwuzla-MachBV     Bitwuzla-MachBV                                 sat ✅      0.38890     0.25925
SMT-COMP 2025              BVDecide            bv_decide                                       sat ✅      0.60175     0.43433
SMT-COMP 2025              BVDecide            bv_decide-nokernel                              sat ✅      0.57159     0.40358
SMT-COMP 2025              cvc5                cvc5                                            sat ✅      0.29346     0.17247
SMT-COMP 2025              SMTInterpol         SMTInterpol                                     sat ✅      436.03015   467.67461
SMT-COMP 2025              Yices2              Yices2                                          sat ✅      0.28144     0.16252
SMT-COMP 2025              Z3alpha             Z3-alpha                                        sat ✅      0.40188     0.28686
SMT-COMP 2025              Z3                  Z3-alpha-base                                   sat ✅      0.29340     0.17303
SMT-COMP 2025              Z3                  Z3-Owl-base                                     sat ✅      0.32543     0.19743
SMT-COMP 2025              Z3                  z3siri-base                                     sat ✅      0.27516     0.15558
SMT-COMP 2025              Z3-Owl              Z3-Owl                                          sat ✅      0.87343     0.75263