Benchmark

non-incremental/BV/wintersteiger/fmsd13/fixpoint/small-equiv-fixpoint-6.smt2

Hardware fixpoint-check problems.
These benchmarks stem from the evaluation described in Wintersteiger, Hamadi, and de Moura: "Efficiently Solving Quantified Bit-Vector Formulas", FMSD 42(1), 2013.
The hardware models used are from the VCEGAR benchmark suite (see www.cprover.org/hardware/).
Benchmark
Size: 7792
Compressed Size: 860
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2015-07-02
Generated By:
Generated On:
Generator:
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: sat
Inferred Status: sat
Size: 7784
Compressed Size: 851
Max. Term Depth: 60
Asserts: 1
Declared Functions: 0
Declared Constants: 0
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol (occurrence count): or (1), and (22), => (1), = (50),
forall (28), exists (24), BitVec (52), bvmul (13),
bvudiv (13), bvurem (13), bvsub (13)
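
As an illustrative sketch only (not the actual benchmark), a quantified bit-vector query combining the operators listed above might look like this in SMT-LIB 2; the bit-width and variable names here are invented for illustration:

```smt2
; Hypothetical sketch of a quantified bit-vector query, NOT the benchmark itself.
; It alternates forall/exists over fixed-width bit-vectors, as the symbol
; counts above (28 foralls, 24 exists) suggest the real query does.
(set-logic BV)
(assert
  (forall ((x (_ BitVec 8)))
    (exists ((y (_ BitVec 8)))
      (=> (not (= x #x00))
          (= (bvsub x (bvmul y x))
             (bvurem x x))))))
(check-sat)
```

For any nonzero x, the witness y = #x01 makes bvmul y x equal to x, so both sides reduce to zero and a solver reports sat, mirroring the sat status recorded for this benchmark's single check-sat call.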

Evaluations

Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s)
SMT-COMP 2017 | 0.25 (3/4) | Boolector | Boolector SMT17 final boolector | sat ✅ | 0.17858 | 0.18217
 | | CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 0.01728 | 0.01660
 | | Q3B | Q3B default | sat ✅ | 0.05643 | 0.05659
 | | Z3 | z3-4.5.0 default | unknown ❌ | 600.06100 | 599.87000
SMT-COMP 2018 | 0.25 (3/4) | Boolector | Boolector_default | sat ✅ | 0.01285 | 0.01619
 | | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 0.01596 | 0.01617
 | | Q3B | Q3B_default | sat ✅ | 0.00869 | 0.00865
 | | Z3 | z3-4.7.1_default | unknown ❌ | 1200.07000 | 1199.95000
SMT-COMP 2024 | 0.40 (3/5) | Bitwuzla | Bitwuzla | sat ✅ | 0.32140 | 0.22157
 | | cvc5 | cvc5 | sat ✅ | 0.22597 | 0.12653
 | | SMTInterpol | SMTInterpol | unknown ❌ | 1201.72586 | 2919.19154
 | | YicesQS | YicesQS | sat ✅ | 59.58006 | 59.47947
 | | Z3alpha | Z3-alpha | unknown ❌ | 153.84925 | 155.62810
SMT-COMP 2025 | 0.20 (4/5) | Bitwuzla | Bitwuzla | sat ✅ | 0.41288 | 0.29303
 | | cvc5 | cvc5 | sat ✅ | 0.30024 | 0.17377
 | | SMTInterpol | SMTInterpol | unknown ❌ | 1201.79536 | 2093.48751
 | | UltimateEliminator | UltimateEliminator+MathSAT | sat ✅ | 2.10891 | 4.69215
 | | YicesQS | YicesQS | sat ✅ | 49.74296 | 49.60755