## Benchmark

`non-incremental/BV/wintersteiger/fmsd13/ranking/audio_sysfx_swap.cpp.smt2`
Software ranking-function synthesis problems. These benchmarks stem from the evaluation described in Wintersteiger, Hamadi, and de Moura: "Efficiently Solving Quantified Bit-Vector Formulas", FMSD 42(1), 2013. The underlying software models come from an earlier evaluation of termination-proving tools described in Cook, Kroening, Ruemmer, and Wintersteiger: "Ranking Function Synthesis for Bit-Vector Relations", TACAS 2010.
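To give a feel for what such a benchmark encodes, here is a toy sketch of an exists-forall ranking-function query: does there exist a linear ranking function (over machine arithmetic, compared with signed order, as `bvslt` does) that strictly decreases on every step of a loop? The loop, bit-width, and template below are invented for illustration — the actual formula in this benchmark is different and larger — and both quantifiers are brute-forced rather than handed to an SMT solver.

```python
# Toy exists-forall ranking-function query (illustrative only):
#   loop:      while x != 0: x = x - 1      over W-bit machine arithmetic
#   template:  f(x) = c*x + d  (mod 2^W), compared in signed order
# exists (c, d) . forall x . x != 0  =>  f(x - 1) <_signed f(x)

W = 4                 # toy bit-width; real benchmarks use wider vectors
MOD = 1 << W

def signed(v):
    """Interpret a W-bit value as two's complement (what bvslt compares)."""
    return v - MOD if v >= MOD // 2 else v

def is_ranking(c, d):
    """Inner forall: f must strictly decrease on every loop step."""
    f = lambda x: signed((c * x + d) % MOD)
    return all(f((x - 1) % MOD) < f(x) for x in range(1, MOD))

# Outer exists: brute-force all coefficient pairs.
candidates = [(c, d) for c in range(MOD) for d in range(MOD)
              if is_ranking(c, d)]
print(candidates)  # [(1, 8)] -- f(x) = x + 8 ranks the countdown loop
```

Note that the naive candidate `f(x) = x` fails here: the unsigned counter passes through values that are negative under the signed interpretation, so the offset `d = 8` is needed. Wrap-around effects like this are exactly what makes ranking-function synthesis over bit-vectors harder than over the integers.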
| Property | Value |
| --- | --- |
| Size | 1608 |
| Compressed Size | 572 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | industrial |
| First Occurrence | 2015-07-02 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| Strict Dolmen OK | 1 |
| check-sat calls | 1 |
| Status | sat |
| Inferred Status | sat |
| Size | 1600 |
| Compressed Size | 568 |
| Max. Term Depth | 10 |
| Asserts | 1 |
| Declared Functions | 0 |
| Declared Constants | 0 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
## Symbols

| Symbol | Count |
| --- | --- |
| not | 1 |
| and | 1 |
| => | 1 |
| = | 4 |
| forall | 4 |
| exists | 1 |
| BitVec | 5 |
| bvmul | 2 |
| bvsub | 1 |
| bvslt | 1 |
| zero_extend | 2 |
| sign_extend | 2 |
## Evaluations

| Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
| --- | --- | --- | --- | --- | --- | --- |
| SMT-COMP 2017 | 0.50 (2/4) | Boolector | Boolector SMT17 final boolector | unknown ❌ | 600.02000 | 1199.95000 |
| | | CVC4 | CVC4-smtcomp2017-main default | unknown ❌ | 600.01700 | 585.08000 |
| | | | CVC4-smtcomp2017-main default | unknown ❌ | 600.01500 | 577.28000 |
| | | Q3B | Q3B default | sat ✅ | 0.16195 | 0.21689 |
| | | Z3 | z3-4.5.0 default | sat ✅ | 0.04172 | 0.04167 |
| | | | z3-4.5.0 default | sat ✅ | 0.04262 | 0.04196 |
| SMT-COMP 2018 | 0.25 (3/4) | Boolector | Boolector_default | unknown ❌ | 1200.02000 | 2399.79000 |
| | | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 0.04088 | 0.04122 |
| | | | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 0.02059 | 0.02076 |
| | | Q3B | Q3B_default | sat ✅ | 0.03001 | 0.05742 |
| | | Z3 | z3-4.7.1_default | sat ✅ | 0.04339 | 0.04334 |
| | | | z3-4.7.1_default | sat ✅ | 0.04422 | 0.04417 |
| SMT-COMP 2020 | 0.40 (3/5) | Bitwuzla | Bitwuzla-fixed_default | unknown ❌ | 1200.05000 | 2400.02000 |
| | | CVC4 | CVC4-sq-final_default | sat ✅ | 0.02415 | 0.02444 |
| | | Par4 | Par4-wrapped-sq_default | sat ✅ | 0.02983 | 0.00628 |
| | | UltimateEliminator | UltimateEliminator+MathSAT-5.6.3_s_default | unknown ❌ | 2.40763 | 3.38036 |
| | | Z3 | z3-4.8.8_default | sat ✅ | 0.04300 | 0.04296 |
| SMT-COMP 2021 | 0.50 (2/4) | Par4 | Par4-wrapped-sq_default | sat ✅ | 0.02763 | 0.00668 |
| | | UltimateEliminator | UltimateEliminator+MathSAT-5.6.6_default | unknown ❌ | 3.33523 | 4.91682 |
| | | YicesQS | yices-QS-2021-06-13under10_default | unknown ❌ | 1200.01000 | 1199.93000 |
| | | Z3 | z3-4.8.11_default | sat ✅ | 0.02115 | 0.02113 |
| SMT-COMP 2023 | 0.17 (5/6) | Bitwuzla | Bitwuzla-fixed_default | sat ✅ | 6.46815 | 6.46795 |
| | | cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 0.02350 | 0.02409 |
| | | Par4 | Par4-wrapped-sq_default | sat ✅ | 0.02337 | 0.00654 |
| | | Q3B | Q3B_default | sat ✅ | 0.05148 | 0.07220 |
| | | UltimateEliminator | UltimateEliminator+MathSAT-5.6.9_default | unknown ❌ | 3.00121 | 5.13556 |
| | | | UltimateIntBlastingWrapper+SMTInterpol_default | unknown ❌ | 4.47223 | 12.34880 |
| | | YicesQS | yicesQS-2022-07-02-optim-under10_default | sat ✅ | 0.08966 | 0.09025 |
| SMT-COMP 2024 | 0.40 (3/5) | Bitwuzla | Bitwuzla | sat ✅ | 2.44645 | 2.32863 |
| | | cvc5 | cvc5 | sat ✅ | 0.22546 | 0.12462 |
| | | SMTInterpol | SMTInterpol | unknown ❌ | 0.48356 | 0.61173 |
| | | YicesQS | YicesQS | unknown ❌ | 1201.72046 | 1201.09946 |
| | | Z3alpha | Z3-alpha | sat ✅ | 6.34742 | 6.24355 |