# Benchmark non-incremental/BV/wintersteiger/fmsd13/ranking/filesys_fastfat_write.c.smt2

Software ranking function synthesis problems. These benchmarks stem from an evaluation described in Wintersteiger, Hamadi, de Moura: Efficiently Solving Quantified Bit-Vector Formulas, FMSD 42(1), 2013. The software models used are from an earlier evaluation of termination proving tools described in Cook, Kroening, Rümmer, Wintersteiger: Ranking Function Synthesis for Bit-Vector Relations, TACAS 2010.
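The benchmark itself is a quantified bit-vector query. As a rough illustration only — this is *not* the contents of `filesys_fastfat_write.c.smt2`, and the bit-widths, loop relation, and variable names are invented — a ranking-function synthesis query of this family has the following general shape, using the operators listed under Symbols below:

```smt2
; Hypothetical sketch, NOT the actual benchmark: does a linear ranking
; function r(x) = c * x exist for the loop "while (x != 0) x = x - 1"?
(set-logic BV)
(assert
  ; exists a coefficient c of the candidate ranking function ...
  (exists ((c (_ BitVec 8)))
    ; ... such that for every loop transition from state x to state y ...
    (forall ((x (_ BitVec 8)) (y (_ BitVec 8)))
      (=> (and (not (= x #x00)) (= y (bvsub x #x01)))
          ; ... r strictly decreases; the comparison is done at doubled
          ; width (sign_extend / zero_extend) so it cannot wrap around.
          (bvslt (bvmul (sign_extend 8 c) (zero_extend 8 y))
                 (bvmul (sign_extend 8 c) (zero_extend 8 x)))))))
(check-sat)
```

The exists/forall alternation over bit-vector terms is what makes these benchmarks hard for solvers that only decide quantifier-free bit-vector logic.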
| Benchmark | |
|---|---|
| Size | 1307 |
| Compressed Size | 549 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | industrial |
| First Occurrence | 2015-07-02 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| strict Dolmen OK | 1 |
| check-sat calls | 1 |
| Status | sat |
| Inferred Status | sat |
| Size | 1299 |
| Compressed Size | 542 |
| Max. Term Depth | 9 |
| Asserts | 1 |
| Declared Functions | 0 |
| Declared Constants | 0 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
## Symbols

| Symbol | Count |
|---|---|
| not | 2 |
| and | 1 |
| => | 1 |
| = | 3 |
| forall | 3 |
| exists | 1 |
| BitVec | 4 |
| bvmul | 2 |
| bvsub | 1 |
| bvule | 1 |
| bvslt | 1 |
| zero_extend | 2 |
| sign_extend | 2 |
## Evaluations
### SMT-COMP 2017

Rating: 0.25 (3/4)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| Boolector | Boolector SMT17 final boolector | sat ✅ | 569.77200 | 573.65700 |
| CVC4 | CVC4-smtcomp2017-main default | unknown ❌ | 600.01800 | 582.46000 |
| | CVC4-smtcomp2017-main default | unknown ❌ | 600.01400 | 577.77000 |
| Q3B | Q3B default | sat ✅ | 0.16597 | 0.19515 |
| Z3 | z3-4.5.0 default | sat ✅ | 0.04530 | 0.04520 |
| | z3-4.5.0 default | sat ✅ | 248.31400 | 248.29100 |
### SMT-COMP 2018

Rating: 0.25 (3/4)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| Boolector | Boolector_default | sat ✅ | 213.63600 | 427.00500 |
| CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 0.03650 | 0.03680 |
| | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 0.01995 | 0.01996 |
| Q3B | Q3B_default | sat ✅ | 0.03080 | 0.05683 |
| Z3 | z3-4.7.1_default | unknown ❌ | 1200.05000 | 1199.81000 |
| | z3-4.7.1_default | unknown ❌ | 1200.03000 | 1199.90000 |
### SMT-COMP 2020

Rating: 0.40 (3/5)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| Bitwuzla | Bitwuzla-fixed_default | sat ✅ | 0.17265 | 0.30855 |
| CVC4 | CVC4-sq-final_default | sat ✅ | 0.02380 | 0.02410 |
| Par4 | Par4-wrapped-sq_default | sat ✅ | 0.03286 | 0.00759 |
| UltimateEliminator | UltimateEliminator+MathSAT-5.6.3_s_default | unknown ❌ | 2.38192 | 3.39084 |
| Z3 | z3-4.8.8_default | unknown ❌ | 1200.02000 | 1199.79000 |
### SMT-COMP 2022

Rating: 0.38 (5/8)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| Bitwuzla | Bitwuzla-wrapped_default | unknown ❌ | 1200.07000 | 1199.30000 |
| cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | sat ✅ | 0.02361 | 0.02414 |
| Par4 | Par4-wrapped-sq_default | sat ✅ | 0.03248 | 0.00623 |
| Q3B | Q3B_default | sat ✅ | 0.05354 | 0.07673 |
| Q3B-pBDD | Q3B-pBDD SMT-COMP 2022 final_default | unknown ❌ | 1200.10000 | 1199.80000 |
| UltimateEliminator | UltimateEliminator+MathSAT-5.6.7-wrapped_default | unknown ❌ | 3.19434 | 11.05620 |
| YicesQS | yicesQS-2022-07-02-optim-under10_default | sat ✅ | 0.20984 | 0.20987 |
| Z3 | z3-4.8.17_default | sat ✅ | 586.63300 | 586.57700 |
### SMT-COMP 2025

Rating: 0.40 (3/5)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| Bitwuzla | Bitwuzla | sat ✅ | 0.34709 | 0.22531 |
| cvc5 | cvc5 | sat ✅ | 0.31605 | 0.18690 |
| SMTInterpol | SMTInterpol | unknown ❌ | 0.53345 | 0.59549 |
| UltimateEliminator | UltimateEliminator+MathSAT | unknown ❌ | 2.07684 | 4.36401 |
| YicesQS | YicesQS | sat ✅ | 0.35219 | 0.22549 |