Benchmark
non-incremental/QF_BV/Sage2/bench_7652.smt2
Patrice Godefroid, SAGE (systematic dynamic test generation). For more information: http://research.microsoft.com/en-us/um/people/pg/public_psfiles/ndss2008.pdf
| Property | Value |
| --- | --- |
| Size | 343739 |
| Compressed Size | 16846 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | industrial |
| First Occurrence | 2015-07-02 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| strict Dolmen OK | 1 |
| check-sat calls | 1 |
| Status | sat |
| Inferred Status | sat |
| Size | 343731 |
| Compressed Size | 16858 |
| Max. Term Depth | 86 |
| Asserts | 640 |
| Declared Functions | 0 |
| Declared Constants | 220 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
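As a quick sanity check on the figures above, the compression ratio implied by the reported sizes can be computed in a short Python sketch; the numbers are taken directly from the table, nothing else is assumed:

```python
# Sizes reported for the benchmark (bytes), copied from the table above.
raw_size = 343739
compressed_size = 16846

# The SMT-LIB text compresses by roughly a factor of 20, plausibly
# because the file is dominated by thousands of repetitive `let` bindings.
ratio = raw_size / compressed_size
print(round(ratio, 1))  # 20.4
```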
Symbols

| Symbol | Count |
| --- | --- |
| true | 1 |
| ite | 12 |
| not | 73 |
| = | 171 |
| let | 6938 |
| extract | 6 |
| bvand | 21 |
| bvor | 105 |
| bvadd | 3209 |
| bvmul | 1371 |
| bvult | 125 |
| bvule | 181 |
| bvugt | 59 |
| bvuge | 115 |
| bvshl | 105 |
| bvlshr | 38 |
| zero_extend | 1986 |
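Counts like these can, in principle, be reproduced by tokenizing the SMT-LIB file and tallying operator occurrences. A minimal sketch, where the input is a toy QF_BV script standing in for bench_7652.smt2 (illustrative only, not the benchmark itself):

```python
import re
from collections import Counter

# Toy QF_BV script (hypothetical stand-in for the real benchmark file).
smt = """
(set-logic QF_BV)
(declare-const x (_ BitVec 8))
(assert (let ((y (bvadd x #x01)))
  (bvult (bvmul y y) ((_ zero_extend 0) #xF0))))
(check-sat)
"""

# Split on parentheses and whitespace, then count each token.
tokens = re.findall(r"[^\s()]+", smt)
counts = Counter(tokens)
for sym in ("let", "bvadd", "bvmul", "bvult", "zero_extend"):
    print(sym, counts[sym])
```

Running the same tally over the actual benchmark would yield the table above (e.g. 6938 `let`s and 3209 `bvadd`s).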
Evaluations

| Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
| --- | --- | --- | --- | --- | --- | --- |
| SMT-COMP 2017 | 0.12 (7/8) | Boolector | Boolector+CaDiCaL SMT17 final boolector | sat ✅ | 0.94925 | 0.94873 |
| | | Boolector | Boolector SMT17 final boolector | sat ✅ | 2.31538 | 2.31479 |
| | | CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 2.38052 | 4.61252 |
| | | MathSAT | mathsat-5.4.1-linux-x86_64-Main default | sat ✅ | 30.16380 | 30.16120 |
| | | MinkeyRink | MinkeyRink 2017.3a default | sat ✅ | 0.53557 | 0.52490 |
| | | Q3B | Q3B default | unknown ❌ | 600.05600 | 1799.49000 |
| | | STP | stp_st default | sat ✅ | 3.75348 | 3.75293 |
| | | STP | stp_mt default | sat ✅ | 2.97224 | 9.40891 |
| | | Yices2 | Yices2-Main default | sat ✅ | 0.08972 | 0.08925 |
| | | Z3 | z3-4.5.0 default | sat ✅ | 3.75614 | 3.75525 |
| SMT-COMP 2018 | — | Boolector | Boolector_default | sat ✅ | 0.28916 | 0.28900 |
| | | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 1.98067 | 1.98071 |
| | | MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | sat ✅ | 5.39514 | 5.39652 |
| | | MinkeyRink | Minkeyrink MT_mt | sat ✅ | 0.36426 | 0.75820 |
| | | MinkeyRink | Minkeyrink ST_st | sat ✅ | 0.37168 | 0.37176 |
| | | STP | STP-CMS-st-2018_default-no-stderr | sat ✅ | 1.37760 | 1.37634 |
| | | STP | STP-CMS-mt-2018_multicore-no-stderr | sat ✅ | 0.60564 | 1.07095 |
| | | STP | STP-Riss-st-2018_riss-no-stderr | sat ✅ | 2.31255 | 2.31236 |
| | | Yices2 | Yices 2.6.0_default | sat ✅ | 0.09152 | 0.09147 |
| | | Z3 | z3-4.7.1_default | sat ✅ | 2.87277 | 2.87207 |
| SMT-COMP 2023 | 0.17 (5/6) | Bitwuzla | Bitwuzla-fixed_default | sat ✅ | 1.59835 | 1.59802 |
| | | cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 2.89470 | 2.89512 |
| | | STP | STP 2022.4_default | sat ✅ | 0.31373 | 0.31372 |
| | | STP | STP 2022.4_default | sat ✅ | 0.31250 | 0.31250 |
| | | UltimateEliminator | UltimateIntBlastingWrapper+SMTInterpol_default | unknown ❌ | 1200.03000 | 1241.85000 |
| | | Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 0.07034 | 0.07029 |
| | | Z3-Owl | z3-Owl-Final_default | sat ✅ | 54.06390 | 54.05990 |
| | | Z3-Owl | z3-Owl-Final_default | sat ✅ | 57.90390 | 57.90280 |
| SMT-COMP 2025 | 0.11 (8/9) | Bitwuzla | Bitwuzla | sat ✅ | 0.52204 | 0.40298 |
| | | Bitwuzla | Bitwuzla-MachBV-base | sat ✅ | 0.47117 | 0.34992 |
| | | Bitwuzla-MachBV | Bitwuzla-MachBV | sat ✅ | 0.53187 | 0.40953 |
| | | BVDecide | bv_decide | sat ✅ | 6.35958 | 6.19255 |
| | | BVDecide | bv_decide-nokernel | sat ✅ | 6.11324 | 5.93990 |
| | | cvc5 | cvc5 | sat ✅ | 2.10411 | 1.97328 |
| | | SMTInterpol | SMTInterpol | unknown ❌ | 1201.49029 | 1224.97316 |
| | | Yices2 | Yices2 | sat ✅ | 0.43260 | 0.31406 |
| | | Z3alpha | Z3-alpha | sat ✅ | 2.50725 | 8.24211 |
| | | Z3 | Z3-alpha-base | sat ✅ | 14.03238 | 13.90888 |
| | | Z3 | Z3-Owl-base | sat ✅ | 56.46243 | 56.32973 |
| | | Z3 | z3siri-base | sat ✅ | 14.09315 | 13.95688 |
| | | Z3-Owl | Z3-Owl | sat ✅ | 13.98050 | 13.85258 |