Benchmark
non-incremental/QF_AUFLIA/storecomm/storecomm_invalid_t1_pp_nf_ai_00050_005.cvc.smt2
Benchmarks used in the following paper:
Big proof engines as little proof engines: new results on rewrite-based satisfiability procedures
Alessandro Armando, Maria Paola Bonacina, Silvio Ranise, Stephan Schulz.
PDPAR'05
http://www.ai.dist.unige.it/pdpar05/
| Benchmark | |
| --- | --- |
| Size | 37292 |
| Compressed Size | 2622 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | crafted |
| First Occurrence | 2006-08-21 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| Strict Dolmen OK | 1 |
| check-sat calls | 1 |
| Status | sat |
| Inferred Status | sat |
| Size | 37284 |
| Compressed Size | 2636 |
| Max. Term Depth | 53 |
| Asserts | 1226 |
| Declared Functions | 1 |
| Declared Constants | 101 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
Symbols

| Symbol | Occurrences |
| --- | --- |
| not | 1226 |
| = | 1226 |
| let | 2 |
| select | 2 |
| store | 100 |
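The storecomm family from the paper cited above encodes the (in)validity of store-commutation facts over arrays; the "invalid" variants omit the distinctness constraints on indices, which is why this instance is satisfiable. A minimal sketch of that shape in SMT-LIB 2, assuming the family's usual structure (the constants `a`, `i1`, `i2` and the stored values are illustrative, not taken from this file):

```smt2
(set-logic QF_AUFLIA)
; An array from Int to Int and two index constants.
(declare-const a (Array Int Int))
(declare-const i1 Int)
(declare-const i2 Int)
; Negated store-commutation equality. With no (distinct i1 i2)
; assertion, a model with i1 = i2 makes the two nested stores
; differ, so the formula is sat -- matching this benchmark's status.
(assert (not (= (store (store a i1 0) i2 1)
                (store (store a i2 1) i1 0))))
(check-sat)
```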
Evaluations

| Evaluation | Rating | Solver | Variant | Result | Wallclock | CPU Time |
| --- | --- | --- | --- | --- | --- | --- |
| SMT Evaluation 2013 | 0.20 (4/5) | CVC3 | CVC3-SMT-COMP-2010 default | sat ✅ | 0.09456 | — |
| | | | CVC3-SMT-COMP-2011 default | sat ✅ | 0.09621 | — |
| | | | CVC3-SMT-COMP-2012 default | sat ✅ | 0.09406 | — |
| | | CVC4 | CVC4-SMT-COMP-2012-Resubmission default | sat ✅ | 0.09879 | — |
| | | | CVC4-SMT-EVAL-2013 default | sat ✅ | 0.10886 | — |
| | | MathSAT | MathSAT5-5.2.6-SMT-EVAL-2013 default | sat ✅ | 0.24355 | — |
| | | | MathSAT5-SMT-COMP-2011 default | sat ✅ | 0.23095 | — |
| | | | MathSAT5-SMT-COMP-2012 default | sat ✅ | 0.27543 | — |
| | | veriT | veriT-SMT-EVAL-2013 default | unknown ❌ | 0.02631 | — |
| | | Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-EVAL-2013 default | sat ✅ | 0.02623 | — |
| | | | Z3-SMT-COMP-2011 default | sat ✅ | 0.03460 | — |
| SMT-COMP 2015 | 0.17 (5/6) | CVC4 | CVC4-master-2015-06-15-9b32405-main default | sat ✅ | 0.69354 | 0.69189 |
| | | | CVC4-experimental-2015-06-15-ff5745a-main default | sat ✅ | 2.03483 | 2.03369 |
| | | MathSAT | MathSat 5.3.6 main smtcomp2015_main | sat ✅ | 0.36121 | 0.35994 |
| | | SMTInterpol | SMTInterpol v2.1-206-g86e9531 default | sat ✅ | 1.04681 | 2.03069 |
| | | veriT | veriT default | unknown ❌ | 0.01979 | 0.01900 |
| | | Yices2 | Yices default | sat ✅ | 0.10560 | 0.10498 |
| | | Z3 | z3 4.4.0 default | sat ✅ | 0.04878 | 0.04899 |
| SMT-COMP 2016 | 0.17 (5/6) | CVC4 | CVC4-master-2016-05-27-cfef263-main default | sat ✅ | 1.69964 | 1.70084 |
| | | MathSAT | mathsat-5.3.11-linux-x86_64-Main default | sat ✅ | 0.30568 | 0.30686 |
| | | SMTInterpol | smtinterpol-2.1-258-g92ab3df default | sat ✅ | 0.87325 | 2.46301 |
| | | veriT | veriT-dev default | unknown ❌ | 0.01616 | 0.01640 |
| | | Yices2 | Yices-2.4.2 default | sat ✅ | 0.21204 | 0.21208 |
| | | Z3 | z3-4.4.1 default | sat ✅ | 0.04749 | 0.04884 |
| SMT-COMP 2017 | 0.17 (5/6) | CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 1.73536 | 1.73399 |
| | | MathSAT | mathsat-5.4.1-linux-x86_64-Main default | sat ✅ | 0.31370 | 0.31353 |
| | | SMTInterpol | SMTInterpol default | sat ✅ | 0.68369 | 1.79428 |
| | | veriT | veriT-2017-06-17 default | unknown ❌ | 0.01602 | 0.01577 |
| | | Yices2 | Yices2-Main default | sat ✅ | 0.01550 | 0.01472 |
| | | Z3 | z3-4.5.0 default | sat ✅ | 0.05017 | 0.04906 |
| SMT-COMP 2018 | 0.33 (4/6) | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 3.38705 | 3.38718 |
| | | MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | unknown ❌ | 0.17308 | 0.17410 |
| | | SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | sat ✅ | 1.08073 | 2.82906 |
| | | veriT | veriT_default | unknown ❌ | 0.01586 | 0.01599 |
| | | Yices2 | Yices 2.6.0_default | sat ✅ | 0.01672 | 0.01664 |
| | | Z3 | z3-4.7.1_default | sat ✅ | 0.05267 | 0.05256 |
| SMT-COMP 2025 | | cvc5 | cvc5 | sat ✅ | 2.44067 | 2.31183 |
| | | OpenSMT | OpenSMT | sat ✅ | 1.35855 | 1.23838 |
| | | SMTInterpol | SMTInterpol | sat ✅ | 1.21170 | 2.97594 |
| | | Yices2 | Yices2 | sat ✅ | 0.30139 | 0.16622 |