Benchmark
non-incremental/QF_AUFLIA/swap/swap_invalid_t1_pp_nf_ai_00004_006.cvc.smt2
Benchmarks used in the following paper:
Big proof engines as little proof engines: new results on rewrite-based satisfiability procedures.
Alessandro Armando, Maria Paola Bonacina, Silvio Ranise, Stephan Schulz.
PDPAR'05
http://www.ai.dist.unige.it/pdpar05/
| Property | Value |
| --- | --- |
| Size | 1230 |
| Compressed Size | 551 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | crafted |
| First Occurrence | 2007-07-03 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| Strict Dolmen OK | 1 |
| check-sat calls | 1 |
| Status | sat |
| Inferred Status | sat |
| Size | 1222 |
| Compressed Size | 554 |
| Max. Term Depth | 14 |
| Asserts | 1 |
| Declared Functions | 1 |
| Declared Constants | 5 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
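The Size and Compressed Size fields are presumably byte counts of the raw and compressed benchmark file; a minimal sketch of how such numbers could be reproduced, assuming gzip is the compressor (the gzip choice and the inline file contents are assumptions, not taken from this page):

```python
import gzip

# Hypothetical stand-in for the benchmark's bytes; the real file is the
# .smt2 benchmark listed at the top of this page.
data = b"(set-logic QF_AUFLIA)\n(assert true)\n(check-sat)\n"

raw_size = len(data)                        # corresponds to the "Size" field
compressed_size = len(gzip.compress(data))  # corresponds to "Compressed Size"

print(raw_size, compressed_size)
```

For a file this small the gzip header overhead dominates, so the compressed size can exceed the raw size; on the real 1230-byte benchmark the compressed form is smaller.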
Symbols

| Symbol | Count |
| --- | --- |
| not | 1 |
| = | 1 |
| let | 10 |
| select | 12 |
| store | 14 |
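Symbol counts like those above can be reproduced by tokenizing the SMT-LIB file and tallying occurrences; a rough sketch (the naive tokenizer and the inline formula are assumptions, not the tool that produced the table):

```python
import re
from collections import Counter

# Hypothetical fragment standing in for the real benchmark file.
smt2 = "(assert (not (= (select (store a i v) i) w)))"

# Split on parentheses and whitespace; good enough for counting symbols
# in a benchmark that contains no string literals or quoted identifiers.
tokens = re.findall(r"[^()\s]+", smt2)
counts = Counter(tokens)

print(counts["select"], counts["store"], counts["not"])
```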
Evaluations

| Evaluation | Rating | Solver | Variant | Result | Wallclock | CPU Time |
| --- | --- | --- | --- | --- | --- | --- |
| SMT Evaluation 2013 | 0.20 (4/5) | CVC3 | CVC3-SMT-COMP-2010 default | sat ✅ | — | — |
| | | | CVC3-SMT-COMP-2011 default | sat ✅ | 0.01364 | — |
| | | | CVC3-SMT-COMP-2012 default | sat ✅ | 0.00945 | — |
| | | CVC4 | CVC4-SMT-COMP-2012-Resubmission default | sat ✅ | 0.01219 | — |
| | | | CVC4-SMT-EVAL-2013 default | sat ✅ | 0.01016 | — |
| | | MathSAT | MathSAT5-5.2.6-SMT-EVAL-2013 default | sat ✅ | — | — |
| | | | MathSAT5-SMT-COMP-2011 default | sat ✅ | 0.02112 | — |
| | | | MathSAT5-SMT-COMP-2012 default | sat ✅ | 0.02129 | — |
| | | veriT | veriT-SMT-EVAL-2013 default | unknown ❌ | 0.00929 | — |
| | | Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-EVAL-2013 default | sat ✅ | 0.00901 | — |
| | | | Z3-SMT-COMP-2011 default | sat ✅ | — | — |
| SMT-COMP 2015 | 0.17 (5/6) | CVC4 | CVC4-master-2015-06-15-9b32405-main default | sat ✅ | 0.01678 | 0.01500 |
| | | | CVC4-experimental-2015-06-15-ff5745a-main default | sat ✅ | 0.01435 | 0.01200 |
| | | MathSAT | MathSat 5.3.6 main smtcomp2015_main | sat ✅ | 0.02580 | 0.02499 |
| | | SMTInterpol | SMTInterpol v2.1-206-g86e9531 default | sat ✅ | 0.27320 | 0.34995 |
| | | veriT | veriT default | unknown ❌ | 0.00840 | 0.00200 |
| | | Yices2 | Yices default | sat ✅ | 0.00831 | 0.00200 |
| | | Z3 | z3 4.4.0 default | sat ✅ | 0.03017 | 0.02899 |
| SMT-COMP 2016 | 0.17 (5/6) | CVC4 | CVC4-master-2016-05-27-cfef263-main default | sat ✅ | 0.01630 | 0.01643 |
| | | MathSAT | mathsat-5.3.11-linux-x86_64-Main default | sat ✅ | 0.02079 | 0.02176 |
| | | SMTInterpol | smtinterpol-2.1-258-g92ab3df default | sat ✅ | 0.26002 | 0.35448 |
| | | veriT | veriT-dev default | unknown ❌ | 0.01265 | 0.00925 |
| | | Yices2 | Yices-2.4.2 default | sat ✅ | 0.01271 | 0.00305 |
| | | Z3 | z3-4.4.1 default | sat ✅ | 0.02949 | 0.03064 |
| SMT-COMP 2017 | 0.17 (5/6) | CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 0.01791 | 0.01714 |
| | | MathSAT | mathsat-5.4.1-linux-x86_64-Main default | sat ✅ | 0.01865 | 0.01899 |
| | | SMTInterpol | SMTInterpol default | sat ✅ | 0.27956 | 0.36267 |
| | | veriT | veriT-2017-06-17 default | unknown ❌ | 0.00788 | 0.00675 |
| | | Yices2 | Yices2-Main default | sat ✅ | 0.00878 | 0.00374 |
| | | Z3 | z3-4.5.0 default | sat ✅ | 0.03258 | 0.03104 |
| SMT-COMP 2018 | 0.33 (4/6) | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 0.01759 | 0.01777 |
| | | MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | unknown ❌ | 0.01799 | 0.01894 |
| | | SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | sat ✅ | 0.26818 | 0.37815 |
| | | veriT | veriT_default | unknown ❌ | 0.00913 | 0.00808 |
| | | Yices2 | Yices 2.6.0_default | sat ✅ | 0.00898 | 0.00649 |
| | | Z3 | z3-4.7.1_default | sat ✅ | 0.03411 | 0.03403 |
| SMT-COMP 2025 | | cvc5 | cvc5 | sat ✅ | 0.25862 | 0.14089 |
| | | OpenSMT | OpenSMT | sat ✅ | 0.29089 | 0.16270 |
| | | SMTInterpol | SMTInterpol | sat ✅ | 0.45718 | 0.44607 |
| | | Yices2 | Yices2 | sat ✅ | 0.28544 | 0.15505 |
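The Rating column appears to be the fraction of participating solvers that did not solve the benchmark, so 4 of 5 solved gives 1 - 4/5 = 0.20; this interpretation is an assumption inferred from the numbers shown, not documented on this page. A sketch:

```python
def rating(solved: int, total: int) -> float:
    # Fraction of solvers that did NOT solve the benchmark (assumed formula).
    return 1 - solved / total

# Matches the ratings shown: 0.20 (4/5), 0.17 (5/6), 0.33 (4/6).
print(round(rating(4, 5), 2), round(rating(5, 6), 2), round(rating(4, 6), 2))
```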