Benchmark
non-incremental/QF_ABV/20230321-UltimateAutomizerSvcomp2023/s3_clnt.blast.03.i.cil-1.c_2.smt2
Generated by the tool Ultimate Automizer [1,2], which implements
an automata-theoretic approach [3] to software verification.
This SMT script belongs to a set of SMT scripts that was generated by
applying Ultimate Automizer to benchmarks [4] from SV-COMP 2023 [5,6].
This script may not contain all SMT commands that Ultimate Automizer
issued. To meet the restrictions for SMT-COMP benchmarks,
we dropped the commands for getting values (resp. models),
unsatisfiable cores, and interpolants (the kinds of commands that were
removed are sketched below the reference list).
2023-03-21, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)
[1] https://ultimate.informatik.uni-freiburg.de/automizer/
[2] Matthias Heizmann, Max Barth, Daniel Dietsch, Leonard Fichtner,
Jochen Hoenicke, Dominik Klumpp, Mehdi Naouar, Tanja Schindler,
Frank Schüssele, Andreas Podelski: Ultimate Automizer and the
CommuHash Normal Form (Competition Contribution). TACAS 2023
[3] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
Checking for People Who Love Automata. CAV 2013
[4] https://github.com/sosy-lab/sv-benchmarks
[5] Dirk Beyer: Competition on Software Verification and
Witness Validation: SV-COMP 2023. TACAS 2023
[6] https://sv-comp.sosy-lab.org/2023/
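As a rough illustration of the stripping described above, the following minimal sketch shows what the dropped command kinds look like in SMT-LIB 2. It is not an excerpt of this benchmark; the logic line, declaration, and assertion are illustrative assumptions, and `get-interpolants` is a solver extension (e.g. SMTInterpol) rather than a standard command.

```smt2
(set-logic QF_ABV)
(declare-const x (_ BitVec 32))
(assert (bvult x (_ bv10 32)))
(check-sat)               ; kept: the table below reports exactly one check-sat call
; (get-value (x))         ; dropped: value queries
; (get-model)             ; dropped: model queries
; (get-unsat-core)        ; dropped: unsatisfiable cores
; (get-interpolants A B)  ; dropped: interpolant queries (solver extension)
(exit)
```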
| Property | Value |
| --- | --- |
| Size | 28789 |
| Compressed Size | 4438 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | industrial |
| First Occurrence | 2023-07-06 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | — |
| strict Dolmen OK | — |
| check-sat calls | 1 |
| Status | unknown |
| Inferred Status | sat |
| Size | 28781 |
| Compressed Size | 4462 |
| Max. Term Depth | 25 |
| Asserts | 4 |
| Declared Functions | 0 |
| Declared Constants | 68 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
Symbols
| Symbol | Count |
| --- | --- |
| not | 6 |
| and | 3 |
| = | 56 |
| let | 22 |
| extract | 36 |
| bvadd | 52 |
| bvult | 5 |
| sign_extend | 1 |
| select | 172 |
| store | 195 |
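For orientation, here is a tiny QF_ABV fragment using the operator kinds counted above (array `select`/`store`, `bvadd`, `extract`, `sign_extend`, `bvult`, `let`). It is purely illustrative; the identifiers and term shapes are assumptions and are not taken from the benchmark.

```smt2
(set-logic QF_ABV)
(declare-const mem (Array (_ BitVec 32) (_ BitVec 8)))
(declare-const p (_ BitVec 32))
(assert
  ; bind one byte read from an updated memory, then constrain it
  (let ((b (select (store mem p (_ bv1 8)) (bvadd p (_ bv4 32)))))
    (and (not (= b (_ bv0 8)))
         (bvult ((_ sign_extend 24) b) (_ bv255 32))
         (= ((_ extract 7 0) p) (_ bv0 8)))))
(check-sat)
```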
Evaluations
| Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
| --- | --- | --- | --- | --- | --- | --- |
| SMT-COMP 2023 | | Bitwuzla | Bitwuzla-fixed_default | sat ✅ | 0.22557 | 0.22534 |
| SMT-COMP 2023 | | cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 0.77287 | 0.77091 |
| SMT-COMP 2023 | | UltimateEliminator | UltimateIntBlastingWrapper+SMTInterpol_default | sat ✅ | 14.15260 | 42.52860 |
| SMT-COMP 2023 | | Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 0.17477 | 0.17463 |
| SMT-COMP 2023 | | Z3-Owl | z3-Owl-Final_default | sat ✅ | 1.83157 | 1.08742 |
| SMT-COMP 2023 | | | z3-Owl-Final_default | sat ✅ | 139.46900 | 139.38800 |
| SMT-COMP 2024 | | Bitwuzla | Bitwuzla | sat ✅ | 1.06216 | 0.96180 |
| SMT-COMP 2024 | | cvc5 | cvc5 | sat ✅ | 0.60951 | 0.51010 |
| SMT-COMP 2024 | | SMTInterpol | SMTInterpol | sat ✅ | 14.06103 | 37.47221 |
| SMT-COMP 2024 | | Yices2 | Yices2 | sat ✅ | 0.40889 | 0.30895 |
| SMT-COMP 2025 | | Bitwuzla | Bitwuzla | sat ✅ | 2.28724 | 2.16937 |
| SMT-COMP 2025 | | cvc5 | cvc5 | sat ✅ | 1.08819 | 0.95759 |
| SMT-COMP 2025 | | SMTInterpol | SMTInterpol | sat ✅ | 21.39601 | 38.57863 |
| SMT-COMP 2025 | | Yices2 | Yices2 | sat ✅ | 0.43623 | 0.30592 |
| SMT-COMP 2025 | | Z3 | Z3-Owl-base | sat ✅ | 877.38417 | 877.10650 |
| SMT-COMP 2025 | | Z3-Owl | Z3-Owl | sat ✅ | 172.94558 | 172.81125 |