Benchmark

non-incremental/QF_FP/20190429-UltimateAutomizerSvcomp2019/double_req_bl_0330b_true-unreach-call.c_9.smt2

|
Generated by the tool Ultimate Automizer [1,2], which implements
an automata-theoretic approach [3] to software verification.

This SMT script belongs to a set of SMT scripts that was generated by
applying Ultimate Automizer to benchmarks [4] from SV-COMP 2019 [5,6].
This script might _not_ contain all SMT commands that are used by
Ultimate Automizer: in order to satisfy the restrictions of
SMT-COMP, we had to drop, e.g., the commands for getting
values (resp. models), unsatisfiable cores, and interpolants.

2019-04-27, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)

[1] https://ultimate.informatik.uni-freiburg.de/automizer/
[2] Matthias Heizmann, Yu-Fang Chen, Daniel Dietsch, Marius Greitschus,
     Jochen Hoenicke, Yong Li, Alexander Nutz, Betim Musa, Christian
     Schilling, Tanja Schindler, Andreas Podelski: Ultimate Automizer
     and the Search for Perfect Interpolants - (Competition Contribution).
     TACAS (2) 2018: 447-451
[3] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
     Checking for People Who Love Automata. CAV 2013: 36-52
[4] https://github.com/sosy-lab/sv-benchmarks
[5] Dirk Beyer: Automatic Verification of C and Java Programs: SV-COMP 2019.
     TACAS (3) 2019: 133-155
[6] https://sv-comp.sosy-lab.org/2019/
|
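
For orientation, the following is a minimal sketch of the command subset
such a stripped script retains: declarations, assertions, and a single
check-sat, with the value-, model-, core-, and interpolant-extraction
commands removed. The sort and the assertion are invented for
illustration; only the command shape reflects the benchmark.

    (set-logic QF_FP)
    ; the real script declares 12 constants; one double-precision constant suffices here
    (declare-const x (_ FloatingPoint 11 53))
    (assert (fp.eq (fp.add roundNearestTiesToEven x x)
                   (fp.mul roundNearestTiesToEven x x)))
    (check-sat)
    ; dropped for SMT-COMP: e.g. (get-value (x)), (get-model), (get-unsat-core)
    (exit)
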
Benchmark
Size                3152
Compressed Size     1132
License             Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category            industrial
First Occurrence    2020-07-06
Generated By
Generated On
Generator
Dolmen OK           1
strict Dolmen OK    1
check-sat calls     1
Query 1
Status                        sat
Inferred Status               sat
Size                          3144
Compressed Size               1135
Max. Term Depth               13
Asserts                       2
Declared Functions            0
Declared Constants            12
Declared Sorts                0
Defined Functions             0
Defined Recursive Functions   0
Defined Sorts                 0
Constants                     0
Declared Datatypes            0

Symbols

Symbol   Occurrences
not      2
or       1
and      1
=        2
let      2
fp.add   5
fp.sub   2
fp.mul   7
fp.eq    2
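
As a rough illustration of how these operators combine, the hedged sketch
below uses a let binding to share a product subterm between two
floating-point operations, a pattern consistent with the let occurrences
counted above. The constant names and the expression itself are invented,
not taken from the benchmark.

    (set-logic QF_FP)
    (declare-const a (_ FloatingPoint 11 53))
    (declare-const b (_ FloatingPoint 11 53))
    ; the let-bound product t is reused on both sides of the comparison
    (assert
      (let ((t (fp.mul roundNearestTiesToEven a b)))
        (fp.eq (fp.add roundNearestTiesToEven t a)
               (fp.sub roundNearestTiesToEven t b))))
    (check-sat)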

Evaluations

Evaluation     Solver    Variant                                       Result    Wallclock (s)  CPU Time (s)
SMT-COMP 2020  Bitwuzla  Bitwuzla-fixed_default                        sat ✅    0.62568        0.62543
SMT-COMP 2020  COLIBRI   COLIBRI 20.5.25_default                       sat ✅    0.43613        0.39847
SMT-COMP 2020  CVC4      CVC4-sq-final_default                         sat ✅    0.16095        0.16124
SMT-COMP 2020  MathSAT   MathSAT5_default.sh                           sat ✅    0.11719        0.11697
SMT-COMP 2020  Par4      Par4-wrapped-sq_default                       sat ✅    0.20957        0.00672
SMT-COMP 2020  Z3        z3-4.8.8_default                              sat ✅    107.71700      107.65900
SMT-COMP 2021  Bitwuzla  Bitwuzla-fixed_default                        sat ✅    0.63610        0.63602
SMT-COMP 2021  COLIBRI   COLIBRI_21_06_23_default                      sat ✅    0.39788        0.39741
SMT-COMP 2021  COLIBRI   COLIBRI 20.5.25_default                       sat ✅    0.40525        0.40490
SMT-COMP 2021  COLIBRI   COLIBRI_21_05_28_default                      sat ✅    0.39735        0.39689
SMT-COMP 2021  CVC4      CVC4-sq-final_default                         sat ✅    0.16164        0.16195
SMT-COMP 2021  MathSAT   mathsat-5.6.6_default                         sat ✅    0.11488        0.11481
SMT-COMP 2021  Z3        z3-4.8.11_default                             sat ✅    7.38142        7.38117
SMT-COMP 2022  Bitwuzla  Bitwuzla-wrapped_default                      sat ✅    0.66083        0.66073
SMT-COMP 2022  COLIBRI   COLIBRI 22_06_18_default                      sat ✅    0.44564        0.44575
SMT-COMP 2022  cvc5      cvc5_default                                  sat ✅    0.13143        0.13206
SMT-COMP 2022  cvc5      cvc5-default-2022-07-02-b15e116-wrapped_sq    sat ✅    0.11880        0.11938
SMT-COMP 2022  MathSAT   MathSAT-5.6.8_default                         sat ✅    0.17746        0.17488
SMT-COMP 2022  Z3        z3-4.8.17_default                             sat ✅    133.01500      132.97000
SMT-COMP 2023  Bitwuzla  Bitwuzla-fixed_default                        sat ✅    0.01796        0.01790
SMT-COMP 2023  COLIBRI   COLIBRI 2023_05_10_default                    sat ✅    0.46003        0.45944
SMT-COMP 2023  cvc5      cvc5-default-2023-05-16-ea045f305_sq          sat ✅    0.14233        0.14247
SMT-COMP 2023  Z3-Owl    z3-Owl-Final_default                          sat ✅    0.68581        0.68591
SMT-COMP 2023  Z3-Owl    z3-Owl-Final_default                          sat ✅    113.85700      113.83900
SMT-COMP 2024  Bitwuzla  Bitwuzla                                      sat ✅    0.22008        0.12026
SMT-COMP 2024  COLIBRI   COLIBRI                                       sat ✅    0.56708        0.46724
SMT-COMP 2024  cvc5      cvc5                                          sat ✅    0.31458        0.21418
SMT-COMP 2025  Bitwuzla  Bitwuzla                                      sat ✅    0.29927        0.16659
SMT-COMP 2025  COLIBRI   COLIBRI                                       sat ✅    0.49771        0.37117
SMT-COMP 2025  Colibri2  colibri2                                      sat ✅    0.28301        0.16514
SMT-COMP 2025  cvc5      cvc5                                          sat ✅    0.35546        0.22988
SMT-COMP 2025  Z3        Z3-Owl-base                                   sat ✅    40.91774       40.79563
SMT-COMP 2025  Z3-Owl    Z3-Owl                                        sat ✅    34.59736       34.47500