Benchmark

non-incremental/QF_FP/20190429-UltimateAutomizerSvcomp2019/water_pid_true-unreach-call_true-termination.c_0.smt2

|
Generated by the tool Ultimate Automizer [1,2] which implements
an automata theoretic approach [3] to software verification.

This SMT script belongs to a set of SMT scripts that was generated by
applying Ultimate Automizer to benchmarks [4] from the SV-COMP 2019 [5,6].
This script might _not_ contain all SMT commands that are used by
Ultimate Automizer. In order to satisfy the restrictions of
the SMT-COMP we have to drop e.g., the commands for getting
values (resp. models), unsatisfiable cores and interpolants.

2019-04-27, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)

[1] https://ultimate.informatik.uni-freiburg.de/automizer/
[2] Matthias Heizmann, Yu-Fang Chen, Daniel Dietsch, Marius Greitschus,
     Jochen Hoenicke, Yong Li, Alexander Nutz, Betim Musa, Christian
     Schilling, Tanja Schindler, Andreas Podelski: Ultimate Automizer
     and the Search for Perfect Interpolants - (Competition Contribution).
     TACAS (2) 2018: 447-451
[3] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
     Checking for People Who Love Automata. CAV 2013:36-52
[4] https://github.com/sosy-lab/sv-benchmarks
[5] Dirk Beyer: Automatic Verification of C and Java Programs: SV-COMP 2019.
     TACAS (3) 2019: 133-155
[6] https://sv-comp.sosy-lab.org/2019/
|
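For illustration, a minimal sketch of the kind of post-query commands the note above says are dropped before submission; the logic matches this benchmark (QF_FP), but the constant names x and y and the assertion are hypothetical and not taken from the file:

    (set-option :produce-models true)
    (set-logic QF_FP)
    (declare-const x Float64)
    (declare-const y Float64)
    (assert (= y (fp.add roundNearestTiesToEven x x)))
    (check-sat)
    ; a command like the following is what gets removed for SMT-COMP:
    (get-value (x y))   ; ask for model values after a sat answer
    (exit)

get-value and get-unsat-core are standard SMT-LIB commands (the latter would follow an unsat answer with :produce-unsat-cores enabled); interpolant queries are solver-specific extensions rather than part of the SMT-LIB standard.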
Benchmark
Size: 3677
Compressed Size: 1280
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2020-07-06
Generated By:
Generated On:
Generator:
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: sat
Inferred Status: sat
Size: 3669
Compressed Size: 1276
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 14
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

and: 1
=: 5
fp.add: 3
fp.sub: 2
fp.mul: 3
fp.div: 2
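As a hedged illustration of how these occurrence counts arise, the sketch below uses each listed operator inside a single assert, mirroring the one assert reported for this query; all constant names are hypothetical and the formula is not the benchmark's actual assertion:

    (set-logic QF_FP)
    (declare-const a Float64)
    (declare-const b Float64)
    (declare-const c Float64)
    (assert (and
      (= c (fp.add roundNearestTiesToEven a b))
      (= a (fp.sub roundNearestTiesToEven c b))
      (= b (fp.mul roundNearestTiesToEven a a))
      (= c (fp.div roundNearestTiesToEven b a))))
    (check-sat)
    (exit)

In the actual benchmark the counts are tallied the same way across its single asserted formula, e.g. fp.add occurring three times and = five times.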

Evaluations

Evaluation | Solver | Variant | Result | Rating | Wallclock (s) | CPU Time (s)
SMT-COMP 2020 Bitwuzla Bitwuzla-fixed_default sat ✅ 0.71499 0.71496
COLIBRI COLIBRI 20.5.25_default sat ✅ 0.44349 0.44402
CVC4 CVC4-sq-final_default sat ✅ 0.01440 0.01473
MathSAT MathSAT5_default.sh sat ✅ 0.01716 0.01711
Par4 Par4-wrapped-sq_default sat ✅ 0.02125 0.00647
Z3 z3-4.8.8_default sat ✅ 503.56500 503.55200
SMT-COMP 2021 Bitwuzla Bitwuzla-fixed_default sat ✅ 0.70516 0.70512
COLIBRI COLIBRI_21_06_23_default sat ✅ 0.38294 0.38341
COLIBRI 20.5.25_default sat ✅ 0.42402 0.42456
COLIBRI_21_05_28_default sat ✅ 0.38381 0.38426
CVC4 CVC4-sq-final_default sat ✅ 0.01498 0.01528
MathSAT mathsat-5.6.6_default sat ✅ 0.01729 0.01725
Z3 z3-4.8.11_default sat ✅ 113.69700 113.69700
SMT-COMP 2022 Bitwuzla Bitwuzla-wrapped_default sat ✅ 0.72842 0.72836
COLIBRI COLIBRI 22_06_18_default sat ✅ 0.44356 0.44365
cvc5 cvc5_default sat ✅ 0.01736 0.01795
cvc5-default-2022-07-02-b15e116-wrapped_sq sat ✅ 0.01921 0.01979
MathSAT MathSAT-5.6.8_default sat ✅ 0.01598 0.01593
Z3 z3-4.8.17_default sat ✅ 120.13800 120.12400
SMT-COMP 2023 Bitwuzla Bitwuzla-fixed_default sat ✅ 0.01634 0.00545
COLIBRI COLIBRI 2023_05_10_default sat ✅ 0.46117 0.46147
cvc5 cvc5-default-2023-05-16-ea045f305_sq sat ✅ 0.01716 0.01766
Z3-Owl z3-Owl-Final_default sat ✅ 0.68508 0.68500
z3-Owl-Final_default sat ✅ 251.64000 251.60900
SMT-COMP 2024 Bitwuzla Bitwuzla sat ✅ 0.21367 0.11331
COLIBRI COLIBRI sat ✅ 0.61873 0.48353
cvc5 cvc5 sat ✅ 0.22071 0.12113
SMT-COMP 2025 Bitwuzla Bitwuzla sat ✅ 0.28104 0.15552
COLIBRI COLIBRI sat ✅ 0.53401 0.41379
Colibri2 colibri2 sat ✅ 0.31774 0.19789
cvc5 cvc5 sat ✅ 0.26268 0.14060
Z3 Z3-Owl-base sat ✅ 942.66580 942.44926
Z3-Owl Z3-Owl sat ✅ 96.56628 96.41960