Benchmark

non-incremental/BV/20170501-Heizmann-UltimateAutomizer/gcd_2_true-unreach-call_true-no-overflow.i_921.smt2

Generated by the tool Ultimate Automizer [1,2], which implements
an automata-theoretic approach [3] to software verification.

This SMT script belongs to a set of SMT scripts that were generated by
applying Ultimate Automizer to benchmarks [4] from SV-COMP 2017 [5,6].

This script might _not_ contain all SMT commands that are used by
Ultimate Automizer. In order to satisfy the restrictions of
SMT-COMP, we had to drop, e.g., the commands for getting
values (resp. models), unsatisfiable cores, and interpolants.
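As an illustration (not taken from this benchmark), the kinds of commands that get stripped look like the following in SMT-LIB; the command names are standard, while the constant and label names here are hypothetical:

```smt2
; Hypothetical sketch of the commands removed for SMT-COMP compliance;
; this is not the benchmark's actual content.
(set-option :produce-models true)
(set-option :produce-unsat-cores true)
(declare-const x (_ BitVec 32))
(assert (! (bvsgt x (_ bv0 32)) :named A0))
(check-sat)
(get-value (x))       ; query model values        -- dropped
(get-unsat-core)      ; query an unsatisfiable core (meaningful after unsat) -- dropped
(get-interpolants A0) ; interpolants are a solver-specific extension
                      ; (e.g. of SMTInterpol)     -- dropped
```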

2017-05-01, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/automizer/
[2] Matthias Heizmann, Yu-Wen Chen, Daniel Dietsch, Marius Greitschus, 
Alexander Nutz, Betim Musa, Claus Schätzle, Christian Schilling, 
Frank Schüssele, Andreas Podelski:
Ultimate Automizer with an On-Demand Construction of Floyd-Hoare 
Automata - (Competition Contribution). TACAS (2) 2017: 394-398
[3] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
Checking for People Who Love Automata. CAV 2013: 36-52
[4] https://github.com/sosy-lab/sv-benchmarks
[5] Dirk Beyer: Software Verification with Validation of Results - 
(Report on SV-COMP 2017). TACAS (2) 2017: 331-349
[6] https://sv-comp.sosy-lab.org/2017/
Benchmark

Size: 3125
Compressed Size: 1160
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2017-07-23
Generated By:
Generated On:
Generator:
Dolmen: OK (1)
strict Dolmen: OK (1)
check-sat calls: 1
Query 1

Status: sat
Inferred Status: sat
Size: 3117
Compressed Size: 1163
Max. Term Depth: 11
Asserts: 3
Declared Functions: 0
Declared Constants: 6
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

not: 6   or: 4   and: 1   =: 8
forall: 2   BitVec: 2   extract: 3   bvsrem: 3
bvslt: 1   bvsgt: 2   bvsge: 2   sign_extend: 26
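For orientation, the operators counted above are standard SMT-LIB bit-vector constructs. A minimal, hypothetical script (not the benchmark's actual assertions) exercising several of them:

```smt2
; Hypothetical sketch using the benchmark's operator vocabulary;
; constants a and b are illustrative.
(set-logic BV)
(declare-const a (_ BitVec 32))
(declare-const b (_ BitVec 32))
(assert (bvsgt b (_ bv0 32)))                    ; signed greater-than
(assert (bvsge a b))                             ; signed greater-or-equal
(assert (= (bvsrem a b) (_ bv0 32)))             ; signed remainder
(assert (bvslt ((_ extract 7 0) a) (_ bv16 8)))  ; extract the low byte
; sign_extend widens a value, preserving its two's-complement interpretation
(assert (forall ((c (_ BitVec 32)))
  (or (not (= c a))
      (= ((_ sign_extend 32) c) ((_ sign_extend 32) a)))))
(check-sat)
```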

Evaluations

Evaluation     Rating       Solver              Variant                                                Result      Wallclock (s)  CPU Time (s)
SMT-COMP 2017               Boolector           Boolector SMT17 final boolector                        sat ✅      39.46600       78.27870
                            CVC4                CVC4-smtcomp2017-main default                          sat ✅      63.94390       63.93350
                            Q3B                 Q3B default                                            sat ✅      1.49670        4.21994
                            Z3                  z3-4.5.0 default                                       sat ✅      5.20206        5.19642
SMT-COMP 2018  0.25 (3/4)   Boolector           Boolector_default                                      sat ✅      269.76100      538.68600
                            CVC4                master-2018-06-10-b19c840-competition-default_default  unknown ❌  1200.02000     1143.56000
                            Q3B                 Q3B_default                                            unsat ✅    0.42264        1.17854
                            Z3                  z3-4.7.1_default                                       sat ✅      9.25704        9.25696
SMT-COMP 2020  0.20 (4/5)   Bitwuzla            Bitwuzla-fixed_default                                 sat ✅      3.22939        9.69451
                            CVC4                CVC4-sq-final_default                                  sat ✅      545.86100      522.12200
                            Par4                Par4-wrapped-sq_default                                sat ✅      36.57610       144.58000
                            UltimateEliminator  UltimateEliminator+MathSAT-5.6.3_s_default             unknown ❌  2.30696        3.45469
                            Z3                  z3-4.8.8_default                                       sat ✅      1.28846        1.28829
SMT-COMP 2021  0.25 (3/4)   Par4                Par4-wrapped-sq_default                                sat ✅      8.45178        33.20000
                            UltimateEliminator  UltimateEliminator+MathSAT-5.6.6_default               unknown ❌  2.96110        4.88375
                            YicesQS             yices-QS-2021-06-13under10_default                     sat ✅      4.65199        4.65080
                            Z3                  z3-4.8.11_default                                      sat ✅      15.99760       15.99730
SMT-COMP 2024  0.20 (4/5)   Bitwuzla            Bitwuzla                                               sat ✅      0.40884        0.28997
                            cvc5                cvc5                                                   sat ✅      161.31709      161.21307
                            SMTInterpol         SMTInterpol                                            unknown ❌  1.32345        3.49109
                            YicesQS             YicesQS                                                sat ✅      0.27124        0.17059
                            Z3alpha             Z3-alpha                                               sat ✅      8.63856        8.53813
SMT-COMP 2025  0.40 (3/5)   Bitwuzla            Bitwuzla                                               sat ✅      0.35240        0.22774
                            cvc5                cvc5                                                   sat ✅      80.77120       80.64109
                            SMTInterpol         SMTInterpol                                            unknown ❌  1.01253        2.33846
                            UltimateEliminator  UltimateEliminator+MathSAT                             unknown ❌  1.98781        4.23278
                            YicesQS             YicesQS                                                sat ✅      0.29348        0.17659