Benchmark

non-incremental/QF_ANIA/20190429-UltimateAutomizerSvcomp2019/usb_urb-drivers-hid-usbhid-usbmouse.ko_false-unreach-call.cil.out.i_AllErrorsAtOnce_Iteration21_TraceCheck_0.smt2

|
Generated by the tool Ultimate Automizer [1,2], which implements
an automata-theoretic approach [3] to software verification.

This SMT script belongs to a set of SMT scripts that was generated by
applying Ultimate Automizer to benchmarks [4] from SV-COMP 2019 [5,6].
This script might _not_ contain all SMT commands that are used by
Ultimate Automizer: in order to satisfy the restrictions of SMT-COMP,
the commands for getting values (resp. models), unsatisfiable cores,
and interpolants were dropped (a sketch of such commands follows this
header).

2019-04-27, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)

[1] https://ultimate.informatik.uni-freiburg.de/automizer/
[2] Matthias Heizmann, Yu-Fang Chen, Daniel Dietsch, Marius Greitschus,
     Jochen Hoenicke, Yong Li, Alexander Nutz, Betim Musa, Christian
     Schilling, Tanja Schindler, Andreas Podelski: Ultimate Automizer
     and the Search for Perfect Interpolants - (Competition Contribution).
     TACAS (2) 2018: 447-451
[3] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
     Checking for People Who Love Automata. CAV 2013: 36-52
[4] https://github.com/sosy-lab/sv-benchmarks
[5] Dirk Beyer: Automatic Verification of C and Java Programs: SV-COMP 2019.
     TACAS (3) 2019: 133-155
[6] https://sv-comp.sosy-lab.org/2019/
|
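
The header above notes that model-, core-, and interpolant-related commands
were removed to meet SMT-COMP restrictions. For context, here is a minimal
sketch (not taken from this benchmark) of the kinds of standard SMT-LIB
commands such a script loses:

    (set-option :produce-models true)
    (set-logic QF_LIA)
    (declare-const x Int)
    (assert (>= x 0))
    (check-sat)            ; sat
    ; commands of this kind are dropped for the SMT-COMP version:
    ; (get-value (x))      ; retrieve the value of x in a model
    ; (get-model)          ; retrieve the full model
    ; (get-unsat-core)     ; retrieve an unsatisfiable core (after unsat)

Interpolant queries (e.g. SMTInterpol's get-interpolants) are solver-specific
extensions rather than part of the SMT-LIB standard, so they likewise cannot
appear in competition scripts.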
Benchmark

  Size:               189131 bytes
  Compressed Size:    21755 bytes
  License:            Creative Commons Attribution 4.0 International (CC-BY-4.0)
  Category:           industrial
  First Occurrence:   2020-07-06
  Generated By:
  Generated On:
  Generator:
  Dolmen OK:          1
  strict Dolmen OK:   1
  check-sat calls:    1

Query 1

  Status:                        unknown
  Inferred Status:               sat
  Size:                          189123 bytes
  Compressed Size:               21539 bytes
  Max. Term Depth:               48
  Asserts:                       1433
  Declared Functions:            0
  Declared Constants:            768
  Declared Sorts:                0
  Defined Functions:             0
  Defined Recursive Functions:   0
  Defined Sorts:                 0
  Constants:                     0
  Declared Datatypes:            0
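
The "Status" field reflects the status annotated in the script itself, while
the "Inferred Status" is derived from the agreeing solver answers recorded in
the evaluations below. In SMT-LIB terms, this corresponds to a header along
these lines (a sketch, not the literal file contents):

    (set-info :status unknown)   ; the script does not claim sat or unsat
    ; ... 1433 assertions over 768 declared constants ...
    (check-sat)                  ; the solvers below consistently answer: sat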

Symbols

  Symbol   Occurrences
  true     78
  false    14
  ite      13
  not      52
  or       7
  and      1
  =        203
  let      57
  mod      22
  +        350
  -        15
  *        4
  <        13
  <=       662
  >=       614
  select   635
  store    349
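
These counts are consistent with the benchmark's QF_ANIA logic
(quantifier-free arrays plus nonlinear integer arithmetic): array reads and
writes (select, store) and integer comparisons dominate, with mod and *
supplying the arithmetic part. A minimal, hypothetical QF_ANIA script using
the same symbols:

    (set-logic QF_ANIA)
    (declare-const a (Array Int Int))
    (declare-const i Int)
    (assert (<= 0 i))
    ; store a nonlinear term, read it back, and constrain it with mod
    (assert (= (select (store a i (* i i)) i) (mod (* i i) 7)))
    (check-sat)   ; sat, e.g. for i = 0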

Evaluations

Evaluation      Rating       Solver        Variant                                       Result       Wallclock (s)   CPU Time (s)
SMT-COMP 2020   0.75 (1/4)   Alt-Ergo      Alt-Ergo-SMTComp-2020_default                 unknown ❌       2.49556         8.35180
                             CVC4          CVC4-sq-final_default                         sat ✅          25.61560        25.61600
                             CVC4          CVC4-2019-06-03-d350fe1-wrapped-sq_default    sat ✅           5.33072         5.33036
                             MathSAT       MathSAT5_default.sh                           unknown ❌       0.04268         0.04265
                             Z3            z3-4.8.8_default                              unknown ❌    1200.07000      1199.70000
SMT-COMP 2021   0.50 (2/4)   CVC4          CVC4-2019-06-03-d350fe1-wrapped-sq_default    sat ✅          10.60990        10.60720
                             CVC4          CVC4-sq-final_default                         sat ✅           8.41358         8.41373
                             cvc5          cvc5-fixed_default                            sat ✅          10.60170        10.59860
                             MathSAT       mathsat-5.6.6_default                         unknown ❌       0.04278         0.04274
                             Z3            z3-4.8.11_default                             unknown ❌    1200.02000      1198.89000
SMT-COMP 2022   0.50 (2/4)   CVC4          CVC4-sq-final_default                         sat ✅           5.16973         5.16928
                             cvc5          cvc5-default-2022-07-02-b15e116-wrapped_sq    sat ✅         470.21800       470.20000
                             MathSAT       MathSAT-5.6.8_default                         unknown ❌       0.04435         0.04429
                             Z3            z3-4.8.17_default                             unknown ❌    1200.03000      1199.69000
SMT-COMP 2023                CVC4          CVC4-sq-final_default                         sat ✅           6.08293         6.07986
                             cvc5          cvc5-default-2023-05-16-ea045f305_sq          sat ✅          11.94680        11.94620
                             SMTInterpol   smtinterpol-2.5-1272-g2d6d356c_default        sat ✅           3.07656         9.29014
                             Yices2        Yices 2 for SMTCOMP 2023_default              sat ✅           4.56820         4.56815
SMT-COMP 2024                cvc5          cvc5                                          sat ✅          10.43211        10.32520
                             SMTInterpol   SMTInterpol                                   sat ✅           3.16819         9.29489
                             Yices2        Yices2                                        sat ✅           1.83720         1.73676
SMT-COMP 2025                cvc5          cvc5                                          sat ✅           6.96738         6.84708
                             SMTInterpol   SMTInterpol                                   sat ✅           1.88592         5.30962
                             Yices2        Yices2                                        sat ✅           2.33731         2.21927