Benchmark

non-incremental/QF_NIA/UltimateLassoRanker/b.09-no-inv_assume-alloca_true-termination.c.i_Iteration4_Lasso+nonterminationTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of
lasso-shaped programs. This script contains the SMT commands that Ultimate
LassoRanker issued while checking whether a lasso-shaped program has a geometric
nontermination argument [2].
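To illustrate the concept behind this benchmark, here is a small sketch (not taken from the benchmark itself; the loop, function names, and candidate values are our own hypothetical example) of what a geometric nontermination argument [2] certifies: a start state x0, a ray y, and a ratio lam such that the state sequence x0, x0 + y, x0 + y + lam*y, ... satisfies the loop guard forever and follows the loop's update, witnessing nontermination.

```python
# Hypothetical illustration of a geometric nontermination argument for a
# one-variable linear lasso loop; this is a finite sanity check of a
# candidate certificate, not the actual constraint solving done via SMT.

def check_geometric_argument(guard, update, x0, y, lam, steps=50):
    """Check the first `steps` states of the geometric series
    x0, x0 + y, x0 + y + lam*y, ... against guard and update."""
    x = x0
    step = y
    for _ in range(steps):
        if not guard(x):          # every state must satisfy the loop guard
            return False
        nxt = x + step            # next state along the geometric series
        if update(x) != nxt:      # the series must match the loop's update
            return False
        x, step = nxt, lam * step # the increment shrinks/grows by ratio lam
    return True

# Example loop: while x >= 1: x = 2*x  -- nonterminating from x0 = 1.
# Candidate argument x0 = 1, y = 1, lam = 2 yields states 1, 2, 4, 8, ...
ok = check_geometric_argument(guard=lambda x: x >= 1,
                              update=lambda x: 2 * x,
                              x0=1, y=1, lam=2)
print(ok)  # True: all checked states satisfy guard and update
```

In the benchmark itself, LassoRanker instead encodes the existence of such x0, y, and lam as a nonlinear arithmetic (QF_NIA) satisfiability query; the `unsat` status below means no argument of the attempted template exists for this lasso.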

This SMT script belongs to a set of SMT scripts that were generated by applying
Ultimate Büchi Automizer [3,4] to the benchmarks from SV-COMP 2015 [5,6],
which are available at [7]. Ultimate Büchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.

2015-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann:
Geometric Series as Nontermination Arguments for Linear Lasso Programs. 
CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model 
Checking for People Who Love Automata. CAV 2013:36-52
[5] http://sv-comp.sosy-lab.org/2015/
[6] Dirk Beyer: Software Verification and Verifiable Witnesses - (Report on 
SV-COMP 2015). TACAS 2015: 401-416
[7] https://svn.sosy-lab.org/software/sv-benchmarks/tags/svcomp15/
Benchmark
Size: 5510514
Compressed Size: 19720
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2015-07-02
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 5510505
Compressed Size: 19707
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 123
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

or: 2
and: 37
=: 948
+: 14388
-: 1566
*: 115716
>=: 1513

Evaluations

Evaluation  Rating  Solver  Variant  Result  Wallclock time (s)  CPU time (s)
SMT-COMP 2015 0.33 (4/6) AProVE AProVE NIA 2014 default unknown ❌ 2400.02000 2512.37000
CVC3 CVC3 default unsat ✅ 1.01963 1.01984
CVC4 CVC4-master-2015-06-15-9b32405-main default unsat ✅ 1.11634 1.11483
CVC4-experimental-2015-06-15-ff5745a-main default unsat ✅ 1.09818 1.09583
raSAT raSAT default.sh unknown ❌ 654.51800 654.59000
SMT-RAT SMT-RAT-final default unsat ✅ 3.81063 3.81142
SMT-RAT-NIA-Parallel-final default unsat ✅ 3.02626 3.67044
Z3 z3 4.4.0 default unsat ✅ 0.51945 0.51992
SMT-COMP 2016 0.57 (3/7) AProVE AProVE NIA 2014 default unknown ❌ 2400.03000 2453.14000
CVC4 CVC4-master-2016-05-27-cfef263-main default unsat ✅ 1.12091 1.12184
ProB ProB competition unknown ❌ 2400.10000 2401.48000
raSAT raSAT 0.3 default.sh unknown ❌ 2400.01000 2401.18000
raSAT 0.4 exp - final default.py unknown ❌ 845.47000 1696.37000
SMT-RAT SMT-RAT default unknown ❌ 1.89294 1.89400
Yices2 Yices-2.4.2 default unsat ✅ 0.11054 0.11051
Z3 z3-4.4.1 default unsat ✅ 0.52011 0.52163
SMT-COMP 2017 0.20 (4/5) AProVE AProVE NIA 2014 default unknown ❌ 600.05800 623.95000
CVC4 CVC4-smtcomp2017-main default unsat ✅ 1.12687 1.12636
SMT-RAT SMTRAT-comp2017_2 default unsat ✅ 2.62109 2.62055
Yices2 Yices2-Main default unsat ✅ 0.10125 0.10049
Z3 z3-4.5.0 default unsat ✅ 0.58631 0.57861
SMT-COMP 2018 0.20 (4/5) AProVE AProVE NIA 2014_default unknown ❌ 1200.04000 1229.93000
CVC4 master-2018-06-10-b19c840-competition-default_default unsat ✅ 1.17794 1.17799
SMT-RAT SMTRAT-Rat-final_default unsat ✅ 2.82494 2.82455
Yices2 Yices 2.6.0_default unsat ✅ 0.08292 0.08283
Z3 z3-4.7.1_default unsat ✅ 0.56042 0.56041
SMT-COMP 2019 0.25 (6/8) AProVE AProVE NIA 2014-wrapped-sq_default unknown ❌ 2400.07000 2434.73000
CVC4 CVC4-2019-06-03-d350fe1-wrapped-sq_default unsat ✅ 1.35395 1.35436
CVC4-SymBreak_03_06_2019-wrapped-sq_default unsat ✅ 5.36115 5.35003
master-2018-06-10-b19c840-competition-default_default unsat ✅ 1.21612 1.21636
MathSAT mathsat-20190601-wrapped-sq_default unsat ✅ 0.24194 0.24196
mathsat-na-20190601-wrapped-sq_default unsat ✅ 0.24093 0.24097
Par4 Par4-wrapped-sq_default unsat ✅ 0.09604 0.00563
ProB ProB-wrapped-sq_default unknown ❌ 2400.02000 2399.98000
SMT-RAT SMTRAT-5-wrapped-sq_default unsat ✅ 2.82658 2.82672
Yices2 Yices 2.6.2-wrapped-sq_default unsat ✅ 0.09702 0.09703
Z3 z3-4.8.4-d6df51951f4c-wrapped-sq_default unsat ✅ 0.46944 0.46944
SMT-COMP 2020 0.14 (6/7) AProVE AProVE NIA 2014_default unknown ❌ 1200.03000 1227.21000
CVC4 CVC4-sq-final_default unsat ✅ 2.49288 2.49312
MathSAT MathSAT5_default.sh unsat ✅ 0.22591 0.22586
Par4 Par4-wrapped-sq_default unsat ✅ 0.09288 0.00614
SMT-RAT smtrat-SMTCOMP_default unsat ✅ 2.69769 2.69757
Yices2 Yices 2.6.2 bug fix_default unsat ✅ 0.12347 0.12343
Z3 z3-4.8.8_default unsat ✅ 0.44706 0.44710
SMT-COMP 2022 cvc5 cvc5-default-2022-07-02-b15e116-wrapped_sq unsat ✅ 2.01863 2.01906
MathSAT MathSAT-5.6.8_default unsat ✅ 0.23629 0.23617
Par4 Par4-wrapped-sq_default unsat ✅ 0.11906 0.00700
Yices2 Yices 2.6.2 for SMTCOMP 2021_default unsat ✅ 0.12391 0.12386
Yices-ismt yices-ismt-0721_default unsat ✅ 0.17699 0.17704
Z3 z3-4.8.17_default unsat ✅ 0.43031 0.43223
Z3++ z3++0715_default unsat ✅ 0.18269 0.18273
SMT-COMP 2023 cvc5 cvc5-default-2023-05-16-ea045f305_sq unsat ✅ 0.59441 0.59079
Yices2 Yices 2 for SMTCOMP 2023_default unsat ✅ 0.10977 0.10972
Yices-ismt yices-ismt-sq-0526_default unsat ✅ 0.17646 0.17643
Z3alpha z3alpha_default unsat ✅ 0.37302 0.37321
Z3++ z3++0715_default unsat ✅ 0.19084 0.19086
Z3++_sq_0526_default unsat ✅ 3.10094 3.10096
SMT-COMP 2024 cvc5 cvc5 unsat ✅ 0.54833 0.44889
SMTInterpol SMTInterpol unsat ✅ 2.98376 7.03924
Yices2 Yices2 unsat ✅ 0.30027 0.20040
Z3alpha Z3-alpha unsat ✅ 0.57878 0.47901
SMT-COMP 2025 cvc5 cvc5 unsat ✅ 0.58508 0.45813
SMTInterpol SMTInterpol unsat ✅ 2.43715 5.66996
Yices2 Yices2 unsat ✅ 0.44913 0.32363
Z3alpha Z3-alpha unsat ✅ 0.55298 0.52715
Z3 Z3-alpha-base unsat ✅ 0.52540 0.39709
z3siri-base unsat ✅ 0.49918 0.38020