Benchmark

non-incremental/QF_NIA/UltimateLassoRanker/LarrazOliverasRodriguez-CarbonellRubio-2013FMCAD-Fig1-alloca_unknown-termination.c.i_Iteration7_Lasso+nonterminationTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of 
lasso-shaped programs. This script contains the SMT commands that Ultimate 
LassoRanker used while checking if a lasso-shaped program has a geometric 
nontermination argument [2].
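As a rough intuition for what a geometric nontermination argument [2] certifies (this sketch is illustrative and not taken from the benchmark itself): nontermination of a loop is witnessed by a state sequence of the form x_t = x_0 + Σ_{i<t} λ^i · y in which every state satisfies the loop guard and every successor is exactly the loop update of its predecessor. For the hypothetical loop `while x >= 1: x = 2 * x`, the witness x_0 = 1, y = 1, λ = 2 yields the states 1, 2, 4, 8, …:

```python
def loop_guard(x):
    # Guard of the hypothetical loop "while x >= 1: x = 2 * x".
    return x >= 1

def loop_update(x):
    # Update of the same hypothetical loop.
    return 2 * x

def geometric_state(x0, y, lam, t):
    """State after t iterations as prescribed by the geometric argument:
    x_t = x0 + sum_{i<t} lam^i * y."""
    return x0 + sum(lam ** i * y for i in range(t))

def check_argument(x0, y, lam, steps):
    """Spot-check the first `steps` iterations: each state must satisfy
    the guard, and the next state must equal the loop update of the
    current one, i.e. the sequence is a genuine infinite execution."""
    for t in range(steps):
        cur = geometric_state(x0, y, lam, t)
        nxt = geometric_state(x0, y, lam, t + 1)
        if not (loop_guard(cur) and nxt == loop_update(cur)):
            return False
    return True

# With x0 = 1, y = 1, lambda = 2 the states are 2^t, so the check passes.
print(check_argument(1, 1, 2, 50))  # → True
```

The actual SMT script encodes the existence of such witnesses (the λ and y above) as constraints over the program's transition relation, which is why it is a satisfiability query rather than a simulation like this one.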

This SMT script belongs to a set of SMT scripts that were generated by applying
Ultimate Buchi Automizer [3,4] to the benchmarks of SV-COMP 2015 [5,6],
which are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.

2015-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann:
Geometric Series as Nontermination Arguments for Linear Lasso Programs. 
CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model 
Checking for People Who Love Automata. CAV 2013:36-52
[5] http://sv-comp.sosy-lab.org/2015/
[6] Dirk Beyer: Software Verification and Verifiable Witnesses - (Report on 
SV-COMP 2015). TACAS 2015: 401-416
[7] https://svn.sosy-lab.org/software/sv-benchmarks/tags/svcomp15/
Benchmark
Size 3970567
Compressed Size 16605
License Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category industrial
First Occurrence 2015-07-02
Generated By
Generated On
Generator
Dolmen OK
strict Dolmen OK
check-sat calls 1
Query 1
Status unsat
Inferred Status unsat
Size 3970558
Compressed Size 16592
Max. Term Depth 7
Asserts 1
Declared Functions 0
Declared Constants 138
Declared Sorts 0
Defined Functions 0
Defined Recursive Functions 0
Defined Sorts 0
Constants 0
Declared Datatypes 0

Symbols

Symbol Count
or 2
and 31
= 522
+ 7326
- 1248
* 97614
>= 1141

Evaluations

Evaluation Rating Solver Variant Result Wallclock (s) CPU Time (s)
SMT-COMP 2015 0.33 (4/6) AProVE AProVE NIA 2014 default unknown ❌ 2400.02000 2419.13000
CVC3 CVC3 default unsat ✅ 0.88367 0.88387
CVC4 CVC4-master-2015-06-15-9b32405-main default unsat ✅ 0.92243 0.92086
CVC4-experimental-2015-06-15-ff5745a-main default unsat ✅ 0.89980 0.89786
raSAT raSAT default.sh unknown ❌ 643.11800 643.35000
SMT-RAT SMT-RAT-final default unsat ✅ 1.86189 1.86172
SMT-RAT-NIA-Parallel-final default unsat ✅ 1.88175 1.88971
Z3 z3 4.4.0 default unsat ✅ 0.59932 0.59991
SMT-COMP 2016 0.43 (4/7) AProVE AProVE NIA 2014 default unknown ❌ 2400.07000 2444.27000
CVC4 CVC4-master-2016-05-27-cfef263-main default unsat ✅ 0.91961 0.92073
ProB ProB competition unknown ❌ 2400.05000 2401.49000
raSAT raSAT 0.3 default.sh unknown ❌ 2400.10000 2401.46000
raSAT 0.4 exp - final default.py unknown ❌ 729.11100 1463.32000
SMT-RAT SMT-RAT default unsat ✅ 1.44160 1.44239
Yices2 Yices-2.4.2 default unsat ✅ 0.08831 0.08831
Z3 z3-4.4.1 default unsat ✅ 0.29042 0.29185
SMT-COMP 2017 0.20 (4/5) AProVE AProVE NIA 2014 default unknown ❌ 600.02600 621.54000
CVC4 CVC4-smtcomp2017-main default unsat ✅ 0.92807 0.92816
SMT-RAT SMTRAT-comp2017_2 default unsat ✅ 1.95120 1.93902
Yices2 Yices2-Main default unsat ✅ 0.08160 0.08152
Z3 z3-4.5.0 default unsat ✅ 0.77516 0.77475
SMT-COMP 2018 0.20 (4/5) AProVE AProVE NIA 2014_default unknown ❌ 1200.08000 1220.70000
CVC4 master-2018-06-10-b19c840-competition-default_default unsat ✅ 0.98059 0.98081
SMT-RAT SMTRAT-Rat-final_default unsat ✅ 2.11616 2.11608
Yices2 Yices 2.6.0_default unsat ✅ 0.06772 0.06767
Z3 z3-4.7.1_default unsat ✅ 0.44088 0.44086
SMT-COMP 2019 0.25 (6/8) AProVE AProVE NIA 2014-wrapped-sq_default unknown ❌ 2400.06000 2461.37000
CVC4 CVC4-2019-06-03-d350fe1-wrapped-sq_default unsat ✅ 1.14047 1.14078
CVC4-SymBreak_03_06_2019-wrapped-sq_default unsat ✅ 3.85711 3.85625
master-2018-06-10-b19c840-competition-default_default unsat ✅ 0.97351 0.97359
MathSAT mathsat-20190601-wrapped-sq_default unsat ✅ 0.19987 0.19985
mathsat-na-20190601-wrapped-sq_default unsat ✅ 0.20101 0.20027
Par4 Par4-wrapped-sq_default unsat ✅ 0.08568 0.00630
ProB ProB-wrapped-sq_default unknown ❌ 2400.04000 2399.95000
SMT-RAT SMTRAT-5-wrapped-sq_default unsat ✅ 2.14470 2.14445
Yices2 Yices 2.6.2-wrapped-sq_default unsat ✅ 0.08390 0.08392
Z3 z3-4.8.4-d6df51951f4c-wrapped-sq_default unsat ✅ 0.35529 0.35533
SMT-COMP 2020 0.14 (6/7) AProVE AProVE NIA 2014_default unknown ❌ 1200.03000 1230.18000
CVC4 CVC4-sq-final_default unsat ✅ 2.05290 2.05266
MathSAT MathSAT5_default.sh unsat ✅ 0.18487 0.18482
Par4 Par4-wrapped-sq_default unsat ✅ 0.10164 0.01207
SMT-RAT smtrat-SMTCOMP_default unsat ✅ 1.97823 1.97806
Yices2 Yices 2.6.2 bug fix_default unsat ✅ 0.10418 0.10414
Z3 z3-4.8.8_default unsat ✅ 0.29054 0.29051
SMT-COMP 2021 0.17 (5/6) AProVE AProVE NIA 2014_2021 unknown ❌ 1200.09000 1236.08000
cvc5 cvc5-fixed_default unsat ✅ 1.39058 1.39108
MathSAT mathsat-5.6.6_default unsat ✅ 0.18003 0.17997
Par4 Par4-wrapped-sq_default unsat ✅ 0.07676 0.00642
SMT-RAT smtrat-SMTCOMP_default unsat ✅ 2.54654 2.54625
smtrat-SMTCOMP_default unsat ✅ 1.97577 1.97578
Z3 z3-4.8.11_default unsat ✅ 0.28924 0.28931
SMT-COMP 2022 cvc5 cvc5-default-2022-07-02-b15e116-wrapped_sq unsat ✅ 1.67700 1.67745
MathSAT MathSAT-5.6.8_default unsat ✅ 0.19534 0.19529
Par4 Par4-wrapped-sq_default unsat ✅ 0.08434 0.00726
Yices2 Yices 2.6.2 for SMTCOMP 2021_default unsat ✅ 0.10169 0.10165
Yices-ismt yices-ismt-0721_default unsat ✅ 0.14519 0.14532
Z3 z3-4.8.17_default unsat ✅ 0.29842 0.30063
Z3++ z3++0715_default unsat ✅ 0.15171 0.15179
SMT-COMP 2023 cvc5 cvc5-default-2023-05-16-ea045f305_sq unsat ✅ 0.49074 0.49133
Yices2 Yices 2 for SMTCOMP 2023_default unsat ✅ 0.09195 0.09187
Yices-ismt yices-ismt-sq-0526_default unsat ✅ 0.14308 0.14329
Z3alpha z3alpha_default unsat ✅ 0.25738 0.25760
Z3++ z3++0715_default unsat ✅ 0.15142 0.15147
Z3++_sq_0526_default unsat ✅ 2.18433 2.18398