Benchmark

non-incremental/QF_NIA/UltimateLassoRanker/b.17-alloca_true-termination.c.i_Iteration2_Lasso+nonterminationTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of 
lasso-shaped programs. This script contains the SMT commands that Ultimate 
LassoRanker used while checking if a lasso-shaped program has a geometric 
nontermination argument [2].

This SMT script belongs to a set of SMT scripts that was generated by applying
Ultimate Buchi Automizer [3,4] to the benchmarks from SV-COMP 2015 [5,6],
which are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.

2015-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann:
Geometric Series as Nontermination Arguments for Linear Lasso Programs. 
CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
Checking for People Who Love Automata. CAV 2013: 36-52
[5] http://sv-comp.sosy-lab.org/2015/
[6] Dirk Beyer: Software Verification and Verifiable Witnesses - (Report on 
SV-COMP 2015). TACAS 2015: 401-416
[7] https://svn.sosy-lab.org/software/sv-benchmarks/tags/svcomp15/
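
The statistics below reflect the shape of such a generated script: one very
large assertion over 144 declared integer constants, followed by a single
check-sat call. As a purely illustrative sketch (the constant names and the
constraint are invented here, not taken from the benchmark), a QF_NIA script
of this kind looks roughly as follows:

    (set-logic QF_NIA)
    ; unknowns of the candidate nontermination argument (illustrative names;
    ; the actual script declares 144 such constants)
    (declare-const lam Int)
    (declare-const x0 Int)
    (declare-const x1 Int)
    (declare-const y0 Int)
    ; a single assertion; in the actual script this is one very large formula
    ; built from and, or, =, +, -, * and >=
    (assert (and (>= lam 0)
                 (= x1 (+ x0 (* lam y0)))
                 (>= (* y0 y0) 1)))
    (check-sat)
    (exit)

Under this reading (see [2]), a sat answer would provide concrete values for
the unknowns of the nontermination argument, while the unsat status recorded
for Query 1 below means the generated constraint system has no solution.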
Benchmark
Size: 28370464 bytes
Compressed Size: 94679 bytes
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2015-07-02
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 28370455 bytes
Compressed Size: 94665 bytes
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 144
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols (number of occurrences)

or: 2
and: 145
=: 2952
+: 44448
-: 7104
*: 589488
>=: 7057
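
The heavy use of * relative to the Boolean connectives is what places the
single asserted formula in nonlinear integer arithmetic. A tiny
self-contained example (invented, only to illustrate the kind of atom these
symbol counts refer to):

    (set-logic QF_NIA)
    (declare-const a Int)
    (declare-const b Int)
    (declare-const c Int)
    ; nonlinear because variables are multiplied with each other,
    ; not only with numeric constants
    (assert (>= (+ (* a b) (* c c)) (- a 1)))
    (check-sat)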

Evaluations

Evaluation Rating Solver Variant Result Wallclock (s) CPU Time (s)
SMT-COMP 2015 0.33 (4/6) AProVE AProVE NIA 2014 default unknown ❌ 2400.04000 2440.40000
CVC3 CVC3 default unsat ✅ 4.30017 4.30135
CVC4 CVC4-master-2015-06-15-9b32405-main default unsat ✅ 5.44075 5.44117
CVC4-experimental-2015-06-15-ff5745a-main default unsat ✅ 5.32459 5.32319
raSAT raSAT default.sh unknown ❌ 669.91800 670.23000
SMT-RAT SMT-RAT-final default unsat ✅ 11.28550 11.28830
SMT-RAT-NIA-Parallel-final default unsat ✅ 11.41220 11.42730
Z3 z3 4.4.0 default unsat ✅ 1.01107 1.01085
SMT-COMP 2016 0.43 (4/7) AProVE AProVE NIA 2014 default unknown ❌ 2400.08000 2514.53000
CVC4 CVC4-master-2016-05-27-cfef263-main default unsat ✅ 5.43600 5.43969
ProB ProB competition unknown ❌ 2400.11000 2401.50000
raSAT raSAT 0.3 default.sh unknown ❌ 2400.03000 2401.39000
raSAT 0.4 exp - final default.py unknown ❌ 781.42100 1568.55000
SMT-RAT SMT-RAT default unsat ✅ 11.36560 11.37290
Yices2 Yices-2.4.2 default unsat ✅ 0.47916 0.48004
Z3 z3-4.4.1 default unsat ✅ 0.95461 0.95648
SMT-COMP 2017 0.20 (4/5) AProVE AProVE NIA 2014 default unknown ❌ 600.08300 624.88000
CVC4 CVC4-smtcomp2017-main default unsat ✅ 5.55429 5.55352
SMT-RAT SMTRAT-comp2017_2 default unsat ✅ 14.75360 14.74820
Yices2 Yices2-Main default unsat ✅ 0.44192 0.44181
Z3 z3-4.5.0 default unsat ✅ 0.95561 0.95488
SMT-COMP 2018 0.20 (4/5) AProVE AProVE NIA 2014_default unknown ❌ 1200.02000 1230.68000
CVC4 master-2018-06-10-b19c840-competition-default_default unsat ✅ 5.77382 5.77423
SMT-RAT SMTRAT-Rat-final_default unsat ✅ 16.66320 16.66120
Yices2 Yices 2.6.0_default unsat ✅ 0.35263 0.35256
Z3 z3-4.7.1_default unsat ✅ 1.06602 1.06583
SMT-COMP 2020 0.14 (6/7) AProVE AProVE NIA 2014_default unknown ❌ 1200.04000 1235.30000
CVC4 CVC4-sq-final_default unsat ✅ 12.39120 12.39070
MathSAT MathSAT5_default.sh unsat ✅ 1.01322 1.01315
Par4 Par4-wrapped-sq_default unsat ✅ 0.36149 0.00585
SMT-RAT smtrat-SMTCOMP_default unsat ✅ 19.37580 19.37440
Yices2 Yices 2.6.2 bug fix_default unsat ✅ 0.51486 0.51478
Z3 z3-4.8.8_default unsat ✅ 0.85446 0.85293
SMT-COMP 2021 0.17 (5/6) AProVE AProVE NIA 2014_2021 unknown ❌ 1200.10000 1222.80000
cvc5 cvc5-fixed_default unsat ✅ 8.26375 8.26429
MathSAT mathsat-5.6.6_default unsat ✅ 0.96063 0.96058
Par4 Par4-wrapped-sq_default unsat ✅ 0.37539 0.00912
SMT-RAT smtrat-SMTCOMP_default unsat ✅ 20.97190 20.97110
smtrat-SMTCOMP_default unsat ✅ 19.38710 19.38660
Z3 z3-4.8.11_default unsat ✅ 0.84600 0.84598
SMT-COMP 2023 cvc5 cvc5-default-2023-05-16-ea045f305_sq unsat ✅ 2.80487 2.80398
Yices2 Yices 2 for SMTCOMP 2023_default unsat ✅ 0.47333 0.47325
Yices-ismt yices-ismt-sq-0526_default unsat ✅ 0.75090 0.75098
Z3alpha z3alpha_default unsat ✅ 0.88135 0.88165
Z3++ z3++0715_default unsat ✅ 0.78722 0.78728
Z3++_sq_0526_default unsat ✅ 11.99460 11.99380
SMT-COMP 2024 cvc5 cvc5 unsat ✅ 2.21583 2.11486
SMTInterpol SMTInterpol unsat ✅ 9.13122 16.64737
Yices2 Yices2 unsat ✅ 0.64316 0.54324
Z3alpha Z3-alpha unsat ✅ 0.91872 0.81562
SMT-COMP 2025 cvc5 cvc5 unsat ✅ 1.73142 1.61195
SMTInterpol SMTInterpol unsat ✅ 6.95117 13.48939
Yices2 Yices2 unsat ✅ 0.96221 0.83932
Z3alpha Z3-alpha unsat ✅ 1.22111 1.36045
Z3 Z3-alpha-base unsat ✅ 0.72061 0.60250
z3siri-base unsat ✅ 0.73316 0.61663