Benchmark

non-incremental/QF_NIA/UltimateLassoRanker/LarrazOliverasRodriguez-CarbonellRubio-2013FMCAD-Fig1-alloca_unknown-termination.c.i_Iteration6_Lasso+nonterminationTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of 
lasso-shaped programs. This script contains the SMT commands that Ultimate 
LassoRanker used while checking whether a lasso-shaped program has a geometric 
nontermination argument [2].
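As background on what such a script encodes: a geometric nontermination argument [2] certifies an infinite run of the loop whose states form a geometric series. The following pure-Python sketch illustrates the idea on an invented one-variable loop; it is not an excerpt of this benchmark, and LassoRanker actually encodes the search for such an argument as the QF_NIA constraints in this script rather than checking one by simulation.

```python
# Sketch of a geometric nontermination argument, checked by simulation.
# Example loop (invented for illustration): while (x >= 0): x = 2*x + 1
# Candidate argument: start x0 = 0, ray y = 1, ratio lam = 2, giving the
# iterates x_t = 2^t - 1, which never leave the guard x >= 0.

def gnta_iterates(x0, y, lam, n):
    """First n states of the geometric series x0, x0+y, x0+y+lam*y, ..."""
    states, x, step = [], x0, y
    for _ in range(n):
        states.append(x)
        x += step
        step *= lam
    return states

def certifies(x0, y, lam, guard, update, n=50):
    """Check (on the first n steps) that the series is a run of the loop:
    every state satisfies the guard, and consecutive states follow the update."""
    xs = gnta_iterates(x0, y, lam, n)
    return all(guard(x) for x in xs) and all(
        update(a) == b for a, b in zip(xs, xs[1:]))

print(certifies(0, 1, 2, guard=lambda x: x >= 0, update=lambda x: 2 * x + 1))
# prints True: 0, 1, 3, 7, ... is an infinite run, so the loop never terminates
```

The unknowns of the real encoding (start state, ray, ratio) are multiplied with each other in the constraints, which is the source of the nonlinear arithmetic that places this benchmark in QF_NIA.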

This SMT script belongs to a set of SMT scripts that were generated by applying
Ultimate Buchi Automizer [3,4] to the benchmarks of SV-COMP 2015 [5,6],
which are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.

2015-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann:
Geometric Series as Nontermination Arguments for Linear Lasso Programs. 
CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model 
Checking for People Who Love Automata. CAV 2013: 36-52
[5] http://sv-comp.sosy-lab.org/2015/
[6] Dirk Beyer: Software Verification and Verifiable Witnesses - (Report on 
SV-COMP 2015). TACAS 2015: 401-416
[7] https://svn.sosy-lab.org/software/sv-benchmarks/tags/svcomp15/
Benchmark
Size: 3965887
Compressed Size: 12856
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2015-07-02
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 3965878
Compressed Size: 12842
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 138
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

or: 2   and: 31   =: 558   +: 7644   -: 1158   *: 97068   >=: 1153

Evaluations

Solver       Variant                                                 Result      Wallclock   CPU Time

SMT-COMP 2015 (rating 0.50, 3/6)
  AProVE       AProVE NIA 2014 default                               unknown ❌  2400.02000  2430.09000
  CVC3         CVC3 default                                          unsat ✅    0.87385     0.87387
  CVC4         CVC4-master-2015-06-15-9b32405-main default           unknown ❌  0.90919     0.90786
  CVC4         CVC4-experimental-2015-06-15-ff5745a-main default     unknown ❌  2400.01000  2400.86000
  raSAT        raSAT default.sh                                      unknown ❌  592.51800   592.52000
  SMT-RAT      SMT-RAT-final default                                 unsat ✅    4.08241     4.08238
  SMT-RAT      SMT-RAT-NIA-Parallel-final default                    unsat ✅    5.36616     8.77467
  Z3           z3 4.4.0 default                                      unsat ✅    1.81759     1.81872

SMT-COMP 2016 (rating 0.57, 3/7)
  AProVE       AProVE NIA 2014 default                               unknown ❌  2400.08000  2580.77000
  CVC4         CVC4-master-2016-05-27-cfef263-main default           unknown ❌  2400.10000  2393.58000
  ProB         ProB competition                                      unknown ❌  2400.11000  2401.32000
  raSAT        raSAT 0.3 default.sh                                  unknown ❌  2400.02000  2401.43000
  raSAT        raSAT 0.4 exp - final default.py                      unknown ❌  710.15800   1424.82000
  SMT-RAT      SMT-RAT default                                       unsat ✅    1.55401     1.55486
  Yices2       Yices-2.4.2 default                                   unsat ✅    0.08964     0.08965
  Z3           z3-4.4.1 default                                      unsat ✅    1.89358     1.89581

SMT-COMP 2017 (rating 0.20, 4/5)
  AProVE       AProVE NIA 2014 default                               unknown ❌  600.08000   622.46000
  CVC4         CVC4-smtcomp2017-main default                         unsat ✅    0.93932     0.93883
  SMT-RAT      SMTRAT-comp2017_2 default                             unsat ✅    2.29333     2.29271
  Yices2       Yices2-Main default                                   unsat ✅    0.08594     0.08439
  Z3           z3-4.5.0 default                                      unsat ✅    1.81277     1.81062

SMT-COMP 2018 (rating 0.20, 4/5)
  AProVE       AProVE NIA 2014_default                               unknown ❌  1200.03000  1235.21000
  CVC4         master-2018-06-10-b19c840-competition-default_default unsat ✅    0.98384     0.98393
  SMT-RAT      SMTRAT-Rat-final_default                              unsat ✅    2.51685     2.51683
  Yices2       Yices 2.6.0_default                                   unsat ✅    0.06957     0.06953
  Z3           z3-4.7.1_default                                      unsat ✅    1.63154     3.17134

SMT-COMP 2019 (rating 0.25, 6/8)
  AProVE       AProVE NIA 2014-wrapped-sq_default                    unknown ❌  2400.02000  2454.39000
  CVC4         CVC4-2019-06-03-d350fe1-wrapped-sq_default            unsat ✅    1.15690     1.15728
  CVC4         CVC4-SymBreak_03_06_2019-wrapped-sq_default           unsat ✅    3.71592     3.71753
  CVC4         master-2018-06-10-b19c840-competition-default_default unsat ✅    1.00546     1.00577
  MathSAT      mathsat-20190601-wrapped-sq_default                   unsat ✅    0.22286     0.22283
  MathSAT      mathsat-na-20190601-wrapped-sq_default                unsat ✅    0.22217     0.22217
  Par4         Par4-wrapped-sq_default                               unsat ✅    0.08130     0.00576
  ProB         ProB-wrapped-sq_default                               unknown ❌  16.28740    16.22520
  SMT-RAT      SMTRAT-5-wrapped-sq_default                           unsat ✅    2.56734     2.56716
  Yices2       Yices 2.6.2-wrapped-sq_default                        unsat ✅    0.08161     0.08161
  Z3           z3-4.8.4-d6df51951f4c-wrapped-sq_default              unsat ✅    1.14971     1.14972

SMT-COMP 2021 (rating 0.17, 5/6)
  AProVE       AProVE NIA 2014_2021                                  unknown ❌  1200.10000  1219.99000
  cvc5         cvc5-fixed_default                                    unsat ✅    1.41427     1.41483
  MathSAT      mathsat-5.6.6_default                                 unsat ✅    0.20310     0.20304
  Par4         Par4-wrapped-sq_default                               unsat ✅    0.08333     0.00724
  SMT-RAT      smtrat-SMTCOMP_default                                unsat ✅    3.03550     3.03475
  SMT-RAT      smtrat-SMTCOMP_default                                unsat ✅    2.36063     2.36057
  Z3           z3-4.8.11_default                                     unsat ✅    1.24709     1.24719

SMT-COMP 2024 (rating 0.25, 3/4)
  cvc5         cvc5                                                  unsat ✅    0.52364     0.42424
  SMTInterpol  SMTInterpol                                           unknown ❌  1.63078     3.32832
  Yices2       Yices2                                                unsat ✅    0.28539     0.18576
  Z3alpha      Z3-alpha                                              unsat ✅    1.16473     1.06496