Benchmark

non-incremental/QF_NIA/UltimateLassoRanker/Velroyen-alloca_false-termination.c.i_Iteration2_Lasso+nonterminationTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of
lasso-shaped programs. This script contains the SMT commands that Ultimate
LassoRanker issued while checking whether a lasso-shaped program has a geometric
nontermination argument [2].

This SMT script belongs to a set of SMT scripts that was generated by applying
Ultimate Buchi Automizer [3,4] to the benchmarks from SV-COMP 2015 [5,6],
which are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.
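Conceptually, such a script declares integer unknowns for a candidate nontermination argument and asserts, in a single nonlinear constraint, that the candidate certifies an infinite execution of the lasso. The following is a minimal, hypothetical QF_NIA sketch of that overall shape; the constant names and the constraint itself are invented for illustration and are not taken from the benchmark:

```smt2
; Hypothetical sketch of the script's shape, not the actual benchmark.
(set-logic QF_NIA)
; Unknowns of a candidate geometric nontermination argument:
; an initial state x0, a step direction y0, and a ratio lambda.
(declare-fun x0 () Int)
(declare-fun y0 () Int)
(declare-fun lambda () Int)
; Single assertion: the candidate satisfies an (invented) loop guard
; x >= 1 and evolves geometrically with ratio lambda.
(assert (and (>= x0 1)
             (>= lambda 1)
             (>= (+ x0 y0) 1)
             (= y0 (* lambda x0))))
(check-sat)
(exit)
```

A sat answer to such a query means the solver found concrete values for the unknowns, i.e. a witness that the lasso-shaped program does not terminate.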

2015-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann:
Geometric Series as Nontermination Arguments for Linear Lasso Programs. 
CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model 
Checking for People Who Love Automata. CAV 2013:36-52
[5] http://sv-comp.sosy-lab.org/2015/
[6] Dirk Beyer: Software Verification and Verifiable Witnesses - (Report on 
SV-COMP 2015). TACAS 2015: 401-416
[7] https://svn.sosy-lab.org/software/sv-benchmarks/tags/svcomp15/
Benchmark
Size:             49067
Compressed Size:  2540
License:          Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category:         industrial
First Occurrence: 2015-07-02
Generated By:
Generated On:
Generator:
Dolmen:           OK
strict Dolmen:    OK
check-sat calls:  1

Query 1
Status:                       sat
Inferred Status:              sat
Size:                         49058
Compressed Size:              2553
Max. Term Depth:              7
Asserts:                      1
Declared Functions:           0
Declared Constants:           50
Declared Sorts:               0
Defined Functions:            0
Defined Recursive Functions:  0
Defined Sorts:                0
Constants:                    0
Declared Datatypes:           0

Symbols

or: 1    and: 4    =: 37    +: 158    -: 31    *: 1058    >=: 39

Evaluations

Evaluation     Solver    Variant                                                Result      Wallclock (s)  CPU Time (s)
SMT-COMP 2015  AProVE    AProVE NIA 2014 default                                sat ✅      0.72639        1.38579
SMT-COMP 2015  CVC3      CVC3 default                                           sat ✅      0.03155        0.03099
SMT-COMP 2015  CVC4      CVC4-master-2015-06-15-9b32405-main default            unknown ❌  0.02382        0.02200
SMT-COMP 2015  CVC4      CVC4-experimental-2015-06-15-ff5745a-main default      sat ✅      0.14302        0.13798
SMT-COMP 2015  raSAT     raSAT default.sh                                       sat ✅      0.35983        0.35695
SMT-COMP 2015  SMT-RAT   SMT-RAT-final default                                  sat ✅      0.03142        0.03000
SMT-COMP 2015  SMT-RAT   SMT-RAT-NIA-Parallel-final default                     sat ✅      0.03296        0.03199
SMT-COMP 2015  Z3        z3 4.4.0 default                                       sat ✅      0.03471        0.03499
SMT-COMP 2016  AProVE    AProVE NIA 2014 default                                sat ✅      0.57043        1.29451
SMT-COMP 2016  CVC4      CVC4-master-2016-05-27-cfef263-main default            sat ✅      0.15785        0.15831
SMT-COMP 2016  ProB      ProB competition                                       sat ✅      1.05364        1.05445
SMT-COMP 2016  raSAT     raSAT 0.3 default.sh                                   sat ✅      3.70624        3.70847
SMT-COMP 2016  raSAT     raSAT 0.4 exp - final default.py                       sat ✅      0.34375        0.31645
SMT-COMP 2016  SMT-RAT   SMT-RAT default                                        sat ✅      0.03268        0.03266
SMT-COMP 2016  Yices2    Yices-2.4.2 default                                    sat ✅      0.01270        0.00544
SMT-COMP 2016  Z3        z3-4.4.1 default                                       sat ✅      0.03315        0.03438
SMT-COMP 2017  AProVE    AProVE NIA 2014 default                                sat ✅      0.62253        1.34820
SMT-COMP 2017  CVC4      CVC4-smtcomp2017-main default                          sat ✅      0.03134        0.02553
SMT-COMP 2017  SMT-RAT   SMTRAT-comp2017_2 default                              sat ✅      0.04773        0.04769
SMT-COMP 2017  Yices2    Yices2-Main default                                    sat ✅      0.00951        0.00674
SMT-COMP 2017  Z3        z3-4.5.0 default                                       sat ✅      0.04913        0.03708
SMT-COMP 2018  AProVE    AProVE NIA 2014_default                                sat ✅      0.58790        1.29134
SMT-COMP 2018  CVC4      master-2018-06-10-b19c840-competition-default_default  sat ✅      0.02684        0.02700
SMT-COMP 2018  SMT-RAT   SMTRAT-Rat-final_default                               sat ✅      0.04942        0.04938
SMT-COMP 2018  Yices2    Yices 2.6.0_default                                    sat ✅      0.00796        0.00708
SMT-COMP 2018  Z3        z3-4.7.1_default                                       sat ✅      0.03881        0.03874