Benchmark

non-incremental/QF_NIA/UltimateLassoRanker/GulwaniJainKoskinen-2009PLDI-Fig1-alloca_true-termination.c.i_Iteration3_Lasso+nonterminationTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of
lasso-shaped programs. This script contains the SMT commands that Ultimate
LassoRanker used while checking whether a lasso-shaped program has a geometric
nontermination argument [2].
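
To make the concept concrete, here is a minimal sketch (not LassoRanker's implementation, and the function name `check_geometric_witness` is hypothetical) of a lasso-shaped program and a geometric nontermination argument in the spirit of [2]: an infinite run whose successive step vectors form a geometric series, so every loop iteration stays inside the guard.

```python
# A lasso-shaped program: a finite "stem" followed by a loop body that is
# taken forever.  Example:  x := 1;  while (x >= 1) { x := 2*x; }
#
# A geometric nontermination argument (in the sense of [2]) gives an initial
# state x0, a step y, and a ratio lam such that the run
#   x_t = x0 + sum_{i < t} lam^i * y
# follows the loop body and never leaves the guard.

def check_geometric_witness(x0, y, lam, guard, update, steps=50):
    """Check, for finitely many steps, that the geometric closed form
    is a run of the loop: every visited state satisfies `guard` and
    each transition agrees with the loop body `update`."""
    x, step = x0, y
    for _ in range(steps):
        if not guard(x):
            return False
        nxt = x + step
        if update(x) != nxt:        # the witness must follow the loop body
            return False
        x, step = nxt, lam * step   # step vectors form a geometric series
    return True

# Witness for  x := 1;  while x >= 1:  x = 2*x
# (x_t = 2^t, so the guard x >= 1 holds forever).
ok = check_geometric_witness(
    x0=1, y=1, lam=2,
    guard=lambda x: x >= 1,
    update=lambda x: 2 * x,
)
print(ok)
```

A bounded check like this only samples the first few iterations; LassoRanker instead encodes the existence of such a witness as the satisfiability query contained in this script.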

This SMT script belongs to a set of SMT scripts generated by applying
Ultimate Buchi Automizer [3,4] to the SV-COMP 2015 benchmarks [5,6],
which are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.

2015-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann:
Geometric Series as Nontermination Arguments for Linear Lasso Programs. 
CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model 
Checking for People Who Love Automata. CAV 2013:36-52
[5] http://sv-comp.sosy-lab.org/2015/
[6] Dirk Beyer: Software Verification and Verifiable Witnesses - (Report on 
SV-COMP 2015). TACAS 2015: 401-416
[7] https://svn.sosy-lab.org/software/sv-benchmarks/tags/svcomp15/
Benchmark

Size: 6544894
Compressed Size: 29214
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2015-07-02
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1

Status: unsat
Inferred Status: unsat
Size: 6544885
Compressed Size: 29201
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 171
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols (occurrence counts)

or: 2
and: 49
=: 1872
+: 26448
-: 1950
*: 143304
>=: 1849

Evaluations

Each row lists: solver | variant | result | wallclock time (s) | CPU time (s).

SMT-COMP 2015, rating 0.33 (4/6)
  AProVE     | AProVE NIA 2014 default                    | unknown ❌ | 2400.02000 | 2497.89000
  CVC3       | CVC3 default                               | unsat ✅   | 1.25169    | 1.25181
  CVC4       | CVC4-master-2015-06-15-9b32405-main default | unsat ✅  | 1.39429    | 1.39279
  CVC4       | CVC4-experimental-2015-06-15-ff5745a-main default | unsat ✅ | 1.37000 | 1.36779
  raSAT      | raSAT default.sh                           | unknown ❌ | 667.61700  | 667.76000
  SMT-RAT    | SMT-RAT-final default                      | unsat ✅   | 3.12234    | 3.12252
  SMT-RAT    | SMT-RAT-NIA-Parallel-final default         | unsat ✅   | 3.18079    | 3.19251
  Z3         | z3 4.4.0 default                           | unsat ✅   | 0.48006    | 0.47993

SMT-COMP 2016, rating 0.57 (3/7)
  AProVE     | AProVE NIA 2014 default                    | unknown ❌ | 2400.10000 | 2553.81000
  CVC4       | CVC4-master-2016-05-27-cfef263-main default | unsat ✅  | 1.40424    | 1.40521
  ProB       | ProB competition                           | unknown ❌ | 2400.11000 | 2401.53000
  raSAT      | raSAT 0.3 default.sh                       | unknown ❌ | 2400.02000 | 2401.37000
  raSAT      | raSAT 0.4 exp - final default.py           | unknown ❌ | 715.68700  | 1436.15000
  SMT-RAT    | SMT-RAT default                            | unknown ❌ | 2.68685    | 2.68828
  Yices2     | Yices-2.4.2 default                        | unsat ✅   | 0.12862    | 0.12865
  Z3         | z3-4.4.1 default                           | unsat ✅   | 0.52918    | 0.53074

SMT-COMP 2017, rating 0.20 (4/5)
  AProVE     | AProVE NIA 2014 default                    | unknown ❌ | 600.05700  | 631.45000
  CVC4       | CVC4-smtcomp2017-main default              | unsat ✅   | 1.40879    | 1.40864
  SMT-RAT    | SMTRAT-comp2017_2 default                  | unsat ✅   | 3.75356    | 3.75336
  Yices2     | Yices2-Main default                        | unsat ✅   | 0.12236    | 0.12176
  Z3         | z3-4.5.0 default                           | unsat ✅   | 0.48875    | 0.48832

SMT-COMP 2018, rating 0.20 (4/5)
  AProVE     | AProVE NIA 2014_default                    | unknown ❌ | 1200.08000 | 1238.09000
  CVC4       | master-2018-06-10-b19c840-competition-default_default | unsat ✅ | 1.47885 | 1.47892
  SMT-RAT    | SMTRAT-Rat-final_default                   | unsat ✅   | 4.09846    | 4.09829
  Yices2     | Yices 2.6.0_default                        | unsat ✅   | 0.10478    | 0.10471
  Z3         | z3-4.7.1_default                           | unsat ✅   | 0.40318    | 0.40319

SMT-COMP 2020, rating 0.14 (6/7)
  AProVE     | AProVE NIA 2014_default                    | unknown ❌ | 1200.02000 | 1242.07000
  CVC4       | CVC4-sq-final_default                      | unsat ✅   | 3.13036    | 3.13046
  MathSAT    | MathSAT5_default.sh                        | unsat ✅   | 0.28823    | 0.28819
  Par4       | Par4-wrapped-sq_default                    | unsat ✅   | 0.12124    | 0.00703
  SMT-RAT    | smtrat-SMTCOMP_default                     | unsat ✅   | 4.07552    | 4.07008
  Yices2     | Yices 2.6.2 bug fix_default                | unsat ✅   | 0.14812    | 0.14806
  Z3         | z3-4.8.8_default                           | unsat ✅   | 0.32376    | 0.32369

SMT-COMP 2021, rating 0.17 (5/6)
  AProVE     | AProVE NIA 2014_2021                       | unknown ❌ | 1200.08000 | 1254.49000
  cvc5       | cvc5-fixed_default                         | unsat ✅   | 2.07693    | 2.07739
  MathSAT    | mathsat-5.6.6_default                      | unsat ✅   | 0.27796    | 0.27794
  Par4       | Par4-wrapped-sq_default                    | unsat ✅   | 0.11134    | 0.00639
  SMT-RAT    | smtrat-SMTCOMP_default                     | unsat ✅   | 5.17219    | 5.17216
  SMT-RAT    | smtrat-SMTCOMP_default                     | unsat ✅   | 3.98612    | 3.98513
  Z3         | z3-4.8.11_default                          | unsat ✅   | 0.31237    | 0.31244

SMT-COMP 2022
  cvc5       | cvc5-default-2022-07-02-b15e116-wrapped_sq | unsat ✅   | 2.58517    | 2.58245
  MathSAT    | MathSAT-5.6.8_default                      | unsat ✅   | 0.29834    | 0.29827
  Par4       | Par4-wrapped-sq_default                    | unsat ✅   | 0.28021    | 0.01049
  Yices2     | Yices 2.6.2 for SMTCOMP 2021_default       | unsat ✅   | 0.14858    | 0.14850
  Yices-ismt | yices-ismt-0721_default                    | unsat ✅   | 0.21369    | 0.21454
  Z3         | z3-4.8.17_default                          | unsat ✅   | 0.31945    | 0.32124
  Z3++       | z3++0715_default                           | unsat ✅   | 0.22928    | 0.22933

SMT-COMP 2023
  cvc5       | cvc5-default-2023-05-16-ea045f305_sq       | unsat ✅   | 0.72626    | 0.72515
  Yices2     | Yices 2 for SMTCOMP 2023_default           | unsat ✅   | 0.13533    | 0.13524
  Yices-ismt | yices-ismt-sq-0526_default                 | unsat ✅   | 0.21026    | 0.21044
  Z3alpha    | z3alpha_default                            | unsat ✅   | 0.30554    | 0.30575
  Z3++       | z3++0715_default                           | unsat ✅   | 0.23069    | 0.23074
  Z3++       | Z3++_sq_0526_default                       | unsat ✅   | 2.75062    | 2.74993

SMT-COMP 2024
  cvc5        | cvc5                                      | unsat ✅   | 0.70430    | 0.60457
  SMTInterpol | SMTInterpol                               | unsat ✅   | 3.19437    | 7.02566
  Yices2      | Yices2                                    | unsat ✅   | 0.31721    | 0.21736
  Z3alpha     | Z3-alpha                                  | unsat ✅   | 0.49094    | 0.39125

SMT-COMP 2025
  cvc5        | cvc5                                      | unsat ✅   | 0.64157    | 0.52432
  SMTInterpol | SMTInterpol                               | unsat ✅   | 2.67567    | 6.30545
  Yices2      | Yices2                                    | unsat ✅   | 0.37673    | 0.25511
  Z3alpha     | Z3-alpha                                  | unsat ✅   | 0.65499    | 0.64468
  Z3          | Z3-alpha-base                             | unsat ✅   | 0.47757    | 0.34877
  Z3          | z3siri-base                               | unsat ✅   | 0.46820    | 0.34398