Benchmark

non-incremental/QF_NIA/LassoRanker/Stockholm_true-termination.c_Iteration1_Lasso+nonterminationTemplate_0.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of
lasso-shaped programs. This script contains the SMT commands that Ultimate
LassoRanker used while checking whether a lasso-shaped program has a
geometric nontermination argument. (See [2] for a preliminary definition of
geometric nontermination arguments.)
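As a rough sketch (simplified here to a single ray; see [2] for the general definition), a geometric nontermination argument certifies nontermination by exhibiting an infinite execution whose states form a geometric series:

```latex
% Simplified single-ray form of a geometric nontermination argument;
% the general definition in [2] allows several rays.
% Given a loop with transition relation $T$, the argument consists of an
% initial state $x_0$, a direction $y$, and a ratio $\lambda \ge 0$ such
% that the sequence
\[
  x_t = x_0 + \sum_{i=0}^{t-1} \lambda^i \, y , \qquad t \ge 0,
\]
% is an infinite execution of the loop, i.e.
\[
  (x_t,\, x_{t+1}) \in T \quad \text{for all } t \ge 0 .
\]
% The states then form a geometric series, witnessing nontermination.
```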

This SMT script belongs to a set of SMT scripts that was generated by applying
Ultimate Buchi Automizer [3,4] to benchmarks from SV-COMP 2016 [5,6], which
are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether each
lasso-shaped program terminates.

2016-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann: Geometric Series as Nontermination
Arguments for Linear Lasso Programs. CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
Checking for People Who Love Automata. CAV 2013:36-52
[5] http://sv-comp.sosy-lab.org/2016/
[6] Dirk Beyer: Reliable and Reproducible Competition Results with BenchExec
and Witnesses (Report on SV-COMP 2016). TACAS 2016: 887-904
[7] https://github.com/dbeyer/sv-benchmarks
Benchmark
Size: 7895
Compressed Size: 1477
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 7887
Compressed Size: 1471
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 20
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

or: 2, and: 4, =: 12, +: 47, -: 19, *: 115, >=: 17

Evaluations

Solver | Variant | Result | Wallclock (s) | CPU Time (s)

SMT-COMP 2016 (rating 0.43, 4/7):
AProVE | AProVE NIA 2014 default | unknown ❌ | 2400.04000 | 2417.10000
CVC4 | CVC4-master-2016-05-27-cfef263-main default | unsat ✅ | 0.01723 | 0.01747
ProB | ProB competition | unknown ❌ | 2400.04000 | 2401.64000
raSAT | raSAT 0.3 default.sh | unknown ❌ | 2400.04000 | 2401.63000
raSAT | raSAT 0.4 exp - final default.py | unknown ❌ | 2400.12000 | 4810.87000
SMT-RAT | SMT-RAT default | unsat ✅ | 0.15276 | 0.15281
Yices2 | Yices-2.4.2 default | unsat ✅ | 0.01291 | 0.00507
Z3 | z3-4.4.1 default | unsat ✅ | 0.92461 | 0.92651

SMT-COMP 2017 (rating 0.20, 4/5):
AProVE | AProVE NIA 2014 default | unknown ❌ | 600.01800 | 606.93000
CVC4 | CVC4-smtcomp2017-main default | unsat ✅ | 0.02090 | 0.01875
SMT-RAT | SMTRAT-comp2017_2 default | unsat ✅ | 0.22823 | 0.22812
Yices2 | Yices2-Main default | unsat ✅ | 0.01032 | 0.00611
Z3 | z3-4.5.0 default | unsat ✅ | 2.12332 | 2.12187

SMT-COMP 2018 (rating 0.20, 4/5):
AProVE | AProVE NIA 2014_default | unknown ❌ | 1200.02000 | 1210.98000
CVC4 | master-2018-06-10-b19c840-competition-default_default | unsat ✅ | 0.01693 | 0.01714
SMT-RAT | SMTRAT-Rat-final_default | unsat ✅ | 0.15854 | 0.15841
Yices2 | Yices 2.6.0_default | unsat ✅ | 0.01018 | 0.00777
Z3 | z3-4.7.1_default | unsat ✅ | 0.64830 | 0.64608

SMT-COMP 2019 (rating 0.25, 6/8):
AProVE | AProVE NIA 2014-wrapped-sq_default | unknown ❌ | 2400.03000 | 2416.98000
CVC4 | CVC4-2019-06-03-d350fe1-wrapped-sq_default | unsat ✅ | 0.02338 | 0.02371
CVC4 | CVC4-SymBreak_03_06_2019-wrapped-sq_default | unsat ✅ | 0.10393 | 0.10576
CVC4 | master-2018-06-10-b19c840-competition-default_default | unsat ✅ | 0.02048 | 0.02075
MathSAT | mathsat-20190601-wrapped-sq_default | unsat ✅ | 0.01966 | 0.01965
MathSAT | mathsat-na-20190601-wrapped-sq_default | unsat ✅ | 0.01956 | 0.01960
Par4 | Par4-wrapped-sq_default | unsat ✅ | 0.01890 | 0.00566
ProB | ProB-wrapped-sq_default | unknown ❌ | 2400.02000 | 2400.01000
SMT-RAT | SMTRAT-5-wrapped-sq_default | unsat ✅ | 0.07886 | 0.07885
Yices2 | Yices 2.6.2-wrapped-sq_default | unsat ✅ | 0.00952 | 0.00827
Z3 | z3-4.8.4-d6df51951f4c-wrapped-sq_default | unsat ✅ | 0.94103 | 0.94102

SMT-COMP 2020 (rating 0.14, 6/7):
AProVE | AProVE NIA 2014_default | unknown ❌ | 1200.02000 | 1209.70000
CVC4 | CVC4-sq-final_default | unsat ✅ | 0.02294 | 0.02320
MathSAT | MathSAT5_default.sh | unsat ✅ | 0.01891 | 0.01886
Par4 | Par4-wrapped-sq_default | unsat ✅ | 0.02741 | 0.00600
SMT-RAT | smtrat-SMTCOMP_default | unsat ✅ | 0.18223 | 0.18216
Yices2 | Yices 2.6.2 bug fix_default | unsat ✅ | 0.01301 | 0.01293
Z3 | z3-4.8.8_default | unsat ✅ | 2.16301 | 2.16299

SMT-COMP 2021 (rating 0.17, 5/6):
AProVE | AProVE NIA 2014_2021 | unknown ❌ | 1200.02000 | 1211.09000
cvc5 | cvc5-fixed_default | unsat ✅ | 0.02587 | 0.02639
MathSAT | mathsat-5.6.6_default | unsat ✅ | 0.01949 | 0.01944
Par4 | Par4-wrapped-sq_default | unsat ✅ | 0.02191 | 0.00935
SMT-RAT | smtrat-SMTCOMP_default | unsat ✅ | 0.19825 | 0.19825
SMT-RAT | smtrat-SMTCOMP_default | unsat ✅ | 0.16789 | 0.16785
Z3 | z3-4.8.11_default | unsat ✅ | 4.66330 | 4.66276

SMT-COMP 2022:
cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | unsat ✅ | 0.05840 | 0.05892
MathSAT | MathSAT-5.6.8_default | unsat ✅ | 0.01829 | 0.01823
Par4 | Par4-wrapped-sq_default | unsat ✅ | 0.05463 | 0.00688
Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | unsat ✅ | 0.01524 | 0.01377
Yices-ismt | yices-ismt-0721_default | unsat ✅ | 0.01880 | 0.01899
Z3 | z3-4.8.17_default | unsat ✅ | 0.35502 | 0.35683
Z3++ | z3++0715_default | unsat ✅ | 0.02201 | 0.02207

SMT-COMP 2025:
cvc5 | cvc5 | unsat ✅ | 0.25644 | 0.14279
SMTInterpol | SMTInterpol | unsat ✅ | 0.53077 | 0.63576
Yices2 | Yices2 | unsat ✅ | 0.29267 | 0.16613
Z3alpha | Z3-alpha | unsat ✅ | 0.40416 | 0.28528
Z3 | Z3-alpha-base | unsat ✅ | 0.46436 | 0.33903
Z3 | z3siri-base | unsat ✅ | 0.46421 | 0.33552