Benchmark

non-incremental/QF_NIA/UltimateLassoRanker/Masse-alloca_true-termination.c.i_Iteration1_Lasso+nonterminationTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of
lasso-shaped programs. This script contains the SMT commands that Ultimate
LassoRanker issued while checking whether a lasso-shaped program has a geometric
nontermination argument [2].

This SMT script belongs to a set of SMT scripts that was generated by applying
Ultimate Buchi Automizer [3,4] to the benchmarks from SV-COMP 2015 [5,6],
which are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.
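The idea of a geometric nontermination argument [2] can be illustrated with a small sketch: one exhibits an infinite execution whose loop states follow a geometric series, x_t = x_0 + sum_{i<t} lambda^i * y. The concrete loop below (while x >= 0: x := 2*x + 1), the helper names, and the parameters x0, y, lambda are illustrative assumptions, not the program encoded in this particular benchmark.

```python
# Hedged sketch of a geometric nontermination argument for a lasso-shaped
# program. The loop "while x >= 0: x = 2*x + 1" is an assumed example, not
# the loop from this benchmark.

def loop_step(x):
    """One iteration of the hypothetical loop body."""
    return 2 * x + 1

def geometric_state(x0, y, lam, t):
    """Closed form x_t = x0 + sum_{i=0}^{t-1} lam**i * y."""
    return x0 + sum(lam**i * y for i in range(t))

# Candidate argument: initial state x0 = 0, offset y = 1, ratio lambda = 2,
# i.e. x_t = 2**t - 1.
x0, y, lam = 0, 1, 2

# Sanity-check the argument on a finite prefix: the closed form must
# reproduce the actual execution, and every state must satisfy the guard.
x = x0
for t in range(20):
    assert x == geometric_state(x0, y, lam, t)  # states follow the series
    assert x >= 0                               # loop guard x >= 0 holds
    x = loop_step(x)

print("geometric nontermination argument checked on a 20-step prefix")
```

Since every state reached by the loop satisfies the guard and the series never leaves the guard's region, the loop can run forever; a tool like LassoRanker searches for such x0, y, lambda symbolically via SMT constraints rather than by simulation.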

2015-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann:
Geometric Series as Nontermination Arguments for Linear Lasso Programs. 
CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model 
Checking for People Who Love Automata. CAV 2013:36-52
[5] http://sv-comp.sosy-lab.org/2015/
[6] Dirk Beyer: Software Verification and Verifiable Witnesses - (Report on 
SV-COMP 2015). TACAS 2015: 401-416
[7] https://svn.sosy-lab.org/software/sv-benchmarks/tags/svcomp15/
Benchmark
Size: 76051
Compressed Size: 2763
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2015-07-02
Generated By:
Generated On:
Generator:
Dolmen: OK
Strict Dolmen: OK
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 76042
Compressed Size: 2773
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 45
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

or: 2
and: 5
=: 54
+: 286
-: 55
*: 1598
>=: 57

Evaluations

Each row lists: Solver | Variant | Result | Wallclock (s) | CPU time (s)

SMT-COMP 2015, rating 0.33 (4/6)
  AProVE | AProVE NIA 2014 default | unknown ❌ | 2400.02000 | 2417.71000
  CVC3 | CVC3 default | unsat ✅ | 0.05976 | 0.05899
  CVC4 | CVC4-master-2015-06-15-9b32405-main default | unknown ❌ | 0.03423 | 0.03199
  CVC4 | CVC4-experimental-2015-06-15-ff5745a-main default | unknown ❌ | 2400.01000 | 2400.95000
  raSAT | raSAT default.sh | unsat ✅ | 1.60615 | 1.60376
  SMT-RAT | SMT-RAT-final default | unsat ✅ | 0.04551 | 0.04399
  SMT-RAT | SMT-RAT-NIA-Parallel-final default | unsat ✅ | 0.04856 | 0.04999
  Z3 | z3 4.4.0 default | unsat ✅ | 0.16788 | 0.16797

SMT-COMP 2016, rating 0.29 (5/7)
  AProVE | AProVE NIA 2014 default | unknown ❌ | 2400.03000 | 2433.18000
  CVC4 | CVC4-master-2016-05-27-cfef263-main default | unknown ❌ | 2400.02000 | 2401.47000
  ProB | ProB competition | unsat ✅ | 1.10194 | 1.10277
  raSAT | raSAT 0.3 default.sh | unsat ✅ | 1991.96000 | 1993.10000
  raSAT | raSAT 0.4 exp - final default.py | unsat ✅ | 1.31471 | 2.46000
  SMT-RAT | SMT-RAT default | unsat ✅ | 0.05307 | 0.05353
  Yices2 | Yices-2.4.2 default | unsat ✅ | 0.01376 | 0.00711
  Z3 | z3-4.4.1 default | unsat ✅ | 0.19453 | 0.19588

SMT-COMP 2017, rating 0.20 (4/5)
  AProVE | AProVE NIA 2014 default | unknown ❌ | 600.10400 | 617.01000
  CVC4 | CVC4-smtcomp2017-main default | unsat ✅ | 0.03799 | 0.03774
  SMT-RAT | SMTRAT-comp2017_2 default | unsat ✅ | 0.07390 | 0.07343
  Yices2 | Yices2-Main default | unsat ✅ | 0.00871 | 0.00702
  Z3 | z3-4.5.0 default | unsat ✅ | 0.19362 | 0.19250

SMT-COMP 2018, rating 0.20 (4/5)
  AProVE | AProVE NIA 2014_default | unknown ❌ | 1200.02000 | 1211.84000
  CVC4 | master-2018-06-10-b19c840-competition-default_default | unsat ✅ | 0.04104 | 0.04129
  SMT-RAT | SMTRAT-Rat-final_default | unsat ✅ | 0.08689 | 0.08682
  Yices2 | Yices 2.6.0_default | unsat ✅ | 0.01863 | 0.00980
  Z3 | z3-4.7.1_default | unsat ✅ | 0.21129 | 0.21126

SMT-COMP 2020, rating 0.14 (6/7)
  AProVE | AProVE NIA 2014_default | unknown ❌ | 1200.03000 | 1220.79000
  CVC4 | CVC4-sq-final_default | unsat ✅ | 0.06529 | 0.06563
  MathSAT | MathSAT5_default.sh | unsat ✅ | 0.03794 | 0.03787
  Par4 | Par4-wrapped-sq_default | unsat ✅ | 0.01989 | 0.00826
  SMT-RAT | smtrat-SMTCOMP_default | unsat ✅ | 0.08404 | 0.08397
  Yices2 | Yices 2.6.2 bug fix_default | unsat ✅ | 0.01540 | 0.01537
  Z3 | z3-4.8.8_default | unsat ✅ | 0.18715 | 0.18716

SMT-COMP 2021, rating 0.17 (5/6)
  AProVE | AProVE NIA 2014_2021 | unknown ❌ | 1200.11000 | 1219.54000
  cvc5 | cvc5-fixed_default | unsat ✅ | 0.05701 | 0.05742
  MathSAT | mathsat-5.6.6_default | unsat ✅ | 0.03764 | 0.03762
  Par4 | Par4-wrapped-sq_default | unsat ✅ | 0.01812 | 0.00643
  SMT-RAT | smtrat-SMTCOMP_default | unsat ✅ | 0.12311 | 0.12300
  SMT-RAT | smtrat-SMTCOMP_default | unsat ✅ | 0.08556 | 0.08552
  Z3 | z3-4.8.11_default | unsat ✅ | 0.15608 | 0.15616

SMT-COMP 2022 (no rating listed)
  cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | unsat ✅ | 0.14302 | 0.14207
  MathSAT | MathSAT-5.6.8_default | unsat ✅ | 0.03665 | 0.03662
  Par4 | Par4-wrapped-sq_default | unsat ✅ | 0.02205 | 0.00711
  Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | unsat ✅ | 0.01539 | 0.01536
  Yices-ismt | yices-ismt-0721_default | unsat ✅ | 0.02156 | 0.02176
  Z3 | z3-4.8.17_default | unsat ✅ | 0.24183 | 0.24375
  Z3++ | z3++0715_default | unsat ✅ | 0.02375 | 0.02381

SMT-COMP 2024, rating 0.25 (3/4)
  cvc5 | cvc5 | unsat ✅ | 0.27337 | 0.17392
  SMTInterpol | SMTInterpol | unknown ❌ | 0.56849 | 0.93396
  Yices2 | Yices2 | unsat ✅ | 0.21842 | 0.11867
  Z3alpha | Z3-alpha | unsat ✅ | 0.44032 | 0.34069

SMT-COMP 2025, rating 0.20 (4/5)
  cvc5 | cvc5 | unsat ✅ | 0.29505 | 0.17523
  SMTInterpol | SMTInterpol | unknown ❌ | 0.55197 | 0.86039
  Yices2 | Yices2 | unsat ✅ | 0.29260 | 0.16575
  Z3alpha | Z3-alpha | unsat ✅ | 0.40137 | 0.29180
  Z3 | Z3-alpha-base | unsat ✅ | 0.33466 | 0.21427
  Z3 | z3siri-base | unsat ✅ | 0.31457 | 0.18909