Benchmark

non-incremental/QF_NIA/LassoRanker/Pure3Phase_true-termination.c_Iteration1_Loop+nonterminationTemplate_0.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of
lasso-shaped programs. This script contains the SMT commands that Ultimate
LassoRanker used while checking whether a lasso-shaped program has a geometric
nontermination argument. (See [2] for a preliminary definition of
geometric nontermination arguments.)
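Informally, a geometric nontermination argument (per [2]) certifies an infinite execution whose states follow a geometric series: the state after t loop iterations is x_t = x_0 + sum_{i<t} lambda^i * y for a fixed ray y and ratio lambda. A minimal sketch, assuming a hypothetical loop and constants that are not taken from this benchmark or from LassoRanker itself:

```python
# Illustrative only: the nonterminating lasso loop "while x >= 1: x = 2*x"
# admits a geometric nontermination argument with start state x0 = 1,
# ray y = 1, and ratio lam = 2. The t-th state is
#   x_t = x0 + sum_{i<t} lam**i * y = 2**t,
# so the loop guard x >= 1 holds forever.

def closed_form(x0, y, lam, t):
    """State after t iterations according to the geometric series."""
    return x0 + sum(lam**i * y for i in range(t))

def simulate(x0, t):
    """Actually run the loop body t times, checking the guard each time."""
    x = x0
    for _ in range(t):
        assert x >= 1          # loop guard must hold to keep iterating
        x = 2 * x              # loop body
    return x

# The closed form agrees with the concrete execution on every prefix;
# that agreement is what makes (x0, y, lam) a nontermination witness.
for t in range(10):
    assert closed_form(1, 1, 2, t) == simulate(1, t) == 2**t
```

The SMT script below encodes the existence of such a witness as a QF_NIA satisfiability query, which is why an unsat answer rules this class of nontermination argument out.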

This SMT script belongs to a set of SMT scripts that was generated by applying
Ultimate Buchi Automizer [3,4] to benchmarks from SV-COMP 2016 [5,6],
which are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.

2016-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann: Geometric Series as Nontermination
Arguments for Linear Lasso Programs. CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
Checking for People Who Love Automata. CAV 2013: 36-52
[5] http://sv-comp.sosy-lab.org/2016/
[6] Dirk Beyer: Reliable and Reproducible Competition Results with BenchExec
and Witnesses (Report on SV-COMP 2016). TACAS 2016: 887-904
[7] https://github.com/dbeyer/sv-benchmarks
Benchmark
Size: 13644
Compressed Size: 1669
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1

Query 1
Status: unsat
Inferred Status: unsat
Size: 13636
Compressed Size: 1670
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 20
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol  Count
true        1
or          2
and         3
=           4
+          91
-          33
*         273
>=         31
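These counts tally how often each symbol occurs in the script; the dominance of `*` (273 occurrences) fits a QF_NIA (nonlinear integer arithmetic) benchmark. A rough way to reproduce such a tally for a machine-generated SMT-LIB file is sketched below; the tokenizer is deliberately naive (my own invention, not the database's actual counter) and would miscount inside string literals or quoted symbols:

```python
import re
from collections import Counter

def symbol_counts(smt_text):
    """Crude occurrence count of selected SMT-LIB operators.

    Splits on parentheses and whitespace, which is good enough for
    machine-generated scripts without strings or |quoted symbols|.
    """
    tokens = re.findall(r"[^\s()]+", smt_text)
    wanted = {"true", "or", "and", "=", "+", "-", "*", ">="}
    return Counter(t for t in tokens if t in wanted)

example = "(assert (and (>= x 0) (= y (+ (* 2 x) 1))))"
counts = symbol_counts(example)
# counts == {'and': 1, '>=': 1, '=': 1, '+': 1, '*': 1}
```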

Evaluations

Columns per row: Solver | Variant | Result | Wallclock time (s) | CPU time (s).
Ratings are shown per evaluation as reported.

SMT-COMP 2016, rating 0.57 (3/7):
  AProVE   | AProVE NIA 2014 default                     | unknown ❌ | 2400.07000 | 2415.83000
  CVC4     | CVC4-master-2016-05-27-cfef263-main default | unknown ❌ | 2400.02000 | 2397.34000
  ProB     | ProB competition                            | unknown ❌ | 2400.07000 | 2401.48000
  raSAT    | raSAT 0.3 default.sh                        | unsat ✅   | 2400.04000 | 2401.41000
  raSAT    | raSAT 0.4 exp - final default.py            | unknown ❌ | 2400.06000 | 4819.11000
  SMT-RAT  | SMT-RAT default                             | unknown ❌ | 2400.07000 | 2401.63000
  Yices2   | Yices-2.4.2 default                         | unsat ✅   | 0.02898    | 0.02895
  Z3       | z3-4.4.1 default                            | unsat ✅   | 0.28167    | 0.28287

SMT-COMP 2017, rating 0.40 (3/5):
  AProVE   | AProVE NIA 2014 default       | unknown ❌ | 600.04000 | 609.59000
  CVC4     | CVC4-smtcomp2017-main default | unsat ✅   | 0.08132   | 0.08033
  SMT-RAT  | SMTRAT-comp2017_2 default     | unknown ❌ | 600.06200 | 600.04000
  Yices2   | Yices2-Main default           | unsat ✅   | 0.17812   | 0.17739
  Z3       | z3-4.5.0 default              | unsat ✅   | 0.40060   | 0.39944

SMT-COMP 2018, rating 0.40 (3/5):
  AProVE   | AProVE NIA 2014_default                               | unknown ❌ | 1200.04000 | 1210.96000
  CVC4     | master-2018-06-10-b19c840-competition-default_default | unsat ✅   | 0.07948    | 0.07970
  SMT-RAT  | SMTRAT-Rat-final_default                              | unknown ❌ | 1200.01000 | 1199.96000
  Yices2   | Yices 2.6.0_default                                   | unsat ✅   | 0.04306    | 0.04300
  Z3       | z3-4.7.1_default                                      | unsat ✅   | 0.68755    | 0.68758

SMT-COMP 2020, rating 0.29 (5/7):
  AProVE   | AProVE NIA 2014_default     | unknown ❌ | 1200.06000 | 1212.60000
  CVC4     | CVC4-sq-final_default       | unsat ✅   | 0.11750    | 0.11728
  MathSAT  | MathSAT5_default.sh         | unsat ✅   | 0.04099    | 0.04094
  Par4     | Par4-wrapped-sq_default     | unsat ✅   | 0.10077    | 0.00632
  SMT-RAT  | smtrat-SMTCOMP_default      | unknown ❌ | 1200.04000 | 1199.93000
  Yices2   | Yices 2.6.2 bug fix_default | unsat ✅   | 0.07756    | 0.07752
  Z3       | z3-4.8.8_default            | unsat ✅   | 1.14851    | 1.14847

SMT-COMP 2021, rating 0.33 (4/6):
  AProVE   | AProVE NIA 2014_2021    | unknown ❌ | 1200.03000 | 1211.12000
  cvc5     | cvc5-fixed_default      | unsat ✅   | 0.12456    | 0.12509
  MathSAT  | mathsat-5.6.6_default   | unsat ✅   | 0.04408    | 0.04406
  Par4     | Par4-wrapped-sq_default | unsat ✅   | 0.05216    | 0.00642
  SMT-RAT  | smtrat-SMTCOMP_default  | unknown ❌ | 1200.02000 | 1199.96000
  SMT-RAT  | smtrat-SMTCOMP_default  | unknown ❌ | 1200.10000 | 1200.01000
  Z3       | z3-4.8.11_default       | unsat ✅   | 0.23109    | 0.23120

SMT-COMP 2024, rating 0.25 (3/4):
  cvc5        | cvc5        | unsat ✅   | 0.35000 | 0.24951
  SMTInterpol | SMTInterpol | unknown ❌ | 0.46476 | 0.57425
  Yices2      | Yices2      | unsat ✅   | 0.22399 | 0.12414
  Z3alpha     | Z3-alpha    | unsat ✅   | 0.78691 | 0.68697

SMT-COMP 2025, rating 0.20 (4/5):
  cvc5        | cvc5          | unsat ✅   | 0.31380 | 0.19513
  SMTInterpol | SMTInterpol   | unknown ❌ | 0.48934 | 0.56829
  Yices2      | Yices2        | unsat ✅   | 0.34404 | 0.21760
  Z3alpha     | Z3-alpha      | unsat ✅   | 0.47404 | 0.59142
  Z3          | Z3-alpha-base | unsat ✅   | 0.39186 | 0.27187
  Z3          | z3siri-base   | unsat ✅   | 0.40571 | 0.28206
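The Rating column is not explained on the page, but for every evaluation above it equals one minus the parenthesized solved/counted fraction, i.e. the share of counted solvers that failed on this query (e.g. 2016: 1 - 3/7 ≈ 0.57). This reading is an inference from the numbers, not documented behavior; a quick consistency check:

```python
# Hypothesis: "Rating X (s/n)" means X == round(1 - s/n, 2), the fraction
# of counted solver entries that did NOT solve the query. The tuples below
# are copied from the evaluation table above.

ratings = {  # evaluation -> (rating, solved, counted solvers)
    "SMT-COMP 2016": (0.57, 3, 7),
    "SMT-COMP 2017": (0.40, 3, 5),
    "SMT-COMP 2018": (0.40, 3, 5),
    "SMT-COMP 2020": (0.29, 5, 7),
    "SMT-COMP 2021": (0.33, 4, 6),
    "SMT-COMP 2024": (0.25, 3, 4),
    "SMT-COMP 2025": (0.20, 4, 5),
}

for name, (rating, solved, total) in ratings.items():
    assert rating == round(1 - solved / total, 2), name
```

Note that the counted total can differ from the number of rows shown (2021 and 2025 list extra entries), presumably because duplicate or non-competitive submissions are excluded from the rating.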