Benchmark

non-incremental/QF_NIA/LassoRanker/ChenFlurMukhopadhyay-SAS2012-Ex2.06_false-termination.c_Iteration1_Loop+nonterminationTemplate_0.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that analyzes termination and nontermination of
lasso-shaped programs. This script contains the SMT commands that Ultimate 
LassoRanker used while checking if a lasso-shaped program has a geometric 
nontermination argument. (See [2] for a preliminary definition of
geometric nontermination argument.)
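As a toy illustration (this is not the benchmark's actual lasso program, just a minimal sketch of the idea from [2]): a geometric nontermination argument witnesses an infinite execution whose states form a geometric series. For the hypothetical lasso `while x >= 1: x = 2*x`, the candidate witness is the state sequence x_t = x0 * 2^t, which can be sanity-checked against the loop for finitely many steps:

```python
# Toy lasso program: guard and body of `while x >= 1: x = 2*x`.
# (Hypothetical example, not the program encoded in this benchmark.)

def loop_guard(x):
    return x >= 1

def loop_body(x):
    return 2 * x

def check_geometric_witness(x0, ratio, steps=50):
    """Check, for `steps` iterations, that the candidate geometric state
    sequence x_t = x0 * ratio**t is a genuine execution of the lasso:
    every state satisfies the guard, and applying the loop body to
    state t yields exactly state t+1 of the series."""
    x = x0
    for t in range(steps):
        if not loop_guard(x):        # execution would terminate here
            return False
        nxt = loop_body(x)
        if nxt != x0 * ratio ** (t + 1):  # series does not match the loop
            return False
        x = nxt
    return True

print(check_geometric_witness(x0=1, ratio=2))  # True: witness is consistent
print(check_geometric_witness(x0=0, ratio=2))  # False: guard fails at x=0
```

The actual LassoRanker encoding poses the existence of such a witness as a QF_NIA satisfiability query; the nonlinearity comes from products of the unknown ratio and the unknown initial state.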

This SMT script belongs to a set of SMT scripts that were generated by applying
Ultimate Buchi Automizer [3,4] to benchmarks from SV-COMP 2016 [5,6],
which are available at [7]. Ultimate Buchi Automizer takes omega-traces
(lasso-shaped programs) and uses LassoRanker to check whether the
lasso-shaped program terminates.

2016-04-30, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] https://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike, Matthias Heizmann: Geometric Series as Nontermination
Arguments for Linear Lasso Programs. CoRR abs/1405.4413 (2014)
http://arxiv.org/abs/1405.4413
[3] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[4] Matthias Heizmann, Jochen Hoenicke, Andreas Podelski: Software Model
Checking for People Who Love Automata. CAV 2013:36-52
[5] http://sv-comp.sosy-lab.org/2016/
[6] Dirk Beyer: Reliable and Reproducible Competition Results with BenchExec
and Witnesses (Report on SV-COMP 2016). TACAS 2016: 887-904
[7] https://github.com/dbeyer/sv-benchmarks
Benchmark
Size: 13916
Compressed Size: 1703
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1

Query 1
Status: unsat
Inferred Status: unsat
Size: 13908
Compressed Size: 1680
Max. Term Depth: 7
Asserts: 1
Declared Functions: 0
Declared Constants: 20
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols (symbol: number of occurrences)

true: 1   or: 2   and: 3   =: 4
+: 91   -: 29   *: 273   >=: 31

Evaluations

Columns: Solver — Variant — Result — Wallclock (s) — CPU Time (s)

SMT-COMP 2016, rating 1.00 (0/7 solved):
  AProVE      AProVE NIA 2014, default                                 unknown ❌  2400.11000  2413.69000
  CVC4        CVC4-master-2016-05-27-cfef263-main, default             unknown ❌  2400.09000  2397.13000
  ProB        ProB, competition                                        unknown ❌    59.50770    59.54390
  raSAT       raSAT 0.3, default.sh                                    unknown ❌  2400.03000  2401.64000
  raSAT       raSAT 0.4 exp - final, default.py                        unknown ❌  2400.02000  4813.12000
  SMT-RAT     SMT-RAT, default                                         unknown ❌  2400.03000  2401.38000
  Yices2      Yices-2.4.2, default                                     unknown ❌  2400.03000  2401.39000
  Z3          z3-4.4.1, default                                        unknown ❌  2400.06000  2401.40000

SMT-COMP 2017, rating 1.00 (0/5 solved):
  AProVE      AProVE NIA 2014, default                                 unknown ❌   600.02300   607.63000
  CVC4        CVC4-smtcomp2017-main, default                           unknown ❌   600.02500   597.37000
  SMT-RAT     SMTRAT-comp2017_2, default                               unknown ❌   600.05900   599.98500
  Yices2      Yices2-Main, default                                     unknown ❌   600.08300   599.97000
  Z3          z3-4.5.0, default                                        unknown ❌   600.01900   599.98000

SMT-COMP 2018, rating 1.00 (0/5 solved):
  AProVE      AProVE NIA 2014_default                                  unknown ❌  1200.03000  1209.65000
  CVC4        master-2018-06-10-b19c840-competition-default_default    unknown ❌   362.48800   360.22700
  SMT-RAT     SMTRAT-Rat-final_default                                 unknown ❌  1200.05000  1199.94000
  Yices2      Yices 2.6.0_default                                      unknown ❌  1200.10000  1200.10000
  Z3          z3-4.7.1_default                                         unknown ❌  1200.07000  1200.09000

SMT-COMP 2019, rating 0.75 (2/8 solved):
  AProVE      AProVE NIA 2014-wrapped-sq_default                       unknown ❌  2400.04000  2413.45000
  CVC4        CVC4-2019-06-03-d350fe1-wrapped-sq_default               unsat ✅     644.85800   624.84000
  CVC4        CVC4-SymBreak_03_06_2019-wrapped-sq_default              unknown ❌   712.05700   693.03800
  CVC4        master-2018-06-10-b19c840-competition-default_default    unknown ❌   364.75800   361.33200
  MathSAT     mathsat-20190601-wrapped-sq_default                      unknown ❌  2400.09000  2399.95000
  MathSAT     mathsat-na-20190601-wrapped-sq_default                   unsat ✅      10.09730    10.09670
  Par4        Par4-wrapped-sq_default                                  unknown ❌  2400.12000  9499.02000
  ProB        ProB-wrapped-sq_default                                  unknown ❌  2400.05000  2399.94000
  SMT-RAT     SMTRAT-5-wrapped-sq_default                              unknown ❌  2400.08000  2399.80000
  Yices2      Yices 2.6.2-wrapped-sq_default                           unknown ❌  2400.01000  2399.81000
  Z3          z3-4.8.4-d6df51951f4c-wrapped-sq_default                 unknown ❌  2400.02000  2399.93000

SMT-COMP 2020, rating 0.86 (1/7 solved):
  AProVE      AProVE NIA 2014_default                                  unknown ❌  1200.03000  1209.59000
  CVC4        CVC4-sq-final_default                                    unknown ❌  1200.02000  1192.01000
  MathSAT     MathSAT5_default.sh                                      unsat ✅       4.20766     4.20186
  Par4        Par4-wrapped-sq_default                                  unknown ❌  1200.10000  4737.06000
  SMT-RAT     smtrat-SMTCOMP_default                                   unknown ❌  1200.07000  1199.93000
  Yices2      Yices 2.6.2 bug fix_default                              unknown ❌  1200.09000  1199.72000
  Z3          z3-4.8.8_default                                         unknown ❌  1200.08000  1199.99000

SMT-COMP 2022, rating 0.43 (4/7 solved):
  cvc5        cvc5-default-2022-07-02-b15e116-wrapped_sq               unsat ✅       7.40878     7.40922
  MathSAT     MathSAT-5.6.8_default                                    unknown ❌  1200.03000  1199.95000
  Par4        Par4-wrapped-sq_default                                  unsat ✅       8.39055    21.69000
  Yices2      Yices 2.6.2 for SMTCOMP 2021_default                     unknown ❌  1200.10000  1199.90000
  Yices-ismt  yices-ismt-0721_default                                  unknown ❌  1200.11000  1199.94000
  Z3          z3-4.8.17_default                                        unsat ✅      26.47160    26.47310
  Z3++        z3++0715_default                                         unsat ✅      12.94880    12.94730

SMT-COMP 2025, rating 0.40 (3/5 solved):
  cvc5        cvc5                                                     unsat ✅       9.86141     9.73754
  SMTInterpol SMTInterpol                                              unknown ❌     0.50021     0.57566
  Yices2      Yices2                                                   unknown ❌  1201.26421  1201.04636
  Z3alpha     Z3-alpha                                                 unsat ✅       9.33811    29.18130
  Z3          Z3-alpha-base                                            unsat ✅     355.57199   355.39647
  Z3          z3siri-base                                              unsat ✅     179.99211   179.85334