Benchmark

non-incremental/QF_NRA/LassoRanker/SV-COMP/aviad_true-termination.c_Iteration1_Loop_4-nestedTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
For generating these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only the SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike, and Andreas Podelski. Linear
Ranking for Linear Lasso Programs. ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook, and Carsten Fuhs. Better Termination Proving
through Cooperation. CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
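
For orientation, the sketch below shows the general shape of a QF_NRA constraint
script of this kind: a logic declaration, real-valued unknowns (in LassoRanker's
setting, plausibly template coefficients and auxiliary multipliers in the sense of
[2] and [3]), nonlinear assertions built from products of unknowns, and a single
check-sat. All identifiers and constraints in the sketch are invented for
illustration and are not taken from this file, which is far larger (roughly 3121
assertions over about 53881 declared constants; see the statistics below).

; Illustrative sketch only; names and constraints are invented, not from this benchmark.
(set-logic QF_NRA)
(declare-fun c_0 () Real)        ; e.g. a coefficient of a ranking template
(declare-fun c_1 () Real)
(declare-fun lambda_0 () Real)   ; e.g. an auxiliary (Farkas-style) multiplier
(declare-fun delta () Real)      ; e.g. a decrease bound
(assert (>= lambda_0 0))
(assert (> delta 0))
; nonlinear constraint: products of two unknowns, of the kind counted under Symbols below
(assert (let ((t (* lambda_0 c_0)))
          (and (>= (+ t c_1) 0)
               (<= (* lambda_0 c_1) (- c_0 delta)))))
(check-sat)
(exit)
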
Benchmark
Size: 7211106
Compressed Size: 219089
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: unknown
Inferred Status: sat
Size: 7211098
Compressed Size: 219077
Max. Term Depth: 9
Asserts: 3121
Declared Functions: 0
Declared Constants: 53881
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0
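
The "Status" entry presumably mirrors the status annotation inside the script
itself, while "Inferred Status" is derived from solver answers (the successful runs
listed under Evaluations all report sat). A minimal, hypothetical illustration of
that annotation in SMT-LIB:

(set-info :status unknown)   ; the script itself does not claim sat or unsat
; ... 3121 assertions over the declared constants ...
(check-sat)                  ; solvers that succeed answer sat, yielding the inferred status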

Symbols

Symbol   Count
or       9000
and      3120
=        35368
let      6120
+        35864
-        50614
*        67006
<        3120
<=       3120
>        3121
>=       53740

Evaluations

Results are grouped by SMT-COMP evaluation; each row lists Solver, Variant, Result, Wallclock time (s), and CPU time (s).

SMT-COMP 2017, Rating 0.80 (1/5)
  CVC4     CVC4-smtcomp2017-main default    sat ✅       35.33240     35.33000
  SMT-RAT  SMTRAT-comp2017_2 default        unknown ❌   600.11400    600.06200
  veriT    veriT+raSAT+Redlog default       unknown ❌   600.06400    599.90000
  Yices2   Yices2-Main default              unknown ❌   600.01300    599.81100
  Z3       z3-4.5.0 default                 unknown ❌   600.10600    600.04000

SMT-COMP 2018, Rating 0.80 (1/5)
  CVC4     master-2018-06-10-b19c840-competition-default_default   sat ✅   352.29400   346.73000
  SMT-RAT  SMTRAT-Rat-final_default         unknown ❌   1200.06000   1199.95000
  SMT-RAT  SMTRAT-MCSAT-final_default       unknown ❌   1200.07000   1199.95000
  veriT    veriT+raSAT+Reduce_default       unknown ❌   1200.02000   1199.88000
  Yices2   Yices 2.6.0_default              unknown ❌   1200.05000   1200.07000
  Z3       z3-4.7.1_default                 unknown ❌   1200.02000   1199.96000

SMT-COMP 2020, Rating 0.57 (3/7)
  CVC4     CVC4-sq-final_default            sat ✅       370.73000    365.56200
  MathSAT  MathSAT5_default.sh              sat ✅       30.34930     30.32400
  Par4     Par4-wrapped-sq_default          sat ✅       354.00400    1056.54000
  SMT-RAT  smtrat-CDCAC_default             unknown ❌   1200.04000   1199.90000
  SMT-RAT  smtrat-MCSAT_default             unknown ❌   1200.02000   1199.90000
  veriT    veriT+raSAT+Redlog_default       unknown ❌   1200.03000   1199.81000
  Yices2   Yices 2.6.2 bug fix_default      unknown ❌   1200.11000   1199.81000
  Z3       z3-4.8.8_default                 unknown ❌   1200.07000   1199.63000

SMT-COMP 2021, Rating 0.80 (2/10)
  MathSAT  mathsat-5.6.6_default            sat ✅       32.43380     32.43380
  Par4     Par4-wrapped-sq_default          sat ✅       138.78200    416.01000
  SMT-RAT  smtrat-MCSAT_default             unknown ❌   1200.09000   1199.93000
  veriT    veriT+raSAT+Redlog_default       unknown ❌   1200.09000   1199.99000
  Z3       z3-4.8.11_default                unknown ❌   1200.05000   1199.71000

SMT-COMP 2022, Rating 0.44 (5/9)
  cvc5     cvc5-default-2022-07-02-b15e116-wrapped_sq   sat ✅   140.60600   140.54400
  MathSAT  MathSAT-5.6.8_default            sat ✅       33.77860     33.77660
  NRA-LS   NRA-LS-FINAL_default             sat ✅       914.14300    913.97800
  Par4     Par4-wrapped-sq_default          sat ✅       359.91200    1059.09000
  SMT-RAT  SMT-RAT-MCSAT_default            unknown ❌   1200.02000   1199.90000
  veriT    veriT+raSAT+Redlog_default       unknown ❌   1200.03000   1199.75000
  Yices2   Yices 2.6.2 for SMTCOMP 2021_default   unknown ❌   1200.02000   1199.80000
  Z3       z3-4.8.17_default                unknown ❌   1200.03000   1199.83000
  Z3++     z3++0715_default                 sat ✅       337.93800    337.90800

SMT-COMP 2023, Rating 0.14 (6/7)
  cvc5     cvc5-default-2023-05-16-ea045f305_sq   sat ✅   91.03290    91.02820
  NRA-LS   cvc5-NRA-LS-sq_default           sat ✅       634.59600    634.40400
  Par4     Par4-wrapped-sq_default          sat ✅       278.24900    834.30000
  SMT-RAT  SMT-RAT-MCSAT_default            unknown ❌   1200.03000   1199.90000
  Yices2   Yices 2 for SMTCOMP 2023_default   sat ✅     1119.92000   1119.89000
  Z3alpha  z3alpha_default                  sat ✅       113.26200    113.26300
  Z3++     z3++0715_default                 sat ✅       32.16570     32.16550
  Z3++     Z3++_sq_0526_default             sat ✅       421.56300    421.50600

SMT-COMP 2024, Rating 0.40 (3/5)
  cvc5         cvc5            sat ✅       66.55016     66.44022
  SMTInterpol  SMTInterpol     unknown ❌   92.01820     113.57474
  SMT-RAT      SMT-RAT         unknown ❌   1201.72852   1201.09372
  Yices2       Yices2          sat ✅       345.46554    345.33742
  Z3alpha      Z3-alpha        sat ✅       47.16911     47.06835

SMT-COMP 2025, Rating 0.33 (4/6)
  cvc5         cvc5            sat ✅       889.87777    889.61966
  SMTInterpol  SMTInterpol     unknown ❌   60.71468     74.21578
  SMT-RAT      SMT-RAT         unknown ❌   1201.29685   1201.00080
  Yices2       Yices2          sat ✅       460.15492    459.94387
  Z3alpha      Z3-alpha        sat ✅       47.14760     176.43001
  Z3           Z3-alpha-base   sat ✅       133.99114    133.83481
  Z3           z3siri-base     sat ✅       134.72135    134.58544