Benchmark

non-incremental/QF_LRA/LassoRanker/CooperatingT2/brp_withassume.t2.c_Iteration8_Loop_4-lexTemplate.smt2

SMT script generated by Ultimate LassoRanker [1]
Ultimate LassoRanker is a tool that synthesizes ranking functions for 
linear lasso programs and implements the techniques presented in [2] and [3].
For generating these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].
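
The benchmark itself is a single QF_LRA script: per the statistics below it
declares 25,988 constants, contains 644 assertions built from Boolean
connectives, let-bindings and linear arithmetic, and issues one check-sat call
(answer: sat). The following is a minimal illustrative sketch of that general
shape only; it is not an excerpt from the file, and all names are invented.

    ; Minimal sketch of the script's general shape (hypothetical names,
    ; not taken from the actual benchmark).
    (set-logic QF_LRA)
    (set-info :status sat)
    (declare-fun lambda_0 () Real)
    (declare-fun delta_0 () Real)
    (declare-fun x_old () Real)
    (declare-fun x_new () Real)
    ; Assertions combine let-bound subterms with linear arithmetic and
    ; Boolean structure, matching the symbol counts reported below.
    (assert (let ((.cse0 (+ (* 2.0 lambda_0) delta_0)))
      (and (>= delta_0 0.0)
           (or (> .cse0 x_new)
               (<= (- .cse0 x_old) 0.0)))))
    (check-sat)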

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
  2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only the SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In 
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear 
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. Accepted at CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving
through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
Benchmark
Size: 6593159
Compressed Size: 404339
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
Status: sat
Inferred Status: sat
Size: 6593151
Compressed Size: 404343
Max. Term Depth: 428
Asserts: 644
Declared Functions: 0
Declared Constants: 25988
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

or: 2992    and: 2992    =: 43392    let: 11960
+: 50944    -: 31872    *: 59752    <: 1336
<=: 2360    >: 1180     >=: 25696
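
Taken together with the maximum term depth of 428 reported above for the
query, these counts suggest deeply nested let-bindings that share arithmetic
subterms. A small hypothetical fragment of that nesting pattern (variable
names invented; the real terms are far deeper):

    ; Hypothetical fragment; assumes x_0, x_1, x_2 were declared as Real.
    (assert (let ((.cse0 (+ x_0 (* 3.0 x_1))))
              (let ((.cse1 (- .cse0 x_2)))
                (and (>= .cse1 0.0)
                     (or (= .cse0 .cse1) (< .cse1 7.0))))))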

Evaluations

Columns: Solver | Solver Variant | Result | Wallclock (s) | CPU Time (s)

SMT-COMP 2014
  CVC4 | CVC4 f7118b2 default | sat ✅ | 157.26700 | 157.29700
  MathSAT | MathSAT-5.2.12-Main default | sat ✅ | 321.09600 | 321.12100
  SMTInterpol | smtinterpol-2.1-118-g3dada2f default | sat ✅ | 58.78590 | 77.80220
  veriT | veriT-smtcomp2014 default | sat ✅ | 92.36950 | 92.39300
  Yices2 | Yices-2.2.1-smtcomp2014 default | sat ✅ | 2.76214 | 2.75458
  Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | sat ✅ | 344.66900 | 344.70900

SMT-COMP 2015
  CVC4 | CVC4-master-2015-06-15-9b32405-main default | sat ✅ | 166.33700 | 166.41800
  CVC4 | CVC4-experimental-2015-06-15-ff5745a-main default | sat ✅ | 166.39400 | 166.47300
  MathSAT | MathSat 5.3.6 main smtcomp2015_main | sat ✅ | 336.30100 | 336.32000
  SMTInterpol | SMTInterpol v2.1-206-g86e9531 default | sat ✅ | 32.00330 | 48.44960
  SMT-RAT | SMT-RAT-final default | sat ✅ | 853.72800 | 853.96700
  veriT | veriT default | sat ✅ | 89.93420 | 89.93630
  Yices2 | Yices default | sat ✅ | 4.49967 | 4.49532
  Z3 | z3 4.4.0 default | sat ✅ | 448.89200 | 448.99100

SMT-COMP 2016 (rating 0.33, 6/9)
  CVC4 | CVC4-master-2016-05-27-cfef263-main default | sat ✅ | 114.12100 | 114.18000
  MathSAT | mathsat-5.3.11-linux-x86_64-Main default | sat ✅ | 295.91200 | 296.08000
  OpenSMT | OpenSMT2-2016-05-12 default | unknown ❌ | 2400.10000 | 2401.56000
  SMTInterpol | smtinterpol-2.1-258-g92ab3df default | sat ✅ | 32.28600 | 56.05920
  SMT-RAT | SMT-RAT default | unknown ❌ | 2400.03000 | 2401.19000
  Toysmt | toysmt default | unknown ❌ | 1663.21000 | 1663.83000
  veriT | veriT-dev default | sat ✅ | 95.64900 | 95.70490
  Yices2 | Yices-2.4.2 default | sat ✅ | 4.35461 | 4.35704
  Z3 | z3-4.4.1 default | sat ✅ | 364.18000 | 364.30500

SMT-COMP 2017 (rating 0.25, 6/8)
  CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 128.00000 | 127.90000
  MathSAT | mathsat-5.4.1-linux-x86_64-Main default | sat ✅ | 116.69000 | 116.68800
  OpenSMT | opensmt2-2017-06-04 default | unknown ❌ | 600.04700 | 599.93400
  SMTInterpol | SMTInterpol default | sat ✅ | 27.46400 | 46.50730
  SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.02000 | 599.97900
  veriT | veriT-2017-06-17 default | sat ✅ | 82.95780 | 82.96080
  Yices2 | Yices2-Main default | sat ✅ | 4.78594 | 4.78515
  Z3 | z3-4.5.0 default | sat ✅ | 292.33800 | 292.20300

SMT-COMP 2018 (rating 0.22, 7/9)
  Ctrl-Ergo | Ctrl-Ergo-SMTComp-2018_default | sat ✅ | 11.48310 | 43.42000
  CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 135.65700 | 135.59500
  MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | sat ✅ | 97.33780 | 97.32870
  OpenSMT | opensmt2_default | unknown ❌ | 1200.01000 | 1199.73000
  SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | sat ✅ | 42.52930 | 63.53400
  SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.08000 | 1200.08000
  SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.03000 | 1199.89000
  veriT | veriT_default | sat ✅ | 95.38430 | 95.36360
  Yices2 | Yices 2.6.0_default | sat ✅ | 4.94825 | 4.94795
  Z3 | z3-4.7.1_default | sat ✅ | 254.42200 | 254.34200

SMT-COMP 2021 (rating 0.17, 5/6)
  MathSAT | mathsat-5.6.6_default | sat ✅ | 127.54700 | 127.53900
  mc2 | mc2 2021-06-07_default.sh | unknown ❌ | 18.05150 | 18.05060
  Par4 | Par4-wrapped-sq_default | sat ✅ | 52.38300 | 205.92000
  SMTInterpol | smtinterpol-2.5-823-g881e8631_default | sat ✅ | 512.52300 | 560.49900
  veriT | veriT_default | sat ✅ | 85.58220 | 85.57200
  Z3 | z3-4.8.11_default | sat ✅ | 115.40900 | 115.37600

SMT-COMP 2024
  cvc5 | cvc5 | sat ✅ | 66.80932 | 66.67608
  OpenSMT | OpenSMT | sat ✅ | 6.22174 | 6.12133
  SMTInterpol | SMTInterpol | sat ✅ | 288.63335 | 319.98838
  Yices2 | Yices2 | sat ✅ | 2.85202 | 2.75207
  Z3alpha | Z3-alpha | sat ✅ | 55.53188 | 55.42952

SMT-COMP 2025 (rating 0.17, 5/6)
  cvc5 | cvc5 | sat ✅ | 61.51747 | 61.38524
  OpenSMT | OpenSMT | sat ✅ | 5.37248 | 5.24164
  SMTInterpol | SMTInterpol | unknown ❌ | 1201.77549 | 1227.35357
  Yices2 | Yices2 | sat ✅ | 4.62198 | 4.50280
  Z3alpha | Z3-alpha | sat ✅ | 144.64473 | 572.54374
  Z3 | Z3-alpha-base | sat ✅ | 78.72903 | 78.59505