Benchmark

non-incremental/QF_LRA/LassoRanker/CooperatingT2/hongyi1.t2.c_Iteration1_Loop_4-phaseTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs; it implements the techniques presented in [2] and [3].
To generate these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of
SV-COMP 2014 [6], available at [7],
- the benchmarks from [8], available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
The set contains only SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds on them.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In 
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear 
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. Accepted at CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving
through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
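
The constraint-based synthesis behind these scripts ([2], [3]) fixes a linear
template for the ranking function, rewrites the universally quantified
termination conditions of the loop into existentially quantified QF_LRA
constraints via Farkas' lemma, and lets the SMT solver search for template
coefficients. As intuition only, here is a minimal sketch of that encoding for
a one-variable loop and a plain linear template; it is not an excerpt of this
benchmark (which uses the much larger 4-phase template named in its file name),
and every identifier in it is hypothetical.

; Toy instance, assuming the loop "while (x >= 1) x := x - 1" with its
; transition written as A(x,x') <= b:
;   -x <= -1,   x - x' <= 1,   -x + x' <= -1
; and the template f(x) = c*x + c0. All names below are hypothetical.
(set-logic QF_LRA)
(declare-fun c () Real)       ; template coefficient
(declare-fun c0 () Real)      ; template constant
(declare-fun delta () Real)   ; strict decrease per iteration
; one Farkas multiplier per row of A, per proof obligation
(declare-fun l1 () Real) (declare-fun l2 () Real) (declare-fun l3 () Real)
(declare-fun m1 () Real) (declare-fun m2 () Real) (declare-fun m3 () Real)
(assert (> delta 0))
(assert (and (>= l1 0) (>= l2 0) (>= l3 0) (>= m1 0) (>= m2 0) (>= m3 0)))
; decrease: forall (x,x') with A(x,x') <= b: f(x) - f(x') >= delta,
; i.e. (-c)*x + c*x' <= -delta; Farkas: l^T A = (-c, c) and l^T b <= -delta
(assert (= (+ (- l1) l2 (- l3)) (- c)))       ; x-column of l^T A
(assert (= (+ (- l2) l3) c))                  ; x'-column of l^T A
(assert (<= (+ (- l1) l2 (- l3)) (- delta)))  ; l^T b <= -delta, b = (-1, 1, -1)
; boundedness: forall (x,x') with A(x,x') <= b: f(x) >= 0, i.e. (-c)*x <= c0
(assert (= (+ (- m1) m2 (- m3)) (- c)))       ; x-column of m^T A
(assert (= (+ (- m2) m3) 0))                  ; x'-column of m^T A
(assert (<= (+ (- m1) m2 (- m3)) c0))         ; m^T b <= c0
(check-sat)   ; sat, e.g. c = 1, c0 = 0, delta = 1: f(x) = x ranks the loop

A sat answer means suitable coefficients exist (here f(x) = x), which matches
the status of this benchmark; the generated script plays the same game on a
far larger program, presumably the source of the 12660 declared constants and
the heavy use of *, + and = listed under Symbols below.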
Benchmark
Size: 3433425
Compressed Size: 238893
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
Status: sat
Inferred Status: sat
Size: 3433417
Compressed Size: 238880
Max. Term Depth: 196
Asserts: 404
Declared Functions: 0
Declared Constants: 12660
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol  Count
or      2048
and     2048
=       22628
let     7184
+       28108
-       16776
*       30472
<       1048
<=      1688
>       724
>=      12460

Evaluations

Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s)
SMT-COMP 2014 | | CVC4 | CVC4 f7118b2 default | sat ✅ | 60.42230 | 60.43680
 | | MathSAT | MathSAT-5.2.12-Main default | sat ✅ | 181.77400 | 181.75700
 | | SMTInterpol | smtinterpol-2.1-118-g3dada2f default | sat ✅ | 23.40300 | 35.17670
 | | veriT | veriT-smtcomp2014 default | sat ✅ | 42.75560 | 42.75650
 | | Yices2 | Yices-2.2.1-smtcomp2014 default | sat ✅ | 4.59425 | 4.58730
 | | Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | sat ✅ | 1038.12000 | 1038.44000
SMT-COMP 2015 | 0.14 (6/7) | CVC4 | CVC4-master-2015-06-15-9b32405-main default | sat ✅ | 44.95550 | 44.97120
 | | CVC4 | CVC4-experimental-2015-06-15-ff5745a-main default | sat ✅ | 45.15470 | 45.17110
 | | MathSAT | MathSat 5.3.6 main smtcomp2015_main | sat ✅ | 200.23900 | 200.30000
 | | SMTInterpol | SMTInterpol v2.1-206-g86e9531 default | unknown ❌ | 2400.02000 | 2430.38000
 | | SMT-RAT | SMT-RAT-final default | sat ✅ | 185.02400 | 185.08800
 | | veriT | veriT default | sat ✅ | 47.13370 | 47.14280
 | | Yices2 | Yices default | sat ✅ | 2.97530 | 2.97555
 | | Z3 | z3 4.4.0 default | sat ✅ | 834.14000 | 834.48900
SMT-COMP 2016 | 0.33 (6/9) | CVC4 | CVC4-master-2016-05-27-cfef263-main default | sat ✅ | 44.67080 | 44.69650
 | | MathSAT | mathsat-5.3.11-linux-x86_64-Main default | sat ✅ | 199.76300 | 199.89600
 | | OpenSMT | OpenSMT2-2016-05-12 default | unknown ❌ | 2400.10000 | 2401.36000
 | | SMTInterpol | smtinterpol-2.1-258-g92ab3df default | sat ✅ | 16.08020 | 35.62410
 | | SMT-RAT | SMT-RAT default | unknown ❌ | 2400.02000 | 2401.36000
 | | Toysmt | toysmt default | unknown ❌ | 1754.32000 | 1755.18000
 | | veriT | veriT-dev default | sat ✅ | 32.83060 | 32.85250
 | | Yices2 | Yices-2.4.2 default | sat ✅ | 5.03068 | 5.03343
 | | Z3 | z3-4.4.1 default | sat ✅ | 808.84900 | 809.32300
SMT-COMP 2017 | 0.38 (5/8) | CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 49.39230 | 49.38960
 | | MathSAT | mathsat-5.4.1-linux-x86_64-Main default | sat ✅ | 55.65050 | 55.65010
 | | OpenSMT | opensmt2-2017-06-04 default | unknown ❌ | 600.11500 | 599.98900
 | | SMTInterpol | SMTInterpol default | sat ✅ | 18.97990 | 33.66520
 | | SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.10500 | 600.09000
 | | veriT | veriT-2017-06-17 default | sat ✅ | 25.00840 | 25.00910
 | | Yices2 | Yices2-Main default | sat ✅ | 5.56426 | 5.56307
 | | Z3 | z3-4.5.0 default | unknown ❌ | 600.08800 | 600.00800
SMT-COMP 2018 | 0.22 (7/9) | Ctrl-Ergo | Ctrl-Ergo-SMTComp-2018_default | sat ✅ | 34.71990 | 136.86000
 | | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 50.95050 | 50.93990
 | | MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | sat ✅ | 52.31280 | 52.31080
 | | OpenSMT | opensmt2_default | unknown ❌ | 1200.02000 | 1199.98000
 | | SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | sat ✅ | 15.91490 | 29.53940
 | | SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.10000 | 1200.00000
 | | SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.01000 | 1199.79000
 | | veriT | veriT_default | sat ✅ | 34.34130 | 34.33830
 | | Yices2 | Yices 2.6.0_default | sat ✅ | 3.81045 | 3.80945
 | | Z3 | z3-4.7.1_default | sat ✅ | 697.40900 | 697.18600
SMT-COMP 2019 | 0.12 (7/8) | Ctrl-Ergo | Ctrl-Ergo-2019-wrapped-sq_default | sat ✅ | 31.96250 | 125.27000
 | | CVC4 | CVC4-2019-06-03-d350fe1-wrapped-sq_default | sat ✅ | 42.76010 | 42.75560
 | | CVC4 | CVC4-SymBreak_03_06_2019-wrapped-sq_default | sat ✅ | 117.66000 | 115.73400
 | | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 58.90990 | 58.90530
 | | OpenSMT | OpenSMT-wrapped-sq_default | unknown ❌ | 2400.02000 | 2399.80000
 | | Par4 | Par4-wrapped-sq_default | sat ✅ | 33.08230 | 130.55000
 | | SMTInterpol | smtinterpol-2.5-514-wrapped-sq_default | sat ✅ | 115.27000 | 141.67900
 | | veriT | veriT-wrapped-sq_default | sat ✅ | 38.33210 | 38.32680
 | | Yices2 | Yices 2.6.2-wrapped-sq_default | sat ✅ | 3.44170 | 3.44128
 | | Z3 | z3-4.8.4-d6df51951f4c-wrapped-sq_default | sat ✅ | 998.31700 | 998.13400
SMT-COMP 2025 | | cvc5 | cvc5 | sat ✅ | 29.28209 | 29.15601
 | | OpenSMT | OpenSMT | sat ✅ | 4.15319 | 4.02444
 | | SMTInterpol | SMTInterpol | sat ✅ | 191.04106 | 210.54903
 | | Yices2 | Yices2 | sat ✅ | 2.21111 | 2.08972
 | | Z3alpha | Z3-alpha | sat ✅ | 34.56419 | 134.99980
 | | Z3 | Z3-alpha-base | sat ✅ | 23.34390 | 23.22365