Benchmark

non-incremental/QF_LRA/LassoRanker/CooperatingT2/p-55.t2.c_Iteration2_Loop_4-pieceTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
To generate these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
The set contains only the SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In 
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear 
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. In CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving
through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
Benchmark

  Size:              1353198
  Compressed Size:   50572
  License:           Creative Commons Attribution 4.0 International (CC-BY-4.0)
  Category:          industrial
  First Occurrence:  2014-07-21
  Generated By:
  Generated On:
  Generator:
  Dolmen:            OK
  strict Dolmen:     OK
  check-sat calls:   1
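
The two Dolmen entries record that the script passes the Dolmen parser and
type checker, in normal and in strict mode. A minimal sketch of such a check,
assuming a "dolmen" binary on the PATH that takes the file as its only
argument and exits with status 0 on success (the file name is a placeholder
for a local copy of the benchmark):

    import subprocess

    # Placeholder path; substitute a local copy of the benchmark file.
    PATH = "p-55.t2.c_Iteration2_Loop_4-pieceTemplate.smt2"

    # Assumption: `dolmen` is on the PATH, takes the file as its only
    # argument, and exits with status 0 when the script parses and types.
    proc = subprocess.run(["dolmen", PATH], capture_output=True, text=True)
    print("Dolmen OK" if proc.returncode == 0 else proc.stderr.strip())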
Query 1

  Status:                       sat
  Inferred Status:              sat
  Size:                         1353190
  Compressed Size:              50611
  Max. Term Depth:              76
  Asserts:                      505
  Declared Functions:           0
  Declared Constants:           3601
  Declared Sorts:               0
  Defined Functions:            0
  Defined Recursive Functions:  0
  Defined Sorts:                0
  Constants:                    0
  Declared Datatypes:           0

Symbols

Occurrence counts per operator:

  or    2456
  and   2456
  =     6930
  let   6935
  +     12366
  -     4628
  *     12960
  <     1304
  <=    2036
  >     901
  >=    3444
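
A minimal sketch of how such counts can be approximated, assuming a local
copy of the file (naive tokenization; comments, string literals, and quoted
symbols are not handled):

    import re
    from collections import Counter

    # Placeholder path; substitute a local copy of the benchmark file.
    PATH = "p-55.t2.c_Iteration2_Loop_4-pieceTemplate.smt2"

    # Naive tokenization: any run of characters that is neither
    # whitespace nor a parenthesis counts as one token.
    with open(PATH) as f:
        tokens = re.findall(r"[^\s()]+", f.read())

    OPS = {"or", "and", "=", "let", "+", "-", "*", "<", "<=", ">", ">="}
    counts = Counter(t for t in tokens if t in OPS)
    for op, n in counts.most_common():
        print(f"{op:>2}  {n}")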

Evaluations

Columns: Solver, Variant, Result, Wallclock time (s), CPU time (s).

SMT-COMP 2014
  CVC4          CVC4 f7118b2 default                                           sat ✅        25.69400     25.69410
  MathSAT       MathSAT-5.2.12-Main default                                    sat ✅        54.10900     54.10580
  SMTInterpol   smtinterpol-2.1-118-g3dada2f default                           sat ✅        35.02790     48.42360
  veriT         veriT-smtcomp2014 default                                      sat ✅        10.60680     10.59640
  Yices2        Yices-2.2.1-smtcomp2014 default                                sat ✅         5.10085      5.09423
  Z3            Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default   sat ✅       100.50700    100.50500

SMT-COMP 2015, rating 0.14 (6/7)
  CVC4          CVC4-master-2015-06-15-9b32405-main default                    sat ✅        30.52100     30.53140
  CVC4          CVC4-experimental-2015-06-15-ff5745a-main default              sat ✅        30.60620     30.60930
  MathSAT       MathSat 5.3.6 main smtcomp2015_main                            sat ✅        37.67160     37.68630
  SMTInterpol   SMTInterpol v2.1-206-g86e9531 default                          sat ✅        26.16440     37.71630
  SMT-RAT       SMT-RAT-final default                                          unknown ❌  2400.01000   2400.66000
  veriT         veriT default                                                  sat ✅         7.98644      7.98879
  Yices2        Yices default                                                  sat ✅         3.77264      3.77343
  Z3            z3 4.4.0 default                                               sat ✅       100.74400    100.76400

SMT-COMP 2016, rating 0.22 (7/9)
  CVC4          CVC4-master-2016-05-27-cfef263-main default                    sat ✅        28.15530     28.16600
  MathSAT       mathsat-5.3.11-linux-x86_64-Main default                       sat ✅        67.70460     67.75120
  OpenSMT       OpenSMT2-2016-05-12 default                                    sat ✅      1161.45000   1162.09000
  SMTInterpol   smtinterpol-2.1-258-g92ab3df default                           sat ✅        26.48530     38.80900
  SMT-RAT       SMT-RAT default                                                unknown ❌  2400.09000   2401.44000
  Toysmt        toysmt default                                                 unknown ❌  1711.21000   1711.61000
  veriT         veriT-dev default                                              sat ✅         5.58819      5.59156
  Yices2        Yices-2.4.2 default                                            sat ✅         3.75372      3.75612
  Z3            z3-4.4.1 default                                               sat ✅       113.25000    113.32700

SMT-COMP 2017, rating 0.25 (6/8)
  CVC4          CVC4-smtcomp2017-main default                                  sat ✅        39.43580     39.43110
  MathSAT       mathsat-5.4.1-linux-x86_64-Main default                        sat ✅        24.02200     24.02270
  OpenSMT       opensmt2-2017-06-04 default                                    unknown ❌   600.01200    599.91800
  SMTInterpol   SMTInterpol default                                            sat ✅        37.27110     54.39370
  SMT-RAT       SMTRAT-comp2017_2 default                                      unknown ❌   600.01500    599.94000
  veriT         veriT-2017-06-17 default                                       sat ✅        10.23610     10.23350
  Yices2        Yices2-Main default                                            sat ✅         3.17793      3.17678
  Z3            z3-4.5.0 default                                               sat ✅       113.22800    113.22400

SMT-COMP 2018, rating 0.22 (7/9)
  Ctrl-Ergo     Ctrl-Ergo-SMTComp-2018_default                                 sat ✅        42.18030    167.24000
  CVC4          master-2018-06-10-b19c840-competition-default_default          sat ✅        33.32940     33.32200
  MathSAT       mathsat-5.5.2-linux-x86_64-Main_default                        sat ✅        32.41690     32.41260
  OpenSMT       opensmt2_default                                               unknown ❌  1200.11000   1199.96000
  SMTInterpol   SMTInterpol-2.5-19-g0d39cdee_default                           sat ✅        35.12140     48.21510
  SMT-RAT       SMTRAT-Rat-final_default                                       unknown ❌  1200.09000   1200.05000
  SMT-RAT       SMTRAT-MCSAT-final_default                                     unknown ❌  1200.06000   1199.99000
  veriT         veriT_default                                                  sat ✅         5.48133      5.48155
  Yices2        Yices 2.6.0_default                                            sat ✅         2.27685      2.27672
  Z3            z3-4.7.1_default                                               sat ✅       110.70300    110.70500

SMT-COMP 2021, rating 0.17 (5/6)
  MathSAT       mathsat-5.6.6_default                                          sat ✅        27.57850     27.57290
  mc2           mc2 2021-06-07_default.sh                                      unknown ❌    41.56030     41.53050
  Par4          Par4-wrapped-sq_default                                        sat ✅         8.25619     32.35000
  SMTInterpol   smtinterpol-2.5-823-g881e8631_default                          sat ✅        54.60030     75.02120
  veriT         veriT_default                                                  sat ✅         7.39377      7.39294
  Z3            z3-4.8.11_default                                              sat ✅        27.69530     27.69240

SMT-COMP 2022
  cvc5          cvc5-default-2022-07-02-b15e116-wrapped_sq                     sat ✅        21.60130     21.59420
  MathSAT       MathSAT-5.6.8_default                                          sat ✅        35.73380     35.73170
  veriT         veriT_default                                                  sat ✅         5.76984      5.76996
  Yices2        Yices 2.6.2 for SMTCOMP 2021_default                           sat ✅         2.70435      5.37412
  Yices2        Yices 2.6.2 for SMTCOMP 2021_default                           sat ✅         2.70463      2.70451
  Z3            z3-4.8.17_default                                              sat ✅        24.33220     24.32970

SMT-COMP 2024
  cvc5          cvc5                                                           sat ✅        14.78051     14.68030
  OpenSMT       OpenSMT                                                        sat ✅         2.85296      2.75159
  SMTInterpol   SMTInterpol                                                    sat ✅        31.61854     48.24065
  Yices2        Yices2                                                         sat ✅         6.55697      6.45703
  Z3alpha       Z3-alpha                                                       sat ✅        25.62483     25.52350
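
As a rough illustration of how one table row is obtained, the sketch below
runs a single solver on the benchmark and measures wallclock time and the
CPU time of the child process. It assumes a Unix system and a "z3" binary on
the PATH (invoked as "z3 <file>", which prints sat or unsat); SMT-COMP runs
on a controlled cluster, so locally measured times will differ:

    import resource
    import subprocess
    import time

    # Placeholder path; substitute a local copy of the benchmark file.
    PATH = "p-55.t2.c_Iteration2_Loop_4-pieceTemplate.smt2"

    # Child CPU time is the difference in the resource usage of
    # terminated child processes before and after the run (Unix-only).
    before = resource.getrusage(resource.RUSAGE_CHILDREN)
    start = time.monotonic()
    proc = subprocess.run(["z3", PATH], capture_output=True, text=True)
    wall = time.monotonic() - start
    after = resource.getrusage(resource.RUSAGE_CHILDREN)
    cpu = (after.ru_utime - before.ru_utime) + (after.ru_stime - before.ru_stime)

    print(f"result={proc.stdout.strip()}  wallclock={wall:.3f}s  cpu={cpu:.3f}s")
    # The status recorded for this benchmark is sat.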