Benchmark

non-incremental/QF_LRA/LassoRanker/CooperatingT2/toeplz.t2.c_Iteration2_Loop_4-pieceTemplate.smt2

SMT script generated by Ultimate LassoRanker [1]
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
For generating these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].
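
As a rough, illustrative sketch (the notation below is generic and not taken
from this particular script): a linear lasso program consists of a stem and a
loop with transition relation T(x, x'), and a linear ranking function is a
template f(x) = c * x + c0 whose coefficients c, c0 must satisfy

    for all x, x' with T(x, x'):   f(x) >= 0   and   f(x') <= f(x) - delta

for some delta > 0. The constraints on the template coefficients are encoded
as an SMT satisfiability query; the "4-pieceTemplate" in the file name
suggests a piecewise template with four linear pieces, and a "sat" answer
then witnesses suitable coefficient values.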

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
  2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
The set contains only those SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In 
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear 
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. Accepted at CAV 2014.
[6] Dirk Beyer: Status Report on Software Verification - (Competition Summary 
SV-COMP 2014). TACAS 2014
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook, Carsten Fuhs: Better Termination Proving 
through Cooperation. CAV 2013:413-429
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
Benchmark

  Size                4146792
  Compressed Size     115366
  License             Creative Commons Attribution 4.0 International (CC-BY-4.0)
  Category            industrial
  First Occurrence    2014-07-21
  Generated By
  Generated On
  Generator
  Dolmen OK           1
  strict Dolmen OK
  check-sat calls     1
Query 1

  Status                         sat
  Inferred Status                sat
  Size                           4146784
  Compressed Size                115352
  Max. Term Depth                395
  Asserts                        505
  Declared Functions             0
  Declared Constants             7257
  Declared Sorts                 0
  Defined Functions              0
  Defined Recursive Functions    0
  Defined Sorts                  0
  Constants                      0
  Declared Datatypes             0

Symbols

  or     2456
  and    2456
  =      30786
  let    27364
  +      44310
  -      11524
  *      46320
  <      1304
  <=     2036
  >      901
  >=     6804
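
To make the statistics above concrete, a minimal QF_LRA script with the same
overall shape (a single check-sat call, only 0-ary Real constants, and
assertions built from let, and/or, and linear inequalities) might look as
follows. This is a hypothetical sketch; the constant names and the constraint
are invented and are not excerpted from the benchmark.

  (set-logic QF_LRA)
  ; 0-ary constants over the reals; the actual benchmark declares 7257 of them
  (declare-fun lambda_0 () Real)
  (declare-fun lambda_1 () Real)
  (declare-fun delta () Real)
  ; a single assertion; the benchmark has 505, with deeply nested let/and/or terms
  (assert
    (let ((c (+ (* 2 lambda_0) (* (- 1) lambda_1))))
      (and (>= lambda_0 0) (>= lambda_1 0) (> delta 0) (<= c (- delta)))))
  (check-sat)
  (exit)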

Evaluations

Evaluation   Rating (solved/total)   Solver   Variant   Result   Wallclock Time (s)   CPU Time (s)
SMT-COMP 2015 0.14 (6/7) CVC4 CVC4-master-2015-06-15-9b32405-main default sat ✅ 152.89700 152.90100
CVC4-experimental-2015-06-15-ff5745a-main default sat ✅ 155.43200 155.47200
MathSAT MathSat 5.3.6 main smtcomp2015_main sat ✅ 220.72400 220.74600
SMTInterpol SMTInterpol v2.1-206-g86e9531 default unknown ❌ 2400.02000 2441.96000
SMT-RAT SMT-RAT-final default sat ✅ 115.91100 115.93100
veriT veriT default sat ✅ 265.95700 265.98600
Yices2 Yices default sat ✅ 32.68920 32.70400
Z3 z3 4.4.0 default sat ✅ 1438.46000 1438.97000
SMT-COMP 2016 0.33 (6/9) CVC4 CVC4-master-2016-05-27-cfef263-main default sat ✅ 129.97200 130.05100
MathSAT mathsat-5.3.11-linux-x86_64-Main default sat ✅ 227.77800 227.87000
OpenSMT OpenSMT2-2016-05-12 default unknown ❌ 2400.03000 2401.34000
SMTInterpol smtinterpol-2.1-258-g92ab3df default sat ✅ 161.43200 196.09300
SMT-RAT SMT-RAT default unknown ❌ 2400.04000 2401.35000
Toysmt toysmt default unknown ❌ 1737.92000 1738.25000
veriT veriT-dev default sat ✅ 26.17640 26.18780
Yices2 Yices-2.4.2 default sat ✅ 10.55580 10.56270
Z3 z3-4.4.1 default sat ✅ 1509.31000 1510.07000
SMT-COMP 2017 0.38 (5/8) CVC4 CVC4-smtcomp2017-main default sat ✅ 133.65300 133.63100
MathSAT mathsat-5.4.1-linux-x86_64-Main default sat ✅ 65.14620 65.13540
OpenSMT opensmt2-2017-06-04 default unknown ❌ 600.01400 599.98100
SMTInterpol SMTInterpol default sat ✅ 61.34900 79.66480
SMT-RAT SMTRAT-comp2017_2 default unknown ❌ 600.01200 599.92700
veriT veriT-2017-06-17 default sat ✅ 26.26890 26.26570
Yices2 Yices2-Main default sat ✅ 5.34599 5.34490
Z3 z3-4.5.0 default unknown ❌ 600.02000 599.87600
SMT-COMP 2018 0.33 (6/9) Ctrl-Ergo Ctrl-Ergo-SMTComp-2018_default sat ✅ 206.97800 822.05000
CVC4 master-2018-06-10-b19c840-competition-default_default sat ✅ 75.77480 75.77250
MathSAT mathsat-5.5.2-linux-x86_64-Main_default sat ✅ 24.08630 24.08170
OpenSMT opensmt2_default unknown ❌ 1200.10000 1199.83000
SMTInterpol SMTInterpol-2.5-19-g0d39cdee_default sat ✅ 1178.87000 1364.54000
SMT-RAT SMTRAT-Rat-final_default unknown ❌ 1200.01000 1199.90000
SMTRAT-MCSAT-final_default unknown ❌ 1200.02000 1200.05000
veriT veriT_default sat ✅ 39.09450 39.08680
Yices2 Yices 2.6.0_default sat ✅ 2.35625 2.35598
Z3 z3-4.7.1_default unknown ❌ 1200.01000 1199.93000
SMT-COMP 2020 CVC4 CVC4-sq-final_default sat ✅ 70.66380 70.62820
MathSAT MathSAT5_default.sh sat ✅ 35.97040 35.96970
OpenSMT OpenSMT_default sat ✅ 67.75610 67.74360
Par4 Par4-wrapped-sq_default sat ✅ 41.31890 163.16000
SMTInterpol smtinterpol-2.5-679-gacfde87a_default sat ✅ 111.29600 134.04100
veriT veriT_default sat ✅ 40.21110 40.20690
Yices2 Yices 2.6.2 bug fix_default sat ✅ 8.23465 8.23394
Z3 z3-4.8.8_default sat ✅ 67.32170 67.27900
SMT-COMP 2022 cvc5 cvc5-default-2022-07-02-b15e116-wrapped_sq sat ✅ 76.44030 76.43640
MathSAT MathSAT-5.6.8_default sat ✅ 30.97710 30.97600
veriT veriT_default sat ✅ 29.81470 29.80980
Yices2 Yices 2.6.2 for SMTCOMP 2021_default sat ✅ 6.17611 6.17533
Yices 2.6.2 for SMTCOMP 2021_default sat ✅ 6.17587 6.17489
Z3 z3-4.8.17_default sat ✅ 107.55600 107.54500
SMT-COMP 2024 cvc5 cvc5 sat ✅ 63.13789 63.00507
OpenSMT OpenSMT sat ✅ 15.19688 15.09391
SMTInterpol SMTInterpol sat ✅ 292.49617 317.90906
Yices2 Yices2 sat ✅ 6.46480 6.36375
Z3alpha Z3-alpha sat ✅ 61.49629 61.39464
SMT-COMP 2025 cvc5 cvc5 sat ✅ 33.81145 33.68841
OpenSMT OpenSMT sat ✅ 7.28944 7.16045
SMTInterpol SMTInterpol sat ✅ 176.33214 195.34656
Yices2 Yices2 sat ✅ 4.49937 4.36145
Z3alpha Z3-alpha sat ✅ 98.96572 392.61560
Z3 Z3-alpha-base sat ✅ 86.69338 86.55722