Benchmark
non-incremental/QF_LRA/LassoRanker/CooperatingT2/sas2.t2.c_Iteration6_Lasso_7-phaseTemplate.smt2
SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that synthesizes ranking functions for linear
lasso programs and implements the techniques presented in [2] and [3]; a
minimal sketch of the kind of constraint system it emits is given after the
references below. For generating these SMT scripts, Ultimate LassoRanker was
used as a subprocedure of the termination analyzer Ultimate BuchiAutomizer
[4], which implements the techniques presented in [5].
This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
  2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only those SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.
2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)
[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike, and Andreas Podelski. Linear Ranking for Linear Lasso Programs. ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Termination Analysis by Learning Terminating Programs. CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary SV-COMP 2014). TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook, and Carsten Fuhs. Better Termination Proving through Cooperation. CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
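The script itself is a large QF_LRA constraint system over ranking-function
template coefficients (the "7-phaseTemplate" in the file name presumably
refers to the multiphase ranking template of [2] instantiated with 7 phases).
As a minimal, hypothetical sketch of the idea, and not drawn from the
benchmark itself: for a one-variable loop `while (x > 0) x := x - 1` and a
one-phase linear template f(x) = c*x + d, the boundedness and decrease
conditions are universally quantified over the program variable; eliminating
the quantifier with Farkas' lemma, in the spirit of [2] and [3], leaves an
existential QF_LRA query over the template coefficients:

```smt2
; Hypothetical miniature of a LassoRanker-style constraint system;
; the real benchmark is vastly larger. Loop: while (x > 0) x := x - 1,
; with the strict guard relaxed to x >= 0 (sound: it only enlarges the
; set of states on which f must be bounded). Template: f(x) = c*x + d.
(set-option :produce-models true)
(set-logic QF_LRA)
(declare-fun c () Real)      ; template coefficient
(declare-fun d () Real)      ; template constant
(declare-fun delta () Real)  ; required decrease per iteration
(declare-fun lambda () Real) ; Farkas multiplier for the guard -x <= 0

; Boundedness: forall x. -x <= 0  implies  c*x + d >= 0.
; Farkas' lemma turns this into: lambda >= 0, lambda = c, d >= 0.
(assert (>= lambda 0))
(assert (= lambda c))
(assert (>= d 0))

; Decrease: forall x. -x <= 0  implies  c*(x - 1) + d <= c*x + d - delta,
; which simplifies (independently of the guard) to c >= delta.
(assert (>= c delta))
(assert (> delta 0))

(check-sat)  ; sat, e.g. c = 1, d = 0, delta = 1, lambda = 1
(get-model)
```

Any model of such a query yields a ranking function; here c = 1, d = 0 gives
f(x) = x, proving termination. The actual script plays the same game with a
7-phase template and thousands of constants (2913 declared below), which is
why several solvers time out on it.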
| Property | Value |
|---|---|
| Size (bytes) | 3983633 |
| Compressed Size (bytes) | 225971 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | industrial |
| First Occurrence | 2014-07-21 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| Strict Dolmen OK | — |
| check-sat calls | 1 |
| Status | sat |
| Inferred Status | sat |
| Size (bytes) | 3983625 |
| Compressed Size (bytes) | 225956 |
| Max. Term Depth | 179 |
| Asserts | 199 |
| Declared Functions | 0 |
| Declared Constants | 2913 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
Symbols

| Symbol | Count |
|---|---|
| or | 2232 |
| and | 2232 |
| = | 41020 |
| let | 6741 |
| + | 45824 |
| - | 7764 |
| * | 28044 |
| < | 1816 |
| <= | 2072 |
| > | 327 |
| >= | 2208 |
Evaluations

The tables below list, per SMT-COMP edition, each participating solver variant
with its result and running times in seconds. The rating appears to be the
fraction of participating solvers (counting each solver once, not per variant)
that failed to solve the benchmark; e.g., for 2018, 4 of 9 solvers answered
sat, giving 1 - 4/9 ≈ 0.56.
SMT-COMP 2017 (rating 1.00, 0/8 solvers solved)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| CVC4 | CVC4-smtcomp2017-main default | unknown ❌ | 600.02300 | 593.31000 |
| MathSAT | mathsat-5.4.1-linux-x86_64-Main default | unknown ❌ | 600.01400 | 599.89000 |
| OpenSMT | opensmt2-2017-06-04 default | unknown ❌ | 600.01300 | 599.91000 |
| SMTInterpol | SMTInterpol default | unknown ❌ | 600.08300 | 693.00000 |
| SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.01200 | 599.88800 |
| veriT | veriT-2017-06-17 default | unknown ❌ | 600.02700 | 599.87900 |
| Yices2 | Yices2-Main default | unknown ❌ | 600.10600 | 599.96000 |
| Z3 | z3-4.5.0 default | unknown ❌ | 600.10500 | 599.91000 |
SMT-COMP 2018 (rating 0.56, 4/9 solvers solved)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| Ctrl-Ergo | Ctrl-Ergo-SMTComp-2018_default | sat ✅ | 372.00500 | 1478.32000 |
| CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 860.01900 | 852.27600 |
| MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | unknown ❌ | 1200.02000 | 1199.82000 |
| OpenSMT | opensmt2_default | unknown ❌ | 1200.02000 | 1200.00000 |
| SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | sat ✅ | 572.35900 | 662.99800 |
| SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.01000 | 1199.96000 |
| SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.02000 | 1199.87000 |
| veriT | veriT_default | unknown ❌ | 1200.02000 | 1199.91000 |
| Yices2 | Yices 2.6.0_default | sat ✅ | 123.09600 | 123.09500 |
| Z3 | z3-4.7.1_default | unknown ❌ | 1200.01000 | 1199.76000 |
SMT-COMP 2019 (rating 0.50, 4/8 solvers solved)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| Ctrl-Ergo | Ctrl-Ergo-2019-wrapped-sq_default | sat ✅ | 506.52700 | 2007.84000 |
| CVC4 | CVC4-2019-06-03-d350fe1-wrapped-sq_default | sat ✅ | 1127.17000 | 1124.34000 |
| CVC4 | CVC4-SymBreak_03_06_2019-wrapped-sq_default | sat ✅ | 1516.63000 | 1515.02000 |
| CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 920.20300 | 918.04000 |
| OpenSMT | OpenSMT-wrapped-sq_default | unknown ❌ | 2400.02000 | 2399.92000 |
| Par4 | Par4-wrapped-sq_default | sat ✅ | 325.54900 | 1286.33000 |
| SMTInterpol | smtinterpol-2.5-514-wrapped-sq_default | sat ✅ | 898.59100 | 926.88300 |
| veriT | veriT-wrapped-sq_default | unknown ❌ | 2400.01000 | 2399.81000 |
| Yices2 | Yices 2.6.2-wrapped-sq_default | unknown ❌ | 2400.10000 | 2399.85000 |
| Z3 | z3-4.8.4-d6df51951f4c-wrapped-sq_default | unknown ❌ | 2400.04000 | 2399.49000 |
SMT-COMP 2020 (rating 0.25, 6/8 solvers solved)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| CVC4 | CVC4-sq-final_default | sat ✅ | 946.24500 | 944.30100 |
| MathSAT | MathSAT5_default.sh | unknown ❌ | 1200.10000 | 1199.87000 |
| OpenSMT | OpenSMT_default | sat ✅ | 391.94700 | 391.91600 |
| Par4 | Par4-wrapped-sq_default | sat ✅ | 208.67400 | 822.28000 |
| SMTInterpol | smtinterpol-2.5-679-gacfde87a_default | sat ✅ | 486.55400 | 516.66100 |
| veriT | veriT_default | unknown ❌ | 1200.02000 | 1199.90000 |
| Yices2 | Yices 2.6.2 bug fix_default | sat ✅ | 828.99000 | 828.92400 |
| Z3 | z3-4.8.8_default | sat ✅ | 329.97900 | 329.89600 |
SMT-COMP 2021 (rating 0.67, 2/6 solvers solved)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| MathSAT | mathsat-5.6.6_default | unknown ❌ | 1200.07000 | 1199.78000 |
| mc2 | mc2 2021-06-07_default.sh | unknown ❌ | 13.93030 | 13.92910 |
| Par4 | Par4-wrapped-sq_default | sat ✅ | 207.61200 | 817.56000 |
| SMTInterpol | smtinterpol-2.5-823-g881e8631_default | unknown ❌ | 1200.14000 | 1254.06000 |
| veriT | veriT_default | unknown ❌ | 1200.08000 | 1199.81000 |
| Z3 | z3-4.8.11_default | sat ✅ | 1145.38000 | 1145.34000 |
SMT-COMP 2022 (rating 0.60, 2/5 solvers solved)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | unknown ❌ | 1200.03000 | 1197.51000 |
| MathSAT | MathSAT-5.6.8_default | unknown ❌ | 1200.02000 | 1199.88000 |
| veriT | veriT_default | unknown ❌ | 1200.01000 | 1199.94000 |
| Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 292.37500 | 292.32600 |
| Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 290.08900 | 290.03600 |
| Z3 | z3-4.8.17_default | sat ✅ | 828.54800 | 828.28600 |
SMT-COMP 2023 (no rating reported)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 666.01200 | 663.48100 |
| OpenSMT | OpenSMT a78dcf01_default | sat ✅ | 432.32100 | 432.25500 |
| SMTInterpol | smtinterpol-2.5-1272-g2d6d356c_default | sat ✅ | 760.77600 | 797.49500 |
| Yaga | Yaga_SMT-COMP-2023_presubmition_default | sat ✅ | 345.28400 | 345.23500 |
| Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 169.11000 | 169.08300 |
| Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 165.65800 | 165.64600 |
| Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 165.37300 | 165.35300 |
SMT-COMP 2025 (rating 0.17, 5/6 solvers solved)

| Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|
| cvc5 | cvc5 | sat ✅ | 860.71611 | 860.51449 |
| OpenSMT | OpenSMT | sat ✅ | 158.44121 | 158.29684 |
| SMTInterpol | SMTInterpol | sat ✅ | 258.41185 | 285.14244 |
| Yices2 | Yices2 | unknown ❌ | 1201.28839 | 1200.99056 |
| Z3alpha | Z3-alpha | sat ✅ | 1137.84161 | 4546.41596 |
| Z3 | Z3-alpha-base | sat ✅ | 391.76699 | 391.56721 |