Benchmark
non-incremental/QF_NRA/LassoRanker/CooperatingT2/dropbuf.t2.c_Iteration2_Loop_3-pieceTemplate.smt2
SMT script generated by Ultimate LassoRanker [1]
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
For generating these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].
This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only the SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.
2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)
[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. Accepted at CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving
through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
| Benchmark |
| Size | 254137 |
| Compressed Size | 16055 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | industrial |
| First Occurrence | 2014-07-21 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| strict Dolmen OK | 1 |
| check-sat calls | 1 |
| Status | sat |
| Inferred Status | sat |
| Size | 254129 |
| Compressed Size | 16271 |
| Max. Term Depth | 10 |
| Asserts | 105 |
| Declared Functions | 0 |
| Declared Constants | 1463 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
Symbols
| or | 274 |
| and | 104 |
| = | 1618 |
| let | 188 |
| + | 2510 |
| - | 1548 |
| * | 3798 |
| < | 104 |
| <= | 104 |
| > | 105 |
| >= | 1340 |
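
The statistics above describe a single QF_NRA query: 1463 declared constants, 105 asserts built from linear arithmetic plus a large number of nonlinear products (3798 occurrences of `*`), and exactly one check-sat call. The sketch below is not an excerpt from the benchmark; it is a minimal, hypothetical script of the same general shape, with made-up identifiers, showing how such a constraint system is laid out in SMT-LIB.

```smt2
; Illustrative sketch only; not an excerpt from the benchmark.
; The identifiers are hypothetical. The real script declares 1463 constants
; and asserts 105 constraints before its single (check-sat).
(set-logic QF_NRA)
(set-info :status sat)
(declare-fun lambda_0 () Real)
(declare-fun delta_0 () Real)
(declare-fun c_x () Real)
; A mix of linear atoms and nonlinear products of unknowns, mirroring the
; operators counted in the symbol table above.
(assert (> delta_0 0.0))
(assert (>= (+ (* lambda_0 c_x) delta_0) 0.0))
(assert (<= (- (* lambda_0 lambda_0) (* 2.0 delta_0)) 10.0))
(check-sat)
(exit)
```

The benchmark's recorded status is sat, and the evaluation table below lists which competition solvers reproduced that answer.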
Evaluations

| Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
|---|---|---|---|---|---|---|
| SMT-COMP 2014 | 0.75 (1/4) | CVC3 | CVC3 default | unknown ❌ | 2310.52000 | 2311.30000 |
| | | CVC4 | CVC4 f7118b2 default | unknown ❌ | 0.43584 | 0.42593 |
| | | raSAT | raSAT-main-track-final default.sh | unknown ❌ | 0.02656 | 0.02100 |
| | | Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | sat ✅ | 0.68682 | 0.67890 |
| SMT-COMP 2015 | 0.83 (1/6) | CVC3 | CVC3 default | unknown ❌ | 1972.24000 | 1972.86000 |
| | | CVC4 | CVC4-master-2015-06-15-9b32405-main default | unknown ❌ | 0.44938 | 0.44793 |
| | | | CVC4-experimental-2015-06-15-ff5745a-main default | unknown ❌ | 0.45309 | 0.45093 |
| | | raSAT | raSAT default.sh | unknown ❌ | 2400.01000 | 2400.82000 |
| | | SMT-RAT | SMT-RAT-final default | unknown ❌ | 2400.01000 | 2401.00000 |
| | | Yices2 | Yices2-NL default | unknown ❌ | 2400.01000 | 2400.68000 |
| | | Z3 | z3 4.4.0 default | sat ✅ | 1.29888 | 1.29880 |
| SMT-COMP 2016 | 1.00 (0/5) | CVC4 | CVC4-master-2016-05-27-cfef263-main default | unknown ❌ | 0.43312 | 0.43353 |
| | | raSAT | raSAT 0.3 default.sh | unknown ❌ | 2400.02000 | 2401.37000 |
| | | | raSAT 0.4 exp - final default.py | unknown ❌ | 2400.03000 | 4814.10000 |
| | | SMT-RAT | SMT-RAT default | unknown ❌ | 2400.06000 | 2401.43000 |
| | | Yices2 | Yices-2.4.2 default | unknown ❌ | 2400.12000 | 2401.30000 |
| | | Z3 | z3-4.4.1 default | unknown ❌ | 2400.02000 | 2401.30000 |
| SMT-COMP 2017 | 0.40 (3/5) | CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 0.88457 | 0.88472 |
| | | SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.08400 | 600.01600 |
| | | veriT | veriT+raSAT+Redlog default | unknown ❌ | 600.05000 | 752.15000 |
| | | Yices2 | Yices2-Main default | sat ✅ | 10.38430 | 10.38370 |
| | | Z3 | z3-4.5.0 default | sat ✅ | 10.34920 | 10.34830 |
| SMT-COMP 2018 | 0.80 (1/5) | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 6.64544 | 6.64525 |
| | | SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.02000 | 1199.87000 |
| | | | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.11000 | 1199.90000 |
| | | veriT | veriT+raSAT+Reduce_default | unknown ❌ | 1200.02000 | 1199.93000 |
| | | Yices2 | Yices 2.6.0_default | unknown ❌ | 1200.03000 | 1199.83000 |
| | | Z3 | z3-4.7.1_default | unknown ❌ | 1200.11000 | 1200.00000 |
| SMT-COMP 2019 | 0.29 (5/7) | CVC4 | CVC4-2019-06-03-d350fe1-wrapped-sq_default | sat ✅ | 4.37902 | 4.37939 |
| | | | CVC4-SymBreak_03_06_2019-wrapped-sq_default | sat ✅ | 6.75989 | 6.76025 |
| | | MathSAT | mathsat-20190601-wrapped-sq_default | sat ✅ | 1.02213 | 1.02198 |
| | | | mathsat-na-20190601-wrapped-sq_default | sat ✅ | 1.03567 | 1.03553 |
| | | Par4 | Par4-wrapped-sq_default | sat ✅ | 3.15874 | 9.25000 |
| | | SMT-RAT | SMTRAT-5-wrapped-sq_default | unknown ❌ | 2400.05000 | 2399.79000 |
| | | | SMTRAT-MCSAT-4-wrapped-sq_default | unknown ❌ | 2400.05000 | 2399.82000 |
| | | veriT | veriT+raSAT+Redlog-wrapped-sq_default | unknown ❌ | 2400.02000 | 2400.10000 |
| | | Yices2 | Yices 2.6.2-wrapped-sq_default | sat ✅ | 10.72240 | 10.72160 |
| | | Z3 | z3-4.8.4-d6df51951f4c-wrapped-sq_default | sat ✅ | 3.29802 | 3.29823 |
| | | | z3-4.7.1_default | sat ✅ | 3.17147 | 3.17139 |
| SMT-COMP 2021 | 0.70 (3/10) | MathSAT | mathsat-5.6.6_default | sat ✅ | 1.89763 | 1.89760 |
| | | Par4 | Par4-wrapped-sq_default | sat ✅ | 2.75392 | 8.02000 |
| | | SMT-RAT | smtrat-MCSAT_default | unknown ❌ | 1200.10000 | 1200.02000 |
| | | veriT | veriT+raSAT+Redlog_default | unknown ❌ | 1200.10000 | 1200.01000 |
| | | Z3 | z3-4.8.11_default | sat ✅ | 1.77946 | 1.77953 |
| SMT-COMP 2022 | 0.33 (6/9) | cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | sat ✅ | 4.08293 | 4.08364 |
| | | MathSAT | MathSAT-5.6.8_default | sat ✅ | 2.08436 | 2.08408 |
| | | NRA-LS | NRA-LS-FINAL_default | sat ✅ | 10.68550 | 10.68000 |
| | | Par4 | Par4-wrapped-sq_default | sat ✅ | 1.38844 | 3.86000 |
| | | SMT-RAT | SMT-RAT-MCSAT_default | unknown ❌ | 1200.03000 | 1199.82000 |
| | | veriT | veriT+raSAT+Redlog_default | unknown ❌ | 1200.08000 | 1200.00000 |
| | | Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | unknown ❌ | 1200.03000 | 1199.81000 |
| | | Z3 | z3-4.8.17_default | sat ✅ | 1.22250 | 1.22428 |
| | | Z3++ | z3++0715_default | sat ✅ | 4.07474 | 4.07461 |
| SMT-COMP 2023 | | cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 2.57965 | 2.58003 |
| | | NRA-LS | cvc5-NRA-LS-sq_default | sat ✅ | 10.90840 | 10.90600 |
| | | Par4 | Par4-wrapped-sq_default | sat ✅ | 4.79720 | 14.28000 |
| | | SMT-RAT | SMT-RAT-MCSAT_default | sat ✅ | 614.17100 | 614.14000 |
| | | Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 21.43130 | 21.42730 |
| | | Z3alpha | z3alpha_default | sat ✅ | 0.89590 | 0.89617 |
| | | Z3++ | z3++0715_default | sat ✅ | 25.81390 | 25.81050 |
| | | | Z3++_sq_0526_default | sat ✅ | 25.83500 | 25.83240 |
| SMT-COMP 2024 | 0.20 (4/5) | cvc5 | cvc5 | sat ✅ | 1.67594 | 1.57629 |
| | | SMTInterpol | SMTInterpol | unknown ❌ | 1.42881 | 3.92462 |
| | | SMT-RAT | SMT-RAT | sat ✅ | 328.41725 | 328.17074 |
| | | Yices2 | Yices2 | sat ✅ | 2.04906 | 1.94742 |
| | | Z3alpha | Z3-alpha | sat ✅ | 1.05485 | 0.95372 |
| SMT-COMP 2025 | 0.17 (5/6) | cvc5 | cvc5 | sat ✅ | 1.41033 | 1.29082 |
| | | SMTInterpol | SMTInterpol | unknown ❌ | 1.29120 | 3.31099 |
| | | SMT-RAT | SMT-RAT | sat ✅ | 493.35780 | 493.15333 |
| | | Yices2 | Yices2 | sat ✅ | 3.81874 | 3.69056 |
| | | Z3alpha | Z3-alpha | sat ✅ | 1.54501 | 2.52062 |
| | | Z3 | Z3-alpha-base | sat ✅ | 0.90182 | 0.78389 |
| | | | z3siri-base | sat ✅ | 0.90811 | 0.78771 |