Benchmark
non-incremental/QF_NRA/LassoRanker/CooperatingT2/mc91test.t2.c_Iteration2_Loop_4-pieceTemplate.smt2
SMT script generated by Ultimate LassoRanker [1]
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
For generating these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].
This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
2014 [6], available at [7],
- the benchmarks from [8], available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
The set contains only the SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.
2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)
[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination Analysis by Learning Terminating Programs. In CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
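The statistics below characterize the script: a single check-sat call over QF_NRA, 64 asserts, 809 declared constants, and heavy use of nonlinear multiplication (2190 occurrences of *). For orientation only, here is a minimal hand-written sketch of the general shape of such a script; the constant names and constraints are hypothetical and are not taken from the benchmark itself, which is far larger (see the Size and Asserts fields below).

```
; Illustrative sketch only -- not an excerpt of this benchmark.
; It mirrors the general shape of a generated QF_NRA script:
; declared Real constants, nonlinear assertions, one (check-sat).
(set-logic QF_NRA)
(declare-fun c () Real)       ; hypothetical template coefficient
(declare-fun delta () Real)   ; hypothetical decrease bound
(assert (> delta 0.0))
(assert (>= c 0.0))
(assert (>= (* c c) delta))   ; nonlinear constraint over the reals
(check-sat)                   ; a solver should report sat here
(exit)
```

The Status and Inferred Status fields below record the expected answer for the actual benchmark, which is also sat.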
| Benchmark |
| Size | 137783 |
| Compressed Size | 8684 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | industrial |
| First Occurrence | 2014-07-21 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| strict Dolmen OK | 1 |
| check-sat calls | 1 |
| Status | sat |
| Inferred Status | sat |
| Size | 137775 |
| Compressed Size | 8920 |
| Max. Term Depth | 10 |
| Asserts | 64 |
| Declared Functions | 0 |
| Declared Constants | 809 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
Symbols
| or | 161 |
| and | 63 |
| = | 764 |
| let | 109 |
| + | 1294 |
| - | 866 |
| * | 2190 |
| < | 63 |
| <= | 63 |
| > | 64 |
| >= | 707 |
Evaluations

| Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
| SMT-COMP 2014 | 1.00 (0/4) | CVC3 | CVC3 default | unknown ❌ | 208.93200 | 208.89200 |
| | | CVC4 | CVC4 f7118b2 default | unknown ❌ | 0.24598 | 0.23496 |
| | | raSAT | raSAT-main-track-final default.sh | unknown ❌ | 0.02243 | 0.00900 |
| | | Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | unknown ❌ | 2399.02000 | 2400.09000 |
| SMT-COMP 2015 | 0.67 (2/6) | CVC3 | CVC3 default | unknown ❌ | 190.98400 | 191.07900 |
| | | CVC4 | CVC4-master-2015-06-15-9b32405-main default | unknown ❌ | 0.24722 | 0.24496 |
| | | | CVC4-experimental-2015-06-15-ff5745a-main default | unknown ❌ | 0.24835 | 0.24596 |
| | | raSAT | raSAT default.sh | unknown ❌ | 2400.01000 | 2400.87000 |
| | | SMT-RAT | SMT-RAT-final default | unknown ❌ | 2400.01000 | 2401.02000 |
| | | Yices2 | Yices2-NL default | sat ✅ | 82.01010 | 82.04450 |
| | | Z3 | z3 4.4.0 default | sat ✅ | 2.14627 | 2.14667 |
| SMT-COMP 2016 | 0.80 (1/5) | CVC4 | CVC4-master-2016-05-27-cfef263-main default | unknown ❌ | 0.24118 | 0.24147 |
| | | raSAT | raSAT 0.3 default.sh | unknown ❌ | 2400.12000 | 2401.67000 |
| | | | raSAT 0.4 exp - final default.py | unknown ❌ | 2400.10000 | 4812.55000 |
| | | SMT-RAT | SMT-RAT default | unknown ❌ | 2400.05000 | 2401.40000 |
| | | Yices2 | Yices-2.4.2 default | sat ✅ | 28.61090 | 28.62980 |
| | | Z3 | z3-4.4.1 default | unknown ❌ | 2400.10000 | 2401.36000 |
| SMT-COMP 2017 | 0.40 (3/5) | CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 0.35124 | 0.34991 |
| | | SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.07200 | 600.00900 |
| | | veriT | veriT+raSAT+Redlog default | unknown ❌ | 600.04400 | 718.22000 |
| | | Yices2 | Yices2-Main default | sat ✅ | 15.44050 | 15.44050 |
| | | Z3 | z3-4.5.0 default | sat ✅ | 1.01474 | 1.01408 |
| SMT-COMP 2018 | 0.40 (3/5) | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 2.79285 | 2.79290 |
| | | SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.02000 | 1199.85000 |
| | | | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.10000 | 1200.10000 |
| | | veriT | veriT+raSAT+Reduce_default | unknown ❌ | 1200.01000 | 1199.89000 |
| | | Yices2 | Yices 2.6.0_default | sat ✅ | 8.24041 | 8.23997 |
| | | Z3 | z3-4.7.1_default | sat ✅ | 0.25192 | 0.25188 |
| SMT-COMP 2022 | 0.22 (7/9) | cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | sat ✅ | 1.75026 | 1.75070 |
| | | MathSAT | MathSAT-5.6.8_default | sat ✅ | 0.74413 | 0.74406 |
| | | NRA-LS | NRA-LS-FINAL_default | sat ✅ | 4.19731 | 4.19702 |
| | | Par4 | Par4-wrapped-sq_default | sat ✅ | 0.23996 | 0.00629 |
| | | SMT-RAT | SMT-RAT-MCSAT_default | unknown ❌ | 1200.02000 | 1199.95000 |
| | | veriT | veriT+raSAT+Redlog_default | unknown ❌ | 1200.02000 | 1199.94000 |
| | | Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 9.89433 | 9.89384 |
| | | Z3 | z3-4.8.17_default | sat ✅ | 0.25228 | 0.25421 |
| | | Z3++ | z3++0715_default | sat ✅ | 0.53744 | 0.53746 |
| SMT-COMP 2023 | | cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 0.75330 | 0.75373 |
| | | NRA-LS | cvc5-NRA-LS-sq_default | sat ✅ | 5.04463 | 5.03925 |
| | | Par4 | Par4-wrapped-sq_default | sat ✅ | 2.44436 | 7.21000 |
| | | SMT-RAT | SMT-RAT-MCSAT_default | sat ✅ | 339.28300 | 339.25300 |
| | | Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 5.30088 | 5.30034 |
| | | Z3alpha | z3alpha_default | sat ✅ | 33.98480 | 67.86410 |
| | | Z3++ | z3++0715_default | sat ✅ | 0.50433 | 0.50438 |
| | | | Z3++_sq_0526_default | sat ✅ | 0.51581 | 0.51576 |
| SMT-COMP 2024 | 0.20 (4/5) | cvc5 | cvc5 | sat ✅ | 0.88312 | 0.78360 |
| | | SMTInterpol | SMTInterpol | unknown ❌ | 1.13081 | 2.84307 |
| | | SMT-RAT | SMT-RAT | sat ✅ | 183.02350 | 182.92213 |
| | | Yices2 | Yices2 | sat ✅ | 1.50744 | 1.40764 |
| | | Z3alpha | Z3-alpha | sat ✅ | 96.74252 | 96.64088 |