Benchmark
non-incremental/QF_NRA/LassoRanker/Ultimate/ChenFlurMukhopadhyay-2012SAS-Fig1.bpl_Iteration1_Loop_3-pieceTemplate.smt2
SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
For generating these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].
This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
  2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only the SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.
2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)
[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. Accepted at CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving
through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/

| Benchmark | |
| --- | --- |
| Size | 47707 |
| Compressed Size | 4268 |
| License | Creative Commons Attribution 4.0 International (CC-BY-4.0) |
| Category | industrial |
| First Occurrence | 2014-07-21 |
| Generated By | — |
| Generated On | — |
| Generator | — |
| Dolmen OK | 1 |
| strict Dolmen OK | 1 |
| check-sat calls | 1 |
| Status | unknown |
| Inferred Status | unsat |
| Size | 47699 |
| Compressed Size | 4274 |
| Max. Term Depth | 10 |
| Asserts | 40 |
| Declared Functions | 0 |
| Declared Constants | 318 |
| Declared Sorts | 0 |
| Defined Functions | 0 |
| Defined Recursive Functions | 0 |
| Defined Sorts | 0 |
| Constants | 0 |
| Declared Datatypes | 0 |
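
The status recorded in the script is unknown, while the inferred status (and the competition results below) is unsat. A minimal sketch for re-checking this locally, assuming a z3 or cvc5 binary is on the PATH and the benchmark file from the path at the top of this page has been downloaded into the working directory:

```python
import subprocess

# Hypothetical local filename, taken from the benchmark path above.
BENCHMARK = "ChenFlurMukhopadhyay-2012SAS-Fig1.bpl_Iteration1_Loop_3-pieceTemplate.smt2"

for solver in ("z3", "cvc5"):
    try:
        # Both solvers accept an .smt2 file as argument and print sat/unsat/unknown.
        proc = subprocess.run([solver, BENCHMARK], capture_output=True,
                              text=True, timeout=1200)
        out = proc.stdout.strip()
        print(f"{solver}: {out.splitlines()[-1] if out else proc.stderr.strip()}")
    except FileNotFoundError:
        print(f"{solver}: not installed")
    except subprocess.TimeoutExpired:
        print(f"{solver}: timeout after 1200 s")
```
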
Symbols

| Symbol | Count |
| --- | --- |
| or | 98 |
| and | 39 |
| = | 300 |
| let | 68 |
| + | 396 |
| - | 239 |
| * | 578 |
| < | 39 |
| <= | 39 |
| > | 40 |
| >= | 280 |
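
For reference, a rough sketch of how per-symbol counts like those above could be tallied from the .smt2 file. It assumes a naive whitespace/parenthesis tokenization, so the numbers may differ slightly from the table, which was produced by an unspecified tool:

```python
import re
from collections import Counter

# Symbols tracked in the table above.
TRACKED = {"or", "and", "=", "let", "+", "-", "*", "<", "<=", ">", ">="}

def count_symbols(path):
    counts = Counter()
    with open(path) as f:
        for line in f:
            line = line.split(";", 1)[0]                 # drop SMT-LIB line comments
            for token in re.findall(r"[^\s()]+", line):  # split on whitespace and parens
                if token in TRACKED:
                    counts[token] += 1
    return counts

# Hypothetical local filename, taken from the benchmark path above.
print(count_symbols("ChenFlurMukhopadhyay-2012SAS-Fig1.bpl_Iteration1_Loop_3-pieceTemplate.smt2"))
```
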
Evaluations

| Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s) |
| --- | --- | --- | --- | --- | --- | --- |
| SMT-COMP 2017 | 0.80 (1/5) | CVC4 | CVC4-smtcomp2017-main default | unsat ✅ | 33.19420 | 33.18130 |
| | | SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.03400 | 600.02500 |
| | | veriT | veriT+raSAT+Redlog default | unknown ❌ | 600.13900 | 660.89000 |
| | | Yices2 | Yices2-Main default | unknown ❌ | 600.04700 | 599.98000 |
| | | Z3 | z3-4.5.0 default | unknown ❌ | 600.02100 | 599.89000 |
| SMT-COMP 2018 | 0.80 (1/5) | CVC4 | master-2018-06-10-b19c840-competition-default_default | unsat ✅ | 0.96435 | 0.96455 |
| | | SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.02000 | 1199.87000 |
| | | SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.09000 | 1199.90000 |
| | | veriT | veriT+raSAT+Reduce_default | unknown ❌ | 1200.02000 | 1199.96000 |
| | | Yices2 | Yices 2.6.0_default | unknown ❌ | 1200.02000 | 1199.96000 |
| | | Z3 | z3-4.7.1_default | unknown ❌ | 1200.09000 | 1199.88000 |
| SMT-COMP 2022 | 0.44 (5/9) | cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | unsat ✅ | 6.01399 | 6.01433 |
| | | MathSAT | MathSAT-5.6.8_default | unsat ✅ | 2.22130 | 2.22098 |
| | | NRA-LS | NRA-LS-FINAL_default | unsat ✅ | 1.38210 | 1.38211 |
| | | Par4 | Par4-wrapped-sq_default | unsat ✅ | 1.46166 | 3.41000 |
| | | SMT-RAT | SMT-RAT-MCSAT_default | unknown ❌ | 1200.09000 | 1199.98000 |
| | | veriT | veriT+raSAT+Redlog_default | unknown ❌ | 1200.02000 | 1200.01000 |
| | | Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | unknown ❌ | 1200.02000 | 1199.87000 |
| | | Z3 | z3-4.8.17_default | unknown ❌ | 1200.02000 | 1199.91000 |
| | | Z3++ | z3++0715_default | unsat ✅ | 214.90400 | 214.88200 |
| SMT-COMP 2023 | | cvc5 | cvc5-default-2023-05-16-ea045f305_sq | unsat ✅ | 3.13965 | 3.14011 |
| | | NRA-LS | cvc5-NRA-LS-sq_default | unsat ✅ | 1.36855 | 1.36857 |
| | | Par4 | Par4-wrapped-sq_default | unsat ✅ | 0.91932 | 2.44000 |
| | | SMT-RAT | SMT-RAT-MCSAT_default | unsat ✅ | 187.32200 | 187.29400 |
| | | Yices2 | Yices 2 for SMTCOMP 2023_default | unsat ✅ | 30.30880 | 30.30480 |
| | | Z3alpha | z3alpha_default | unsat ✅ | 267.08800 | 267.07700 |
| | | Z3++ | z3++0715_default | unsat ✅ | 210.04800 | 210.01900 |
| | | Z3++ | Z3++_sq_0526_default | unsat ✅ | 282.48900 | 282.48500 |
| SMT-COMP 2024 | 0.40 (3/5) | cvc5 | cvc5 | unsat ✅ | 2.81380 | 2.71407 |
| | | SMTInterpol | SMTInterpol | unknown ❌ | 0.77351 | 1.71843 |
| | | SMT-RAT | SMT-RAT | unsat ✅ | 66.84347 | 66.73077 |
| | | Yices2 | Yices2 | unknown ❌ | 1201.27813 | 1200.54892 |
| | | Z3alpha | Z3-alpha | unsat ✅ | 97.15455 | 97.05358 |
| SMT-COMP 2025 | 0.17 (5/6) | cvc5 | cvc5 | unsat ✅ | 1.53701 | 1.41449 |
| | | SMTInterpol | SMTInterpol | unknown ❌ | 0.71354 | 1.49688 |
| | | SMT-RAT | SMT-RAT | unsat ✅ | 62.79815 | 62.65937 |
| | | Yices2 | Yices2 | unsat ✅ | 5.59650 | 5.47589 |
| | | Z3alpha | Z3-alpha | unsat ✅ | 1.37287 | 3.24066 |
| | | Z3 | Z3-alpha-base | unsat ✅ | 51.08052 | 50.94210 |
| | | Z3 | z3siri-base | unsat ✅ | 51.06704 | 50.94734 |
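
The Rating column is not defined on this page; for every year where a rating is shown, it matches the fraction of participating solvers (counting solvers, not submitted variants) that failed to solve the benchmark. A small sketch re-deriving it from the solved/total counts printed next to each rating above:

```python
# (solved solvers, participating solvers, listed rating) per evaluation,
# copied from the table above; 2023 lists no rating and is omitted.
results = {
    "SMT-COMP 2017": (1, 5, 0.80),
    "SMT-COMP 2018": (1, 5, 0.80),
    "SMT-COMP 2022": (5, 9, 0.44),
    "SMT-COMP 2024": (3, 5, 0.40),
    "SMT-COMP 2025": (5, 6, 0.17),
}

for year, (solved, total, listed) in results.items():
    derived = round((total - solved) / total, 2)  # fraction of solvers that did not solve it
    print(f"{year}: derived {derived:.2f} vs listed {listed:.2f}")
```
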