Benchmark

non-incremental/QF_NRA/LassoRanker/Ultimate/piecewise.bpl_Iteration1_Loop_4-pieceTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs; it implements the techniques presented in [2] and [3].
To generate these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
  2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
The set contains only those SMT scripts that we considered difficult: on each
of them, LassoRanker ran into a timeout after 10 seconds. A minimal sketch of
the kind of constraint system these scripts encode is given after the
references below.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. In CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving
through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
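
The script encodes the search for coefficients of a ranking-function template
(per the file name, apparently a 4-piece instance of the piecewise template
from [2]) as a satisfiability problem; products of unknown template
coefficients and unknown Farkas multipliers are what land it in QF_NRA. The
following miniature script is a hypothetical sketch of that encoding for a
plain affine template and the toy loop  while (x >= 1) x := x - 1.  It is not
an excerpt of this benchmark, and all identifiers in it are invented for
illustration.

; Sketch only: synthesize f(x) = a*x + b, bounded and decreasing on the loop.
(set-logic QF_NRA)
(declare-const a Real) ; template coefficient
(declare-const b Real) ; template constant
(declare-const m Real) ; Farkas multiplier for the guard x - 1 >= 0

; Boundedness: forall x. x - 1 >= 0 implies a*x + b >= 0.
; By Farkas' lemma this holds iff a*x + b = m*(x - 1) + c for some
; m >= 0 and c >= 0, i.e. iff a = m and b >= -m.
(assert (>= m 0))
(assert (= a m))
(assert (>= b (- m)))

; Decrease: with the update x' = x - 1 we get f(x) - f(x') = a,
; so it suffices to require a >= 1.
(assert (>= a 1))

; This toy instance happens to be linear; in the real piecewise-template
; scripts the multipliers are multiplied with unknown coefficients of the
; template's pieces, which makes the asserts genuinely nonlinear.
(check-sat) ; sat, e.g. a = m = 1 and b = -1, i.e. f(x) = x - 1
(get-model)

A solver answering sat, as several in the Evaluations table below eventually
do, yields concrete template coefficients and hence a ranking function.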
Benchmark
Size: 322547 bytes
Compressed Size: 16448 bytes
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: sat
Inferred Status: sat
Size: 322539 bytes
Compressed Size: 16431 bytes
Max. Term Depth: 10
Asserts: 316
Declared Functions: 0
Declared Constants: 1999
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols (occurrence counts)

or: 861    and: 315    =: 1954    let: 579
+: 2251    -: 1632     *: 3420    <: 315
<=: 315    >: 316      >=: 1911

Evaluations

Solver | Variant | Result | Wallclock (s) | CPU time (s)

SMT-COMP 2014, rating 1.00 (solved by 0 of 4 solvers):
CVC3 | CVC3 default | unknown ❌ | 2400.58000 | 2401.42000
CVC4 | CVC4 f7118b2 default | unknown ❌ | 0.57712 | 0.56691
raSAT | raSAT-main-track-final default.sh | unknown ❌ | 0.06248 | 0.04899
Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | unknown ❌ | 2399.32000 | 2400.10000

SMT-COMP 2015, rating 1.00 (solved by 0 of 6 solvers):
CVC3 | CVC3 default | unknown ❌ | 1138.37000 | 1138.68000
CVC4 | CVC4-master-2015-06-15-9b32405-main default | unknown ❌ | 0.59351 | 0.59191
CVC4 | CVC4-experimental-2015-06-15-ff5745a-main default | unknown ❌ | 0.59842 | 0.59691
raSAT | raSAT default.sh | unknown ❌ | 2400.01000 | 2400.45000
SMT-RAT | SMT-RAT-final default | unknown ❌ | 2400.01000 | 2400.84000
Yices2 | Yices2-NL default | unknown ❌ | 2400.01000 | 2400.87000
Z3 | z3 4.4.0 default | unknown ❌ | 2400.02000 | 2400.84000

SMT-COMP 2016, rating 0.80 (solved by 1 of 5 solvers):
CVC4 | CVC4-master-2016-05-27-cfef263-main default | unknown ❌ | 0.62145 | 0.62187
raSAT | raSAT 0.3 default.sh | unknown ❌ | 2400.10000 | 2401.46000
raSAT | raSAT 0.4 exp - final default.py | unknown ❌ | 2400.03000 | 4813.34000
SMT-RAT | SMT-RAT default | unknown ❌ | 2400.05000 | 2401.40000
Yices2 | Yices-2.4.2 default | sat ✅ | 1244.61000 | 1245.26000
Z3 | z3-4.4.1 default | unknown ❌ | 2400.02000 | 2401.35000

SMT-COMP 2017, rating 0.80 (solved by 1 of 5 solvers):
CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 14.88930 | 14.88490
SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.08700 | 600.00000
veriT | veriT+raSAT+Redlog default | unknown ❌ | 600.03700 | 720.29000
Yices2 | Yices2-Main default | unknown ❌ | 600.01200 | 600.02300
Z3 | z3-4.5.0 default | unknown ❌ | 600.11200 | 599.90000

SMT-COMP 2018, rating 0.60 (solved by 2 of 5 solvers):
CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 12.74840 | 12.74790
SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.02000 | 1199.91000
SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.09000 | 1200.11000
veriT | veriT+raSAT+Reduce_default | unknown ❌ | 1200.10000 | 1199.96000
Yices2 | Yices 2.6.0_default | sat ✅ | 350.82900 | 350.84400
Z3 | z3-4.7.1_default | unknown ❌ | 1200.01000 | 1199.85000

SMT-COMP 2019, rating 0.43 (solved by 4 of 7 solvers):
CVC4 | CVC4-2019-06-03-d350fe1-wrapped-sq_default | sat ✅ | 12.49150 | 12.49020
CVC4 | CVC4-SymBreak_03_06_2019-wrapped-sq_default | sat ✅ | 19.79990 | 19.79360
MathSAT | mathsat-20190601-wrapped-sq_default | sat ✅ | 21.19190 | 21.19000
MathSAT | mathsat-na-20190601-wrapped-sq_default | sat ✅ | 20.70620 | 20.70730
Par4 | Par4-wrapped-sq_default | sat ✅ | 17.62660 | 52.24000
SMT-RAT | SMTRAT-5-wrapped-sq_default | unknown ❌ | 2400.02000 | 2399.78000
SMT-RAT | SMTRAT-MCSAT-4-wrapped-sq_default | unknown ❌ | 2400.08000 | 2399.94000
veriT | veriT+raSAT+Redlog-wrapped-sq_default | unknown ❌ | 2400.08000 | 2399.86000
Yices2 | Yices 2.6.2-wrapped-sq_default | sat ✅ | 532.18800 | 532.19400
Z3 | z3-4.8.4-d6df51951f4c-wrapped-sq_default | unknown ❌ | 2400.02000 | 2399.90000
Z3 | z3-4.7.1_default | unknown ❌ | 2400.02000 | 2399.60000

SMT-COMP 2023, rating 0.29 (solved by 5 of 7 solvers):
cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 23.50870 | 23.50760
NRA-LS | cvc5-NRA-LS-sq_default | sat ✅ | 26.70020 | 26.69690
Par4 | Par4-wrapped-sq_default | sat ✅ | 14.40530 | 42.89000
SMT-RAT | SMT-RAT-MCSAT_default | unknown ❌ | 1200.02000 | 1200.01000
Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 56.77840 | 56.75920
Z3alpha | z3alpha_default | unknown ❌ | 1200.10000 | 1199.96000
Z3++ | z3++0715_default | sat ✅ | 152.14300 | 152.11400
Z3++ | Z3++_sq_0526_default | sat ✅ | 117.59800 | 117.58200

SMT-COMP 2024, rating 0.40 (solved by 3 of 5 solvers):
cvc5 | cvc5 | sat ✅ | 4.19890 | 4.09945
SMTInterpol | SMTInterpol | unknown ❌ | 1.90789 | 5.28577
SMT-RAT | SMT-RAT | unknown ❌ | 1201.71269 | 1200.67202
Yices2 | Yices2 | sat ✅ | 13.26882 | 13.16843
Z3alpha | Z3-alpha | sat ✅ | 2.27659 | 2.17690

SMT-COMP 2025, rating 0.33 (solved by 4 of 6 solvers):
cvc5 | cvc5 | sat ✅ | 7.30613 | 7.18011
SMTInterpol | SMTInterpol | unknown ❌ | 1.65934 | 4.42384
SMT-RAT | SMT-RAT | unknown ❌ | 1201.27703 | 1201.02332
Yices2 | Yices2 | sat ✅ | 3.90671 | 3.78915
Z3alpha | Z3-alpha | sat ✅ | 4.09578 | 12.23234
Z3 | Z3-alpha-base | sat ✅ | 51.91422 | 51.77816
Z3 | z3siri-base | sat ✅ | 51.98738 | 51.86362