Benchmark

non-incremental/QF_LRA/LassoRanker/Ultimate/yPositive-SIscaled75.bpl_Iteration1_Loop_7-phaseTemplate.smt2

SMT script generated by Ultimate LassoRanker [1]
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
For generating these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].
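
For illustration, the following is a minimal hand-made sketch of the kind of
QF_LRA query that such ranking-function synthesis produces. It is not an
excerpt of this benchmark: the toy loop, the template f(x) = c*x + d, and all
identifiers (c, d, delta, l0..l3, m0..m3) are invented for the example. The
universally quantified ranking conditions for  while (x >= 1) x := x - 1;
are turned into an existential query over the template coefficients via
Farkas' lemma:

  ; Sketch only: synthesize f(x) = c*x + d for  while (x >= 1) x := x - 1;
  ; Transition rows over (x, x'):
  ;   g1 = x - 1 >= 0,  g2 = x - x' - 1 >= 0,  g3 = -x + x' + 1 >= 0
  (set-logic QF_LRA)
  (declare-fun c () Real)  (declare-fun d () Real)  (declare-fun delta () Real)
  (declare-fun l1 () Real) (declare-fun l2 () Real) (declare-fun l3 () Real)
  (declare-fun l0 () Real)
  (declare-fun m1 () Real) (declare-fun m2 () Real) (declare-fun m3 () Real)
  (declare-fun m0 () Real)
  ; Farkas multipliers are non-negative; the decrease must be strict.
  (assert (and (>= l1 0) (>= l2 0) (>= l3 0) (>= l0 0)
               (>= m1 0) (>= m2 0) (>= m3 0) (>= m0 0)
               (> delta 0)))
  ; Boundedness:  c*x + d  ==  l1*g1 + l2*g2 + l3*g3 + l0  (identically)
  (assert (= c (+ l1 l2 (- l3))))        ; coefficient of x
  (assert (= 0 (+ (- l2) l3)))           ; coefficient of x'
  (assert (= d (+ (- l1) (- l2) l3 l0))) ; constant term
  ; Decrease:  c*x - c*x' - delta  ==  m1*g1 + m2*g2 + m3*g3 + m0
  (assert (= c (+ m1 m2 (- m3))))        ; coefficient of x
  (assert (= (- c) (+ (- m2) m3)))       ; coefficient of x'
  (assert (= (- delta) (+ (- m1) (- m2) m3 m0)))
  (check-sat)
  ; sat, e.g. c = 1, d = 0, delta = 1, i.e. the ranking function f(x) = x

Because the transition rows are concrete numbers, all constraints are linear
in the unknowns, so the query stays within QF_LRA; this benchmark encodes the
same style of constraints on a much larger scale.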

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
  2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike, and Andreas Podelski.
Linear Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. In CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook, and Carsten Fuhs. Better Termination
Proving through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
Benchmark

Size: 4604204
Compressed Size: 225882
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1

Status: sat
Inferred Status: sat
Size: 4604196
Compressed Size: 225870
Max. Term Depth: 621
Asserts: 31
Declared Functions: 0
Declared Constants: 3010
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol | Count
or     |   438
and    |   438
=      | 31388
let    |  3733
+      | 35077
-      |  5136
*      | 12193
<      |   406
<=     |   422
>      |    39
>=     |  2456

Evaluations

Columns: Solver | Variant | Result | Wallclock (s) | CPU Time (s)

SMT-COMP 2015, rating 0.43 (4/7):
CVC4 | CVC4-master-2015-06-15-9b32405-main default | sat ✅ | 933.09900 | 933.47500
CVC4 | CVC4-experimental-2015-06-15-ff5745a-main default | sat ✅ | 936.62000 | 936.92900
MathSAT | MathSat 5.3.6 main smtcomp2015_main | unknown ❌ | 2400.01000 | 2401.05000
SMTInterpol | SMTInterpol v2.1-206-g86e9531 default | sat ✅ | 314.07500 | 697.23200
SMT-RAT | SMT-RAT-final default | sat ✅ | 77.00500 | 76.99630
veriT | veriT default | unknown ❌ | 2400.01000 | 2400.58000
Yices2 | Yices default | sat ✅ | 50.88460 | 50.89530
Z3 | z3 4.4.0 default | unknown ❌ | 2400.01000 | 2401.10000

SMT-COMP 2016, rating 0.78 (2/9):
CVC4 | CVC4-master-2016-05-27-cfef263-main default | unknown ❌ | 2400.11000 | 2390.95000
MathSAT | mathsat-5.3.11-linux-x86_64-Main default | unknown ❌ | 2400.12000 | 2401.23000
OpenSMT | OpenSMT2-2016-05-12 default | unknown ❌ | 2400.03000 | 2401.30000
SMTInterpol | smtinterpol-2.1-258-g92ab3df default | sat ✅ | 395.70700 | 560.06900
SMT-RAT | SMT-RAT default | unknown ❌ | 2400.12000 | 2401.45000
Toysmt | toysmt default | unknown ❌ | 1758.32000 | 1759.33000
veriT | veriT-dev default | unknown ❌ | 2400.11000 | 2401.22000
Yices2 | Yices-2.4.2 default | sat ✅ | 8.38923 | 8.39338
Z3 | z3-4.4.1 default | unknown ❌ | 2400.12000 | 2401.02000

SMT-COMP 2017, rating 0.88 (1/8):
CVC4 | CVC4-smtcomp2017-main default | unknown ❌ | 600.02000 | 569.27100
MathSAT | mathsat-5.4.1-linux-x86_64-Main default | unknown ❌ | 600.01600 | 599.86000
OpenSMT | opensmt2-2017-06-04 default | unknown ❌ | 600.01400 | 599.82000
SMTInterpol | SMTInterpol default | unknown ❌ | 600.05100 | 845.76000
SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.03000 | 599.91000
veriT | veriT-2017-06-17 default | unknown ❌ | 600.09100 | 599.98000
Yices2 | Yices2-Main default | sat ✅ | 40.61210 | 40.60740
Z3 | z3-4.5.0 default | unknown ❌ | 600.01400 | 599.95000

SMT-COMP 2018, rating 0.67 (3/9):
Ctrl-Ergo | Ctrl-Ergo-SMTComp-2018_default | sat ✅ | 113.59300 | 450.70000
CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 325.47000 | 315.38700
MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | unknown ❌ | 1200.01000 | 1199.71000
OpenSMT | opensmt2_default | unknown ❌ | 1200.01000 | 1199.99000
SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | unknown ❌ | 1200.06000 | 1682.09000
SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.07000 | 1200.06000
SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.12000 | 1200.03000
veriT | veriT_default | unknown ❌ | 1200.03000 | 1199.80000
Yices2 | Yices 2.6.0_default | sat ✅ | 47.88600 | 47.88380
Z3 | z3-4.7.1_default | unknown ❌ | 1200.02000 | 1200.00000

SMT-COMP 2022, rating 0.40 (3/5):
cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | sat ✅ | 728.69300 | 718.83200
MathSAT | MathSAT-5.6.8_default | unknown ❌ | 1200.02000 | 1199.46000
veriT | veriT_default | unknown ❌ | 1200.03000 | 1199.69000
Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 37.57120 | 37.57130
Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 37.58380 | 37.58060
Z3 | z3-4.8.17_default | sat ✅ | 80.05970 | 80.05380