Benchmark

non-incremental/QF_LRA/LassoRanker/Ultimate/SyntaxSupportDivision1.bpl_Iteration1_Lasso_4-pieceTemplate.smt2

SMT script generated by Ultimate LassoRanker [1]
Ultimate LassoRanker is a tool that synthesizes ranking functions for 
linear lasso programs and implements the techniques presented in [2] and [3].
To generate these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].
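
A linear ranking function f must be bounded from below on every state that
satisfies the loop guard and must decrease by at least some fixed positive
amount in every loop iteration. As a minimal illustration (not LassoRanker's
actual template-based encoding; the one-variable loop and all names are
hypothetical, using the z3 Python bindings), the sketch below verifies the
candidate f(x) = x for the loop "while (x >= 0) x := x - 1" by asserting the
negation of both conditions and expecting unsat:

    from z3 import Solver, Real, Or, unsat

    x, x_next = Real('x'), Real('x_next')  # pre- and post-state of one iteration
    f = lambda v: v                        # candidate ranking function f(v) = v

    s = Solver()
    s.add(x >= 0)                          # loop guard holds
    s.add(x_next == x - 1)                 # loop body: x := x - 1
    # negate "f is bounded from below" and "f decreases by at least 1"
    s.add(Or(f(x) < 0, f(x) - f(x_next) < 1))
    assert s.check() == unsat              # no counterexample: f is a ranking function

LassoRanker instead leaves the coefficients of the template for f symbolic
and eliminates the universal quantification over program states (e.g. via
Farkas' lemma), which is why generated scripts like this one are
quantifier-free QF_LRA constraints over the template coefficients.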

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
  2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
The set contains only those SMT scripts that we considered difficult because
LassoRanker ran into a timeout of 10 seconds on them.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In 
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike, and Andreas Podelski.
Linear Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. In CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook, and Carsten Fuhs. Better Termination
Proving through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
Benchmark

Size                4669743 bytes
Compressed Size     255235 bytes
License             Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category            industrial
First Occurrence    2014-07-21
Generated By
Generated On
Generator
Dolmen OK           1
strict Dolmen OK
check-sat calls     1
Query 1

Status                       sat
Inferred Status              sat
Size                         4669735 bytes
Compressed Size              255243 bytes
Max. Term Depth              35
Asserts                      1681
Declared Functions           0
Declared Constants           17673
Declared Sorts               0
Defined Functions            0
Defined Recursive Functions  0
Defined Sorts                0
Constants                    0
Declared Datatypes           0
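
Counts like these can be approximated with a naive textual scan of the
script. A minimal sketch (the file name is taken from the path above and is
assumed to be in the current directory; the regexes count commands textually
rather than parsing, so occurrences inside strings or comments would be
miscounted):

    import re

    def smt2_stats(path):
        text = open(path).read()
        return {
            'asserts':            len(re.findall(r'\(\s*assert\b', text)),
            # a "declared constant" is a zero-arity declare-fun
            'declared constants': len(re.findall(r'\(\s*declare-fun\s+\S+\s+\(\s*\)', text)),
            'check-sat calls':    len(re.findall(r'\(\s*check-sat\s*\)', text)),
        }

    print(smt2_stats('SyntaxSupportDivision1.bpl_Iteration1_Lasso_4-pieceTemplate.smt2'))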

Symbols

Symbol   Count
or        8272
and       8272
=        17702
let      14060
+        37670
-        19972
*        48756
<         3952
<=        6760
>         3145
>=       17136
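
The same kind of scan reproduces the symbol counts: tokenize on whitespace
and parentheses, then tally the operators of interest. A sketch (note that
'-' also appears as unary minus, so exact agreement with the table is not
guaranteed):

    import re
    from collections import Counter

    OPERATORS = ('or', 'and', '=', 'let', '+', '-', '*', '<', '<=', '>', '>=')

    def symbol_counts(path):
        tokens = re.findall(r'[^\s()]+', open(path).read())  # crude SMT-LIB tokenizer
        counts = Counter(tokens)
        return {op: counts[op] for op in OPERATORS}

    print(symbol_counts('SyntaxSupportDivision1.bpl_Iteration1_Lasso_4-pieceTemplate.smt2'))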

Evaluations

Within each evaluation, the columns are: Solver, Variant, Result,
Wallclock (s), CPU Time (s). The rating equals 1 - solved/participating
solvers (solved/participating shown in parentheses), so a higher rating
indicates a harder benchmark.
SMT-COMP 2015, rating 0.43 (4/7)
  CVC4         CVC4-master-2015-06-15-9b32405-main default           sat ✅       394.33200    394.51600
  CVC4         CVC4-experimental-2015-06-15-ff5745a-main default     sat ✅       389.86000    389.88200
  MathSAT      MathSat 5.3.6 main smtcomp2015_main                   sat ✅      2180.49000   2181.27000
  SMTInterpol  SMTInterpol v2.1-206-g86e9531 default                 unknown ❌  2400.02000   2428.41000
  SMT-RAT      SMT-RAT-final default                                 unknown ❌  2400.01000   2400.90000
  veriT        veriT default                                         sat ✅       121.56900    121.55700
  Yices2       Yices default                                         sat ✅        20.57440     20.58190
  Z3           z3 4.4.0 default                                      unknown ❌  2400.01000   2400.84000

SMT-COMP 2016, rating 0.44 (5/9)
  CVC4         CVC4-master-2016-05-27-cfef263-main default           sat ✅       383.97700    380.69400
  MathSAT      mathsat-5.3.11-linux-x86_64-Main default              sat ✅      2306.01000   2307.41000
  OpenSMT      OpenSMT2-2016-05-12 default                           unknown ❌  2400.02000   2401.53000
  SMTInterpol  smtinterpol-2.1-258-g92ab3df default                  sat ✅       111.10600    142.89900
  SMT-RAT      SMT-RAT default                                       unknown ❌  2400.06000   2401.28000
  Toysmt       toysmt default                                        unknown ❌  1594.02000   1594.50000
  veriT        veriT-dev default                                     sat ✅       112.31900    112.37600
  Yices2       Yices-2.4.2 default                                   sat ✅        11.31960     11.32550
  Z3           z3-4.4.1 default                                      unknown ❌  2400.02000   2401.30000

SMT-COMP 2017, rating 0.38 (5/8)
  CVC4         CVC4-smtcomp2017-main default                         sat ✅       403.65100    400.99100
  MathSAT      mathsat-5.4.1-linux-x86_64-Main default               sat ✅       424.10600    423.97500
  OpenSMT      opensmt2-2017-06-04 default                           unknown ❌   600.06700    599.95100
  SMTInterpol  SMTInterpol default                                   sat ✅       174.08000    215.14100
  SMT-RAT      SMTRAT-comp2017_2 default                             unknown ❌   600.03800    599.98100
  veriT        veriT-2017-06-17 default                              sat ✅        81.99810     81.99220
  Yices2       Yices2-Main default                                   sat ✅        17.66390     17.66220
  Z3           z3-4.5.0 default                                      unknown ❌   600.07900    599.93000

SMT-COMP 2018, rating 0.44 (5/9)
  Ctrl-Ergo    Ctrl-Ergo-SMTComp-2018_default                        unknown ❌  1200.09000   4773.40000
  CVC4         master-2018-06-10-b19c840-competition-default_default sat ✅       469.08200    466.61300
  MathSAT      mathsat-5.5.2-linux-x86_64-Main_default               sat ✅       504.27900    504.11400
  OpenSMT      opensmt2_default                                      unknown ❌  1200.01000   1199.84000
  SMTInterpol  SMTInterpol-2.5-19-g0d39cdee_default                  sat ✅       943.58700   1090.86000
  SMT-RAT      SMTRAT-Rat-final_default                              unknown ❌  1200.02000   1200.04000
  SMT-RAT      SMTRAT-MCSAT-final_default                            unknown ❌  1200.01000   1199.94000
  veriT        veriT_default                                         sat ✅       156.39900    156.38200
  Yices2       Yices 2.6.0_default                                   sat ✅        19.57020     19.57010
  Z3           z3-4.7.1_default                                      unknown ❌  1200.01000   1199.77000

SMT-COMP 2020
  CVC4         CVC4-sq-final_default                                 sat ✅       553.62700    550.36400
  MathSAT      MathSAT5_default.sh                                   sat ✅       657.00500    656.88400
  OpenSMT      OpenSMT_default                                       sat ✅       161.55500    161.52900
  Par4         Par4-wrapped-sq_default                               sat ✅        60.79080    240.02000
  SMTInterpol  smtinterpol-2.5-679-gacfde87a_default                 sat ✅       919.48600    957.04700
  veriT        veriT_default                                         sat ✅       118.83100    118.82900
  Yices2       Yices 2.6.2 bug fix_default                           sat ✅        15.26020     15.26120
  Z3           z3-4.8.8_default                                      sat ✅       251.70800    251.65700

SMT-COMP 2023, rating 0.20 (4/5)
  cvc5         cvc5-default-2023-05-16-ea045f305_sq                  sat ✅       540.10800    536.03800
  OpenSMT      OpenSMT a78dcf01_default                              sat ✅       112.10500    112.10900
  SMTInterpol  smtinterpol-2.5-1272-g2d6d356c_default                sat ✅       702.71800    731.35500
  Yaga         Yaga_SMT-COMP-2023_presubmition_default               unknown ❌  1200.01000   1199.86000
  Yices2       Yices 2 for SMTCOMP 2023_default                      sat ✅        19.02190     19.01120
  Yices2       Yices 2.6.2 for SMTCOMP 2021_default                  sat ✅        18.65390     18.65140
  Yices2       Yices 2.6.2 for SMTCOMP 2021_default                  sat ✅        18.70060     18.69700

SMT-COMP 2025, rating 0.17 (5/6)
  cvc5         cvc5                                                  sat ✅       106.84975    106.71460
  OpenSMT      OpenSMT                                               sat ✅       115.31159    115.14853
  SMTInterpol  SMTInterpol                                           unknown ❌  1201.44122   1223.48979
  Yices2       Yices2                                                sat ✅         8.68031      8.55082
  Z3alpha      Z3-alpha                                              sat ✅       401.72180   1601.76198
  Z3           Z3-alpha-base                                         sat ✅       253.05901    252.86890
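
These runs were executed on the competition's controlled infrastructure with
enforced wallclock, CPU, and memory limits, so the times are not directly
reproducible on an ordinary machine. A rough single-run sketch, assuming a
Unix system with a solver binary such as z3 on the PATH (the 1200 s timeout
mirrors the budget visible in the more recent evaluations):

    import resource
    import subprocess
    import time

    def run_solver(solver, path, timeout=1200):
        before = resource.getrusage(resource.RUSAGE_CHILDREN)
        start = time.perf_counter()
        try:
            proc = subprocess.run([solver, path], capture_output=True,
                                  text=True, timeout=timeout)
            result = proc.stdout.strip()        # 'sat', 'unsat', or 'unknown'
        except subprocess.TimeoutExpired:
            result = 'timeout'
        wall = time.perf_counter() - start
        after = resource.getrusage(resource.RUSAGE_CHILDREN)
        cpu = ((after.ru_utime - before.ru_utime) +
               (after.ru_stime - before.ru_stime))  # solver CPU time (user + sys)
        return result, wall, cpu

    print(run_solver('z3',
        'SyntaxSupportDivision1.bpl_Iteration1_Lasso_4-pieceTemplate.smt2'))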