Benchmark

non-incremental/QF_LRA/LassoRanker/Ultimate/Braverman-2006CAV-Ex1-int.bpl_Iteration1_Loop_4-pieceTemplate.smt2

SMT script generated by Ultimate LassoRanker [1]
Ultimate LassoRanker is a tool that synthesizes ranking functions for 
linear lasso programs and implements the techniques presented in [2] and [3].
To generate these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only those SMT scripts that we considered difficult
because LassoRanker ran into a timeout after 10 seconds.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In 
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear 
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination 
Analysis by Learning Terminating Programs. Accepted at CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary 
SV-COMP 2014). TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination 
Proving through Cooperation. CAV 2013:413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
Size: 101539
Compressed Size: 6464
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen OK: 1
strict Dolmen OK:
check-sat calls: 1

Query 1
Status: unsat
Inferred Status: unsat
Size: 101531
Compressed Size: 6473
Max. Term Depth: 26
Asserts: 64
Declared Functions: 0
Declared Constants: 277
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol  Count
or      299
and     299
=       398
let     448
+       965
-       362
*       1218
<       200
<=      257
>       100
>=      231
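As an illustration, counts like the ones above can be obtained by tokenizing
the SMT-LIB script and tallying the operator tokens. The helper below is a
hypothetical sketch of that idea (it is not the tool the benchmark library
uses, and it does not skip operator names occurring inside string literals
or comments).

```python
import re
from collections import Counter

def count_symbols(smt_text, symbols):
    # SMT-LIB scripts are S-expressions, so splitting on whitespace and
    # parentheses yields the individual tokens; operators such as "and",
    # "<=", or "+" then appear as standalone tokens.
    tokens = re.findall(r"[^\s()]+", smt_text)
    counts = Counter(t for t in tokens if t in symbols)
    return {s: counts[s] for s in symbols}

# Tiny QF_LRA-style example (not taken from this benchmark):
example = "(assert (and (<= x 1) (or (= y 2) (> (+ x y) 0))))"
print(count_symbols(example, ["and", "or", "=", "<=", ">", "+"]))
# → {'and': 1, 'or': 1, '=': 1, '<=': 1, '>': 1, '+': 1}
```

For a 100 KB script such as this one, a single pass of this kind is
sufficient; no full parser is needed as long as only token frequencies
are of interest.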

Evaluations

Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s)
SMT-COMP 2014 | | CVC4 | CVC4 f7118b2 default | unsat ✅ | 21.76120 | 21.75870
SMT-COMP 2014 | | MathSAT | MathSAT-5.2.12-Main default | unsat ✅ | 47.82270 | 47.83170
SMT-COMP 2014 | | SMTInterpol | smtinterpol-2.1-118-g3dada2f default | unsat ✅ | 134.39600 | 160.16300
SMT-COMP 2014 | | veriT | veriT-smtcomp2014 default | unsat ✅ | 23.94820 | 23.94840
SMT-COMP 2014 | | Yices2 | Yices-2.2.1-smtcomp2014 default | unsat ✅ | 10.09860 | 10.09550
SMT-COMP 2014 | | Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | unsat ✅ | 30.44490 | 30.44240
SMT-COMP 2015 | 0.29 (5/7) | CVC4 | CVC4-master-2015-06-15-9b32405-main default | unsat ✅ | 12.90870 | 12.91000
SMT-COMP 2015 | 0.29 (5/7) | CVC4 | CVC4-experimental-2015-06-15-ff5745a-main default | unsat ✅ | 12.93640 | 12.93900
SMT-COMP 2015 | 0.29 (5/7) | MathSAT | MathSat 5.3.6 main smtcomp2015_main | unsat ✅ | 65.68880 | 65.71500
SMT-COMP 2015 | 0.29 (5/7) | SMTInterpol | SMTInterpol v2.1-206-g86e9531 default | unknown ❌ | 2400.02000 | 2413.47000
SMT-COMP 2015 | 0.29 (5/7) | SMT-RAT | SMT-RAT-final default | unknown ❌ | 2400.01000 | 2401.02000
SMT-COMP 2015 | 0.29 (5/7) | veriT | veriT default | unsat ✅ | 10.35700 | 10.36040
SMT-COMP 2015 | 0.29 (5/7) | Yices2 | Yices default | unsat ✅ | 34.54890 | 34.56270
SMT-COMP 2015 | 0.29 (5/7) | Z3 | z3 4.4.0 default | unsat ✅ | 25.54990 | 25.56010
SMT-COMP 2016 | 0.22 (7/9) | CVC4 | CVC4-master-2016-05-27-cfef263-main default | unsat ✅ | 9.50341 | 9.50804
SMT-COMP 2016 | 0.22 (7/9) | MathSAT | mathsat-5.3.11-linux-x86_64-Main default | unsat ✅ | 59.05190 | 59.08740
SMT-COMP 2016 | 0.22 (7/9) | OpenSMT | OpenSMT2-2016-05-12 default | unsat ✅ | 38.96970 | 38.99180
SMT-COMP 2016 | 0.22 (7/9) | SMTInterpol | smtinterpol-2.1-258-g92ab3df default | unsat ✅ | 88.14010 | 109.36000
SMT-COMP 2016 | 0.22 (7/9) | SMT-RAT | SMT-RAT default | unknown ❌ | 2400.02000 | 2401.39000
SMT-COMP 2016 | 0.22 (7/9) | Toysmt | toysmt default | unknown ❌ | 2400.12000 | 2401.41000
SMT-COMP 2016 | 0.22 (7/9) | veriT | veriT-dev default | unsat ✅ | 9.08385 | 9.08983
SMT-COMP 2016 | 0.22 (7/9) | Yices2 | Yices-2.4.2 default | unsat ✅ | 11.50130 | 11.50780
SMT-COMP 2016 | 0.22 (7/9) | Z3 | z3-4.4.1 default | unsat ✅ | 21.78890 | 21.80270
SMT-COMP 2017 | 0.12 (7/8) | CVC4 | CVC4-smtcomp2017-main default | unsat ✅ | 8.81709 | 8.81714
SMT-COMP 2017 | 0.12 (7/8) | MathSAT | mathsat-5.4.1-linux-x86_64-Main default | unsat ✅ | 255.83400 | 255.84100
SMT-COMP 2017 | 0.12 (7/8) | OpenSMT | opensmt2-2017-06-04 default | unsat ✅ | 39.65110 | 39.65140
SMT-COMP 2017 | 0.12 (7/8) | SMTInterpol | SMTInterpol default | unsat ✅ | 74.34440 | 91.26440
SMT-COMP 2017 | 0.12 (7/8) | SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.00900 | 600.03800
SMT-COMP 2017 | 0.12 (7/8) | veriT | veriT-2017-06-17 default | unsat ✅ | 16.01280 | 16.01220
SMT-COMP 2017 | 0.12 (7/8) | Yices2 | Yices2-Main default | unsat ✅ | 45.73830 | 45.73040
SMT-COMP 2017 | 0.12 (7/8) | Z3 | z3-4.5.0 default | unsat ✅ | 44.36080 | 44.35710
SMT-COMP 2018 | 0.11 (8/9) | Ctrl-Ergo | Ctrl-Ergo-SMTComp-2018_default | unsat ✅ | 585.80700 | 2330.03000
SMT-COMP 2018 | 0.11 (8/9) | CVC4 | master-2018-06-10-b19c840-competition-default_default | unsat ✅ | 12.83030 | 12.82920
SMT-COMP 2018 | 0.11 (8/9) | MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | unsat ✅ | 208.91200 | 208.90000
SMT-COMP 2018 | 0.11 (8/9) | OpenSMT | opensmt2_default | unsat ✅ | 32.73690 | 32.73550
SMT-COMP 2018 | 0.11 (8/9) | SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | unsat ✅ | 74.11060 | 95.40950
SMT-COMP 2018 | 0.11 (8/9) | SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.10000 | 1199.98000
SMT-COMP 2018 | 0.11 (8/9) | SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.02000 | 1199.92000
SMT-COMP 2018 | 0.11 (8/9) | veriT | veriT_default | unsat ✅ | 11.35120 | 11.35160
SMT-COMP 2018 | 0.11 (8/9) | Yices2 | Yices 2.6.0_default | unsat ✅ | 5.75104 | 5.75118
SMT-COMP 2018 | 0.11 (8/9) | Z3 | z3-4.7.1_default | unsat ✅ | 81.79730 | 81.79190
SMT-COMP 2020 | | CVC4 | CVC4-sq-final_default | unsat ✅ | 11.34290 | 11.34130
SMT-COMP 2020 | | MathSAT | MathSAT5_default.sh | unsat ✅ | 215.98200 | 215.97700
SMT-COMP 2020 | | OpenSMT | OpenSMT_default | unsat ✅ | 3.65035 | 3.65010
SMT-COMP 2020 | | Par4 | Par4-wrapped-sq_default | unsat ✅ | 20.33300 | 80.19000
SMT-COMP 2020 | | SMTInterpol | smtinterpol-2.5-679-gacfde87a_default | unsat ✅ | 149.72200 | 165.34900
SMT-COMP 2020 | | veriT | veriT_default | unsat ✅ | 16.75620 | 16.75560
SMT-COMP 2020 | | Yices2 | Yices 2.6.2 bug fix_default | unsat ✅ | 46.75740 | 46.75100
SMT-COMP 2020 | | Z3 | z3-4.8.8_default | unsat ✅ | 28.37420 | 28.36440
SMT-COMP 2024 | | cvc5 | cvc5 | unsat ✅ | 6.95096 | 6.85133
SMT-COMP 2024 | | OpenSMT | OpenSMT | unsat ✅ | 3.54471 | 3.44373
SMT-COMP 2024 | | SMTInterpol | SMTInterpol | unsat ✅ | 114.59545 | 144.07005
SMT-COMP 2024 | | Yices2 | Yices2 | unsat ✅ | 21.91630 | 21.81521
SMT-COMP 2024 | | Z3alpha | Z3-alpha | unsat ✅ | 11.50699 | 11.40525