Benchmark

non-incremental/QF_LRA/LassoRanker/CooperatingT2/firewire.t2.c_Iteration1_Lasso_7-nestedTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs; it implements the techniques presented in [2] and [3].
To generate these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].
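
As an illustration of the kind of constraints these scripts contain, here is
a minimal, hand-written QF_LRA sketch (not an excerpt of this benchmark) that
asks for a linear ranking function f(x) = a*x + b of the toy loop
"while (x >= 0) x := x - 1", with the universally quantified boundedness and
decrease conditions eliminated via Farkas' lemma in the style of [3]; all
identifiers are invented for illustration:

  (set-logic QF_LRA)
  ; coefficient and offset of the candidate ranking function f(x) = a*x + b
  (declare-const a Real)
  (declare-const b Real)
  ; Farkas multiplier for the loop guard x >= 0
  (declare-const lam Real)
  ; required decrease per loop iteration
  (declare-const delta Real)
  ; boundedness: "x >= 0 implies f(x) >= 0", after Farkas elimination
  (assert (and (>= lam 0) (= a lam) (>= b 0)))
  ; decrease: f(x) - f(x - 1) = a must be at least delta > 0
  (assert (and (> delta 0) (>= a delta)))
  (check-sat)
  ; sat, e.g. with a = 1, b = 0, delta = 1, i.e. f(x) = x

A sat answer means a ranking function of the chosen template exists. The
actual benchmark plays the same game with a 7-nested template (see the file
name), which presumably accounts for its thousands of declared constants.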

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only the SMT scripts that we considered difficult
because LassoRanker ran into a timeout after 10 seconds.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops.
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike, and Andreas Podelski.
Linear Ranking for Linear Lasso Programs. ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook, and Carsten Fuhs. Better Termination
Proving through Cooperation. CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
Benchmark

Size: 2718056 bytes
Compressed Size: 130442 bytes
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1

Status: sat
Inferred Status: sat
Size: 2718048 bytes
Compressed Size: 130446 bytes
Max. Term Depth: 18
Asserts: 641
Declared Functions: 0
Declared Constants: 9300
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol  Count
or      2816
and     2816
=       17140
let     6416
+       24412
-       14328
*       36896
<       1216
<=      2240
>       1153
>=      8768
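
These counts suggest the script's shape: large Boolean combinations of
linear-arithmetic atoms, shared through deeply nested let bindings over
thousands of Real constants. A hypothetical fragment in that style (all
identifiers invented; this is not an excerpt of the benchmark) could look
like:

  (set-logic QF_LRA)
  ; invented auxiliary constants, standing in for the benchmark's thousands
  (declare-fun lam_0 () Real)
  (declare-fun lam_1 () Real)
  (declare-fun delta_0 () Real)
  (assert
    ; let bindings share subformulas, keeping the term a DAG rather than a tree
    (let ((bounded (and (>= lam_0 0) (>= lam_1 0)
                        (<= (+ (* 2 lam_0) (* (- 3) lam_1)) 0)))
          (decreasing (> delta_0 0)))
      (or (and bounded decreasing)
          (>= (+ lam_0 lam_1) 1))))
  (check-sat)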

Evaluations

Columns: Solver | Variant | Result | Wallclock (s) | CPU time (s)
A rating after a competition name, e.g. "0.29 (5/7)", is the fraction of
participating solvers that did not solve the query; the parenthesized figure
is solvers that solved it / solvers that participated.

SMT-COMP 2014
CVC4 | CVC4 f7118b2 default | sat ✅ | 69.92120 | 69.91840
MathSAT | MathSAT-5.2.12-Main default | sat ✅ | 163.78700 | 163.76900
SMTInterpol | smtinterpol-2.1-118-g3dada2f default | sat ✅ | 45.71630 | 58.33310
veriT | veriT-smtcomp2014 default | sat ✅ | 19.89820 | 19.89100
Yices2 | Yices-2.2.1-smtcomp2014 default | sat ✅ | 3.28295 | 3.28250
Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | sat ✅ | 30.76840 | 30.75430

SMT-COMP 2015, rating 0.29 (5/7)
CVC4 | CVC4-master-2015-06-15-9b32405-main default | sat ✅ | 69.16440 | 69.18950
CVC4 | CVC4-experimental-2015-06-15-ff5745a-main default | sat ✅ | 68.24130 | 68.26660
MathSAT | MathSat 5.3.6 main smtcomp2015_main | sat ✅ | 161.79700 | 161.86300
SMTInterpol | SMTInterpol v2.1-206-g86e9531 default | unknown ❌ | 2400.02000 | 2430.87000
SMT-RAT | SMT-RAT-final default | unknown ❌ | 2400.01000 | 2401.03000
veriT | veriT default | sat ✅ | 24.94440 | 24.95320
Yices2 | Yices default | sat ✅ | 3.18972 | 3.18951
Z3 | z3 4.4.0 default | sat ✅ | 32.41130 | 32.40010

SMT-COMP 2016, rating 0.33 (6/9)
CVC4 | CVC4-master-2016-05-27-cfef263-main default | sat ✅ | 80.99430 | 81.03680
MathSAT | mathsat-5.3.11-linux-x86_64-Main default | sat ✅ | 161.42500 | 161.53400
OpenSMT | OpenSMT2-2016-05-12 default | unknown ❌ | 2400.04000 | 2401.37000
SMTInterpol | smtinterpol-2.1-258-g92ab3df default | sat ✅ | 51.99060 | 73.73540
SMT-RAT | SMT-RAT default | unknown ❌ | 2400.02000 | 2401.18000
Toysmt | toysmt default | unknown ❌ | 1642.42000 | 1643.33000
veriT | veriT-dev default | sat ✅ | 25.35750 | 25.37240
Yices2 | Yices-2.4.2 default | sat ✅ | 4.92272 | 4.92595
Z3 | z3-4.4.1 default | sat ✅ | 29.60700 | 29.62280

SMT-COMP 2017, rating 0.25 (6/8)
CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 74.39130 | 74.38410
MathSAT | mathsat-5.4.1-linux-x86_64-Main default | sat ✅ | 96.24940 | 96.24060
OpenSMT | opensmt2-2017-06-04 default | unknown ❌ | 600.10900 | 600.04600
SMTInterpol | SMTInterpol default | sat ✅ | 43.12550 | 62.45530
SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.11000 | 600.06200
veriT | veriT-2017-06-17 default | sat ✅ | 26.04540 | 26.04250
Yices2 | Yices2-Main default | sat ✅ | 4.33239 | 4.33099
Z3 | z3-4.5.0 default | sat ✅ | 31.88270 | 63.67410

SMT-COMP 2018, rating 0.11 (8/9)
Ctrl-Ergo | Ctrl-Ergo-SMTComp-2018_default | sat ✅ | 67.86640 | 269.03000
CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 94.57660 | 94.56810
MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | sat ✅ | 83.77450 | 83.76050
OpenSMT | opensmt2_default | sat ✅ | 47.77820 | 47.77440
SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | sat ✅ | 45.72730 | 61.55680
SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.03000 | 1200.01000
SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.04000 | 1199.90000
veriT | veriT_default | sat ✅ | 23.43790 | 23.43530
Yices2 | Yices 2.6.0_default | sat ✅ | 3.40423 | 3.40412
Z3 | z3-4.7.1_default | sat ✅ | 49.83030 | 49.82860

SMT-COMP 2019
Ctrl-Ergo | Ctrl-Ergo-2019-wrapped-sq_default | sat ✅ | 60.58450 | 239.55000
CVC4 | CVC4-2019-06-03-d350fe1-wrapped-sq_default | sat ✅ | 64.89420 | 64.88510
CVC4 | CVC4-SymBreak_03_06_2019-wrapped-sq_default | sat ✅ | 141.07200 | 139.97400
CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 82.37980 | 82.37030
OpenSMT | OpenSMT-wrapped-sq_default | sat ✅ | 91.15520 | 91.14910
Par4 | Par4-wrapped-sq_default | sat ✅ | 25.69510 | 101.47000
SMTInterpol | smtinterpol-2.5-514-wrapped-sq_default | sat ✅ | 203.77600 | 226.56300
veriT | veriT-wrapped-sq_default | sat ✅ | 23.72150 | 23.71900
Yices2 | Yices 2.6.2-wrapped-sq_default | sat ✅ | 3.84633 | 3.83788
Z3 | z3-4.8.4-d6df51951f4c-wrapped-sq_default | sat ✅ | 50.16210 | 50.16240

SMT-COMP 2021, rating 0.17 (5/6)
MathSAT | mathsat-5.6.6_default | sat ✅ | 90.39290 | 90.37620
mc2 | mc2 2021-06-07_default.sh | unknown ❌ | 16.91560 | 16.91420
Par4 | Par4-wrapped-sq_default | sat ✅ | 29.60620 | 116.33000
SMTInterpol | smtinterpol-2.5-823-g881e8631_default | sat ✅ | 265.34900 | 289.54600
veriT | veriT_default | sat ✅ | 22.75490 | 22.75230
Z3 | z3-4.8.11_default | sat ✅ | 79.54850 | 79.54790

SMT-COMP 2022
cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | sat ✅ | 66.54290 | 66.53920
MathSAT | MathSAT-5.6.8_default | sat ✅ | 75.73420 | 75.73180
veriT | veriT_default | sat ✅ | 22.89790 | 22.89770
Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 3.52361 | 3.52314
Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 3.52201 | 3.52161
Z3 | z3-4.8.17_default | sat ✅ | 75.15360 | 75.11720

SMT-COMP 2023
cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 62.04230 | 62.03800
OpenSMT | OpenSMT a78dcf01_default | sat ✅ | 16.55230 | 16.55110
SMTInterpol | smtinterpol-2.5-1272-g2d6d356c_default | sat ✅ | 248.69900 | 270.19900
Yaga | Yaga_SMT-COMP-2023_presubmition_default | sat ✅ | 426.58600 | 426.53200
Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 3.16212 | 3.16186
Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 3.12618 | 3.12614
Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 3.13186 | 3.13149

SMT-COMP 2025
cvc5 | cvc5 | sat ✅ | 36.80500 | 36.67478
OpenSMT | OpenSMT | sat ✅ | 12.77475 | 12.65544
SMTInterpol | SMTInterpol | sat ✅ | 245.41982 | 263.37942
Yices2 | Yices2 | sat ✅ | 2.58904 | 2.47242
Z3alpha | Z3-alpha | sat ✅ | 81.67989 | 323.78269
Z3 | Z3-alpha-base | sat ✅ | 58.13995 | 58.00178