Benchmark

non-incremental/QF_LRA/LassoRanker/CooperatingT2/mc91test.t2.c_Iteration6_Loop_4-pieceTemplate.smt2

SMT script generated by Ultimate LassoRanker [1]
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
To generate these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only the SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds. (A hand-written sketch of the
general shape of these scripts is given after the references below.)

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. Accepted at CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving
through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
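
The benchmark itself is an ordinary SMT-LIB 2 script in the quantifier-free
logic QF_LRA: it declares the unknown template coefficients as constants,
asserts the synthesis constraints, and issues a single check-sat. Judging from
the file name, this instance uses a 4-piece ranking template in the sense of
[2]. The toy script below is a hand-written sketch of that general shape, not
an excerpt from the file; the names c, d, f0, f1 and every concrete constraint
are invented for illustration. The real script declares 1031 constants,
asserts 169 constraints, and nests terms up to depth 67.

```smt2
; Hand-written toy script sketching the shape of the generated
; benchmarks (illustrative only; not taken from the actual file).
(set-logic QF_LRA)

; Unknown coefficients of a candidate linear ranking function
; f(x) = c*x + d; the names c and d are invented for this sketch.
(declare-fun c () Real)
(declare-fun d () Real)

; The generated scripts combine linear (in)equalities over such
; unknowns with and/or, sharing subterms via nested let bindings.
(assert (let ((f0 (+ (* 2.0 c) d))   ; template evaluated at one point
              (f1 (+ c d)))          ; ... and at a successor point
  (and (>= f0 0.0)                   ; bounded from below
       (or (> f0 f1) (<= d 1.0))     ; decreasing, or a side condition
       (= (- f0 f1) c))))            ; linear consequence of the lets
(check-sat)
```

Note that in QF_LRA multiplication may only involve a numeric literal, as in
(* 2.0 c); this is why the 5148 occurrences of * in the script keep the
problem linear. In the LassoRanker approach of [2] and [3], a satisfying
assignment to the template coefficients yields a concrete ranking function,
which matches the sat status recorded below.
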
Benchmark
Size: 472504 bytes
Compressed Size: 20441 bytes
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2014-07-21
Generated By:
Generated On:
Generator:
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
Status: sat
Inferred Status: sat
Size: 472496 bytes
Compressed Size: 20449 bytes
Max. Term Depth: 67
Asserts: 169
Declared Functions: 0
Declared Constants: 1031
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

or: 808
and: 808
=: 2568
let: 2581
+: 4570
-: 1510
*: 5148
<: 484
<=: 682
>: 283
>=: 924

Evaluations

Evaluation | Rating | Solver | Variant | Result | Wallclock (s) | CPU Time (s)
--- | --- | --- | --- | --- | --- | ---
SMT-COMP 2014 | | CVC4 | CVC4 f7118b2 default | sat ✅ | 7.94742 | 7.93879
SMT-COMP 2014 | | MathSAT | MathSAT-5.2.12-Main default | sat ✅ | 18.47640 | 18.47520
SMT-COMP 2014 | | SMTInterpol | smtinterpol-2.1-118-g3dada2f default | sat ✅ | 6.72171 | 13.91790
SMT-COMP 2014 | | veriT | veriT-smtcomp2014 default | sat ✅ | 5.19555 | 5.19621
SMT-COMP 2014 | | Yices2 | Yices-2.2.1-smtcomp2014 default | sat ✅ | 12.97310 | 12.96800
SMT-COMP 2014 | | Z3 | Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default | sat ✅ | 44.52850 | 44.52520
SMT-COMP 2015 | 0.14 (6/7) | CVC4 | CVC4-master-2015-06-15-9b32405-main default | sat ✅ | 8.39801 | 8.39772
SMT-COMP 2015 | 0.14 (6/7) | CVC4 | CVC4-experimental-2015-06-15-ff5745a-main default | sat ✅ | 8.34950 | 8.34973
SMT-COMP 2015 | 0.14 (6/7) | MathSAT | MathSat 5.3.6 main smtcomp2015_main | sat ✅ | 12.84340 | 12.84800
SMT-COMP 2015 | 0.14 (6/7) | SMTInterpol | SMTInterpol v2.1-206-g86e9531 default | unknown ❌ | 2400.02000 | 2415.79000
SMT-COMP 2015 | 0.14 (6/7) | SMT-RAT | SMT-RAT-final default | sat ✅ | 2.54090 | 2.54061
SMT-COMP 2015 | 0.14 (6/7) | veriT | veriT default | sat ✅ | 6.19893 | 6.20006
SMT-COMP 2015 | 0.14 (6/7) | Yices2 | Yices default | sat ✅ | 0.36750 | 0.36694
SMT-COMP 2015 | 0.14 (6/7) | Z3 | z3 4.4.0 default | sat ✅ | 58.35990 | 58.38410
SMT-COMP 2016 | 0.22 (7/9) | CVC4 | CVC4-master-2016-05-27-cfef263-main default | sat ✅ | 5.66051 | 5.66390
SMT-COMP 2016 | 0.22 (7/9) | MathSAT | mathsat-5.3.11-linux-x86_64-Main default | sat ✅ | 31.14060 | 31.16030
SMT-COMP 2016 | 0.22 (7/9) | OpenSMT | OpenSMT2-2016-05-12 default | sat ✅ | 201.05700 | 201.15600
SMT-COMP 2016 | 0.22 (7/9) | SMTInterpol | smtinterpol-2.1-258-g92ab3df default | sat ✅ | 13.56570 | 22.15850
SMT-COMP 2016 | 0.22 (7/9) | SMT-RAT | SMT-RAT default | unknown ❌ | 2400.02000 | 2401.37000
SMT-COMP 2016 | 0.22 (7/9) | Toysmt | toysmt default | unknown ❌ | 2400.04000 | 2401.10000
SMT-COMP 2016 | 0.22 (7/9) | veriT | veriT-dev default | sat ✅ | 3.23073 | 3.23307
SMT-COMP 2016 | 0.22 (7/9) | Yices2 | Yices-2.4.2 default | sat ✅ | 1.91958 | 1.92080
SMT-COMP 2016 | 0.22 (7/9) | Z3 | z3-4.4.1 default | sat ✅ | 28.54880 | 28.56900
SMT-COMP 2017 | 0.12 (7/8) | CVC4 | CVC4-smtcomp2017-main default | sat ✅ | 5.09087 | 5.09058
SMT-COMP 2017 | 0.12 (7/8) | MathSAT | mathsat-5.4.1-linux-x86_64-Main default | sat ✅ | 4.70831 | 4.70895
SMT-COMP 2017 | 0.12 (7/8) | OpenSMT | opensmt2-2017-06-04 default | sat ✅ | 318.20700 | 318.16800
SMT-COMP 2017 | 0.12 (7/8) | SMTInterpol | SMTInterpol default | sat ✅ | 45.25660 | 54.47040
SMT-COMP 2017 | 0.12 (7/8) | SMT-RAT | SMTRAT-comp2017_2 default | unknown ❌ | 600.02100 | 599.97200
SMT-COMP 2017 | 0.12 (7/8) | veriT | veriT-2017-06-17 default | sat ✅ | 10.71150 | 10.70950
SMT-COMP 2017 | 0.12 (7/8) | Yices2 | Yices2-Main default | sat ✅ | 0.90997 | 0.90046
SMT-COMP 2017 | 0.12 (7/8) | Z3 | z3-4.5.0 default | sat ✅ | 24.23080 | 24.22920
SMT-COMP 2018 | 0.11 (8/9) | Ctrl-Ergo | Ctrl-Ergo-SMTComp-2018_default | sat ✅ | 5.90753 | 23.12000
SMT-COMP 2018 | 0.11 (8/9) | CVC4 | master-2018-06-10-b19c840-competition-default_default | sat ✅ | 11.27270 | 11.27250
SMT-COMP 2018 | 0.11 (8/9) | MathSAT | mathsat-5.5.2-linux-x86_64-Main_default | sat ✅ | 5.97677 | 5.97767
SMT-COMP 2018 | 0.11 (8/9) | OpenSMT | opensmt2_default | sat ✅ | 264.19800 | 264.12700
SMT-COMP 2018 | 0.11 (8/9) | SMTInterpol | SMTInterpol-2.5-19-g0d39cdee_default | sat ✅ | 10.00130 | 16.16460
SMT-COMP 2018 | 0.11 (8/9) | SMT-RAT | SMTRAT-Rat-final_default | unknown ❌ | 1200.01000 | 1200.01000
SMT-COMP 2018 | 0.11 (8/9) | SMT-RAT | SMTRAT-MCSAT-final_default | unknown ❌ | 1200.01000 | 1199.95000
SMT-COMP 2018 | 0.11 (8/9) | veriT | veriT_default | sat ✅ | 3.72688 | 3.72689
SMT-COMP 2018 | 0.11 (8/9) | Yices2 | Yices 2.6.0_default | sat ✅ | 0.46628 | 0.46602
SMT-COMP 2018 | 0.11 (8/9) | Z3 | z3-4.7.1_default | sat ✅ | 22.25090 | 22.24990
SMT-COMP 2022 | | cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | sat ✅ | 7.43653 | 7.43598
SMT-COMP 2022 | | MathSAT | MathSAT-5.6.8_default | sat ✅ | 1.08613 | 2.14601
SMT-COMP 2022 | | veriT | veriT_default | sat ✅ | 5.88280 | 5.88304
SMT-COMP 2022 | | Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 4.89125 | 4.89097
SMT-COMP 2022 | | Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 4.87351 | 4.87302
SMT-COMP 2022 | | Z3 | z3-4.8.17_default | sat ✅ | 5.57678 | 5.57866
SMT-COMP 2023 | | cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 4.63339 | 4.63139
SMT-COMP 2023 | | OpenSMT | OpenSMT a78dcf01_default | sat ✅ | 0.83866 | 0.83859
SMT-COMP 2023 | | SMTInterpol | smtinterpol-2.5-1272-g2d6d356c_default | sat ✅ | 34.70540 | 59.60390
SMT-COMP 2023 | | Yaga | Yaga_SMT-COMP-2023_presubmition_default | sat ✅ | 4.96634 | 4.96512
SMT-COMP 2023 | | Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 0.88348 | 0.88339
SMT-COMP 2023 | | Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 0.84356 | 0.84354
SMT-COMP 2023 | | Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 0.84136 | 0.84129