Benchmark

non-incremental/QF_LRA/LassoRanker/SV-COMP/aviad_true-termination.c_Iteration1_Loop_6-nestedTemplate.smt2

SMT script generated by Ultimate LassoRanker [1].
Ultimate LassoRanker is a tool that synthesizes ranking functions for
linear lasso programs and implements the techniques presented in [2] and [3].
To generate these SMT scripts, Ultimate LassoRanker was used as a
subprocedure of the termination analyzer Ultimate BuchiAutomizer [4], which
implements the techniques presented in [5].

This SMT script belongs to a set of SMT scripts that was generated by applying
revision 11505 of BuchiAutomizer to
- the benchmarks from the demonstration category on termination of SV-COMP
  2014 [6], available at [7],
- the benchmarks from [8], which are available at [9], and
- benchmarks from the repository of Ultimate LassoRanker.
This set contains only the SMT scripts that we considered difficult because
LassoRanker ran into a timeout after 10 seconds.

2014-05-03, Matthias Heizmann (heizmann@informatik.uni-freiburg.de)


[1] http://ultimate.informatik.uni-freiburg.de/LassoRanker/
[2] Jan Leike and Matthias Heizmann. Ranking Templates for Linear Loops. In
TACAS 2014.
[3] Matthias Heizmann, Jochen Hoenicke, Jan Leike and Andreas Podelski. Linear
Ranking for Linear Lasso Programs. In ATVA 2013.
[4] http://ultimate.informatik.uni-freiburg.de/BuchiAutomizer/
[5] Matthias Heizmann, Jochen Hoenicke and Andreas Podelski. Termination
Analysis by Learning Terminating Programs. In CAV 2014.
[6] Dirk Beyer. Status Report on Software Verification (Competition Summary
SV-COMP 2014). In TACAS 2014.
[7] https://svn.sosy-lab.org/software/sv-benchmarks/trunk/c/termination-crafted/
[8] Marc Brockschmidt, Byron Cook and Carsten Fuhs. Better Termination Proving
through Cooperation. In CAV 2013, pages 413-429.
[9] http://verify.rwth-aachen.de/brockschmidt/Cooperating-T2/
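
For a rough idea of what such a QF_LRA script looks like, below is a minimal
hand-written sketch. It is not an excerpt from this benchmark and is far
simpler than the constraints LassoRanker actually emits; the template
f(x) = c*x + c0, the constants c and c0, and the two sample transitions are
invented purely for illustration. The sketch asks whether the template
coefficients can be chosen so that f is nonnegative and decreases by at least
1 on two transitions of a loop that counts x down by one.

    (set-logic QF_LRA)
    ; coefficients of an affine ranking template f(x) = c*x + c0,
    ; to be found by the solver (illustrative names, not from the benchmark)
    (declare-fun c () Real)
    (declare-fun c0 () Real)
    ; boundedness and decrease on a sample transition x = 5 -> x' = 4
    (assert (>= (+ (* 5 c) c0) 0))
    (assert (>= (- (+ (* 5 c) c0) (+ (* 4 c) c0)) 1))
    ; boundedness and decrease on a sample transition x = 1 -> x' = 0
    (assert (>= (+ c c0) 0))
    (assert (>= (- (+ c c0) c0) 1))
    (check-sat)
    (exit)

Any model, for instance c = 1 and c0 = 0, corresponds to a ranking function
for these two transitions. The benchmark itself stems from a nested ranking
template (cf. [2] and the file name) and, as listed below, declares 66867
constants and contains 4369 asserts.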
Benchmark
Size 13052907
Compressed Size 340799
License Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category industrial
First Occurrence 2014-07-21
Generated By
Generated On
Generator
Dolmen OK 1
strict Dolmen OK
check-sat calls 1
Query 1
Status sat
Inferred Status sat
Size 13052899
Compressed Size 340787
Max. Term Depth 13
Asserts 4369
Declared Functions 0
Declared Constants 66867
Declared Sorts 0
Defined Functions 0
Defined Recursive Functions 0
Defined Sorts 0
Constants 0
Declared Datatypes 0

Symbols

or 20832
and 20832
= 33976
let 28244
+ 67264
- 70946
* 103442
< 8568
<= 16632
> 8401
>= 66668

Evaluations

Evaluation Rating Solver Variant Result Wallclock Time (s) CPU Time (s)
SMT-COMP 2014 CVC4 CVC4 f7118b2 default sat ✅ 918.23000 918.36600
MathSAT MathSAT-5.2.12-Main default sat ✅ 136.66500 136.68100
SMTInterpol smtinterpol-2.1-118-g3dada2f default sat ✅ 55.43460 68.92150
veriT veriT-smtcomp2014 default sat ✅ 112.18200 112.18600
Yices2 Yices-2.2.1-smtcomp2014 default sat ✅ 2.82579 2.82457
Z3 Z3-4.3.2.a054b099c1d6-x64-debian-6.0.6-SMT-COMP-2014 default sat ✅ 173.61600 173.61200
SMT-COMP 2015 0.29 (5/7) CVC4 CVC4-master-2015-06-15-9b32405-main default sat ✅ 887.89700 888.13300
CVC4-experimental-2015-06-15-ff5745a-main default sat ✅ 903.85800 904.26500
MathSAT MathSat 5.3.6 main smtcomp2015_main sat ✅ 146.66700 146.72800
SMTInterpol SMTInterpol v2.1-206-g86e9531 default unknown ❌ 2400.02000 2517.36000
SMT-RAT SMT-RAT-final default unknown ❌ 2400.01000 2400.83000
veriT veriT default sat ✅ 122.51400 122.44100
Yices2 Yices default sat ✅ 2.61542 2.61560
Z3 z3 4.4.0 default sat ✅ 214.60600 214.64800
SMT-COMP 2016 0.33 (6/9) CVC4 CVC4-master-2016-05-27-cfef263-main default sat ✅ 887.95200 878.08000
MathSAT mathsat-5.3.11-linux-x86_64-Main default sat ✅ 124.34200 124.42600
OpenSMT OpenSMT2-2016-05-12 default unknown ❌ 2400.02000 2401.20000
SMTInterpol smtinterpol-2.1-258-g92ab3df default sat ✅ 53.55480 73.73630
SMT-RAT SMT-RAT default unknown ❌ 2400.02000 2401.58000
Toysmt toysmt default unknown ❌ 1482.11000 1482.62000
veriT veriT-dev default sat ✅ 128.16100 128.22500
Yices2 Yices-2.4.2 default sat ✅ 3.81265 3.81440
Z3 z3-4.4.1 default sat ✅ 215.82600 215.91400
SMT-COMP 2017 0.38 (5/8) CVC4 CVC4-smtcomp2017-main default unknown ❌ 600.04300 593.37000
MathSAT mathsat-5.4.1-linux-x86_64-Main default sat ✅ 498.81900 498.69800
OpenSMT opensmt2-2017-06-04 default unknown ❌ 600.07100 599.82000
SMTInterpol SMTInterpol default sat ✅ 62.08080 79.70590
SMT-RAT SMTRAT-comp2017_2 default unknown ❌ 600.06800 600.02000
veriT veriT-2017-06-17 default sat ✅ 127.88700 127.86900
Yices2 Yices2-Main default sat ✅ 4.52016 4.51940
Z3 z3-4.5.0 default sat ✅ 303.97100 303.90700
SMT-COMP 2018 0.11 (8/9) Ctrl-Ergo Ctrl-Ergo-SMTComp-2018_default sat ✅ 31.26250 120.43000
CVC4 master-2018-06-10-b19c840-competition-default_default sat ✅ 1082.86000 1075.96000
MathSAT mathsat-5.5.2-linux-x86_64-Main_default sat ✅ 518.86200 518.84100
OpenSMT opensmt2_default sat ✅ 227.75500 227.75600
SMTInterpol SMTInterpol-2.5-19-g0d39cdee_default sat ✅ 69.07580 87.58620
SMT-RAT SMTRAT-Rat-final_default unknown ❌ 1200.01000 1199.86000
SMTRAT-MCSAT-final_default unknown ❌ 1200.01000 1199.97000
veriT veriT_default sat ✅ 130.90100 130.87900
Yices2 Yices 2.6.0_default sat ✅ 2.50967 2.50944
Z3 z3-4.7.1_default sat ✅ 532.41000 532.39400
SMT-COMP 2024 cvc5 cvc5 sat ✅ 60.87293 60.73961
OpenSMT OpenSMT sat ✅ 22.62283 22.51294
SMTInterpol SMTInterpol sat ✅ 678.63020 721.99642
Yices2 Yices2 sat ✅ 1.52045 1.42053
Z3alpha Z3-alpha sat ✅ 100.41918 100.29221
SMT-COMP 2025 0.17 (5/6) cvc5 cvc5 sat ✅ 58.24986 58.11412
OpenSMT OpenSMT sat ✅ 13.06976 12.94450
SMTInterpol SMTInterpol unknown ❌ 1201.70035 1223.52416
Yices2 Yices2 sat ✅ 2.15635 2.03218
Z3alpha Z3-alpha sat ✅ 261.09349 1032.69233
Z3 Z3-alpha-base sat ✅ 109.59727 109.46093