Benchmark

non-incremental/FP/20200911-Pine/1599122164642173000.smt2

These benchmarks were generated while developing the tool Pine [1], which uses
CVC4/Z3 to check inductiveness of invariants. The work is described in [2].

[1] https://github.com/izycheva/pine
[2] A. Izycheva, E. Darulova, H. Seidl, "Counterexample- and Simulation-Guided Floating-Point Loop Invariant Synthesis", SAS'20

 Loop:
   u' := u + 0.01 * v
   v' := v + 0.01 * (-0.5 * v - 9.81 * u)

 Input ranges:
   u in [0.0,0.0]
   v in [2.0,3.0]

 Invariant:
   -1.0*u + 0.24*v + 0.89*u^2 - 0.12*u*v + 0.07*v^2 <= 1.43
 and
   u in [-0.7,2.0]
   v in [-6.3,3.0]

 Query: Loop and Invariant and not Invariant'
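The loop body and candidate invariant above can be transcribed into plain Python as a quick sanity check (a hypothetical sketch, not part of the benchmark; ordinary double arithmetic here only approximates the IEEE-754 floating-point semantics the SMT query reasons about):

```python
def step(u, v):
    # Loop body from the description:
    #   u' := u + 0.01 * v
    #   v' := v + 0.01 * (-0.5 * v - 9.81 * u)
    return u + 0.01 * v, v + 0.01 * (-0.5 * v - 9.81 * u)

def invariant(u, v):
    # Candidate invariant:
    #   -1.0*u + 0.24*v + 0.89*u^2 - 0.12*u*v + 0.07*v^2 <= 1.43
    # together with the box u in [-0.7, 2.0], v in [-6.3, 3.0]
    poly = -1.0 * u + 0.24 * v + 0.89 * u * u - 0.12 * u * v + 0.07 * v * v
    return poly <= 1.43 and -0.7 <= u <= 2.0 and -6.3 <= v <= 3.0

# Sampled initial states (u in [0.0, 0.0], v in [2.0, 3.0]) satisfy the
# invariant, and so do their immediate successors.
for v0 in (2.0, 2.5, 3.0):
    assert invariant(0.0, v0)
    assert invariant(*step(0.0, v0))
```

The query itself asks for the opposite: a state that satisfies the invariant but whose successor under the loop body does not. A `sat` answer therefore exhibits such a counterexample state, i.e. the candidate invariant is not inductive under the benchmark's floating-point semantics.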
Benchmark
Size: 4052
Compressed Size: 1218
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2021-07-18
Generated By: Anastasiia Izycheva, Eva Darulova
Generated On: 2020-09-11 00:00:00
Generator: Pine (using Z3 Python API)
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: unknown
Inferred Status: sat
Size: 4044
Compressed Size: 1232
Max. Term Depth: 31
Asserts: 1
Declared Functions: 0
Declared Constants: 4
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

not: 1    and: 6    let: 27    fp: 24
fp.add: 10    fp.sub: 1    fp.mul: 20    fp.neg: 1
fp.leq: 10    fp.eq: 2    roundNearestTiesToEven: 31

Evaluations

Evaluation     Rating      Solver              Variant                                   Result      Wallclock   CPU Time
SMT-COMP 2021  0.33 (2/3)  CVC4                CVC4-sq-final_default                     sat ✅      251.73500   251.65600
                           UltimateEliminator  UltimateEliminator+MathSAT-5.6.6_default  unknown ❌  7.12117     4.24331
                           Z3                  z3-4.8.11_default                         sat ✅      248.38200   229.10100
                                               z3-4.8.4-d6df51951f4c-wrapped-sq_default  unknown ❌  64.67700    64.67360
SMT-COMP 2024              Bitwuzla            Bitwuzla                                  sat ✅      32.89196    32.79060
                           cvc5                cvc5                                      sat ✅      39.89362    39.79335
SMT-COMP 2025  0.33 (2/3)  Bitwuzla            Bitwuzla                                  sat ✅      24.80334    24.66893
                           cvc5                cvc5                                      sat ✅      12.50742    12.38723
                           UltimateEliminator  UltimateEliminator+MathSAT                unknown ❌  1201.79093  1205.59877