Benchmark

non-incremental/FP/20200911-Pine/1599121901174413000.smt2

These benchmarks were generated while developing the tool Pine [1], which uses
the SMT solvers CVC4 and Z3 to check inductiveness of candidate invariants.
The approach is described in [2].

[1] https://github.com/izycheva/pine
[2] A. Izycheva, E. Darulova, H. Seidl, "Counterexample- and Simulation-Guided Floating-Point Loop Invariant Synthesis", SAS'20

 Loop:
   u' := u + 0.01 * v
   v' := v + 0.01 * (-0.5 * v - 9.81 * (u - (u*u*u)/6.0 + (u*u*u*u*u)/120.0))

 Input ranges:
   u in [0.0,0.0]
   v in [2.0,3.0]

 Invariant:
   -0.15*u - 0.02*v + 1.0*u^2 + 0.03*u*v + 0.1*v^2 <= 0.96
 and
   u in [-0.9,1.1]
   v in [-3.0,3.1]

 Query: Loop and Invariant and not Invariant'
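The loop, invariant, and query above can be exercised directly in IEEE-754 binary64 arithmetic, which is what Python floats use. The sketch below (the helper names `step` and `invariant` are ours, not Pine's) samples the invariant region on a coarse grid and counts one-step violations, i.e. states where the invariant holds before the loop body but not after — exactly the models the SMT query asks for. A grid search is only a sanity check, not a proof: the query's inferred status is sat, so a counterexample to inductiveness exists, but it may lie at a floating-point corner case that sampling misses.

```python
# Empirical sanity check of the benchmark's loop and candidate invariant,
# evaluated in Python floats (IEEE-754 binary64, round-to-nearest) -- the
# same FP semantics the SMT query reasons about exhaustively.

def step(u, v):
    """One loop iteration; u - u^3/6 + u^5/120 is the degree-5 Taylor
    approximation of sin(u) from the pendulum dynamics."""
    u1 = u + 0.01 * v
    v1 = v + 0.01 * (-0.5 * v - 9.81 * (u - (u * u * u) / 6.0
                                          + (u * u * u * u * u) / 120.0))
    return u1, v1

def invariant(u, v):
    """Candidate invariant: quadratic-form bound plus box constraints."""
    poly = -0.15 * u - 0.02 * v + 1.0 * u * u + 0.03 * u * v + 0.1 * v * v
    return poly <= 0.96 and -0.9 <= u <= 1.1 and -3.0 <= v <= 3.1

# Sample the box on a 0.01 grid and count one-step violations: states
# satisfying "Loop and Invariant and not Invariant'".
violations = 0
for i in range(201):            # u in [-0.9, 1.1]
    for j in range(611):        # v in [-3.0, 3.1]
        u, v = -0.9 + i * 0.01, -3.0 + j * 0.01
        if invariant(u, v) and not invariant(*step(u, v)):
            violations += 1
print("one-step violations found on grid:", violations)
```

Note that the input-range corners, e.g. (u, v) = (0.0, 2.0) and (0.0, 3.0), satisfy the invariant, as they must for the invariant to cover the initial states.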
Benchmark

  Size: 4553
  Compressed Size: 1310
  License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
  Category: industrial
  First Occurrence: 2021-07-18
  Generated By: Anastasiia Izycheva, Eva Darulova
  Generated On: 2020-09-11 00:00:00
  Generator: Pine (using Z3 Python API)
  Dolmen OK: 1
  strict Dolmen OK: 1
  check-sat calls: 1
Query 1

  Status: unknown
  Inferred Status: sat
  Size: 4545
  Compressed Size: 1325
  Max. Term Depth: 35
  Asserts: 1
  Declared Functions: 0
  Declared Constants: 4
  Declared Sorts: 0
  Defined Functions: 0
  Defined Recursive Functions: 0
  Defined Sorts: 0
  Constants: 0
  Declared Datatypes: 0

Symbols

  not: 1       and: 6       let: 31      fp: 26
  fp.add: 11   fp.sub: 2    fp.mul: 24   fp.div: 2
  fp.neg: 1    fp.leq: 10   fp.eq: 2     roundNearestTiesToEven: 39

Evaluations

  SMT-COMP 2022, rating 0.50 (2/4):
    Bitwuzla, Bitwuzla-wrapped_default: sat ✅ (wallclock 171.58600 s, CPU 171.54400 s)
    cvc5, cvc5_default: sat ✅ (wallclock 665.17300 s, CPU 661.76700 s)
    cvc5, cvc5-default-2022-07-02-b15e116-wrapped_sq: sat ✅ (wallclock 232.16000 s, CPU 232.07600 s)
    UltimateEliminator, UltimateEliminator+MathSAT-5.6.7-wrapped_default: unknown ❌ (wallclock 6.81048 s, CPU 4.33578 s)
    Z3, z3-4.8.17_default: unknown ❌ (wallclock 1200.04000 s, CPU 1199.82000 s)

  SMT-COMP 2025, rating 0.67 (1/3):
    Bitwuzla, Bitwuzla: sat ✅ (wallclock 66.69772 s, CPU 66.55543 s)
    cvc5, cvc5: unknown ❌ (wallclock 1201.77635 s, CPU 1200.95980 s)
    UltimateEliminator, UltimateEliminator+MathSAT: unknown ❌ (wallclock 1201.77424 s, CPU 1205.55719 s)