Benchmark

non-incremental/FP/20200911-Pine/1599121861030829000.smt2

These benchmarks were generated while developing the tool Pine [1], which uses
CVC4/Z3 to check inductiveness of invariants. The work is described in [2].

[1] https://github.com/izycheva/pine
[2] A. Izycheva, E. Darulova, H. Seidl, "Counterexample- and Simulation-Guided Floating-Point Loop Invariant Synthesis", SAS'20

 Loop:
   x1' := x1 + 0.01 * x2
   x2' := -0.01 * x1 + 0.99 * x2

 Input ranges:
   x1 in [0.0,1.0]
   x2 in [0.0,1.0]

 Invariant:
   -1.0*x1 + -0.33*x2 + 0.88*x1^2 + 0.11*x1*x2 + 0.6*x2^2 <= 0.27
 and
   x1 in [-0.3,1.3]
   x2 in [-0.8,1.0]

 Query: Loop and Invariant and not Invariant'
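The query is satisfiable exactly when the invariant is not inductive under floating-point semantics: some state satisfies the invariant, yet its successor under the loop body does not. As an illustration only (this is not Pine's code; the function names are made up here), the loop and the candidate invariant can be simulated in double precision, which is the "simulation" half of the approach — it can suggest or refute candidates along concrete trajectories, but only the SMT query decides inductiveness for all floating-point inputs:

```python
# Illustrative sketch, not Pine's implementation: simulate the benchmark's
# loop in double precision and evaluate the candidate invariant.

def step(x1, x2):
    # Loop body from the benchmark description.
    return x1 + 0.01 * x2, -0.01 * x1 + 0.99 * x2

def invariant(x1, x2):
    # Candidate invariant: quadratic bound plus the box constraints.
    poly = (-1.0 * x1 + -0.33 * x2 + 0.88 * x1 ** 2
            + 0.11 * x1 * x2 + 0.6 * x2 ** 2)
    return poly <= 0.27 and -0.3 <= x1 <= 1.3 and -0.8 <= x2 <= 1.0

# The invariant holds at the four corners of the input box [0,1] x [0,1].
corners_ok = all(invariant(x1, x2)
                 for x1 in (0.0, 1.0) for x2 in (0.0, 1.0))
print("corners:", corners_ok)

# Follow one concrete trajectory from a corner of the input box.
x1, x2 = 1.0, 1.0
traj_ok = True
for _ in range(1000):
    x1, x2 = step(x1, x2)
    traj_ok = traj_ok and invariant(x1, x2)
print("trajectory:", traj_ok)
```

Such a simulation checks only finitely many reachable states; the benchmark's `sat` status means the solvers found a floating-point state satisfying the invariant whose successor violates it, which need not lie on any trajectory from the input ranges.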
Benchmark
Size              3956
Compressed Size   1162
License           Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category          industrial
First Occurrence  2021-07-18
Generated By      Anastasiia Izycheva, Eva Darulova
Generated On      2020-09-11 00:00:00
Generator         Pine (using Z3 Python API)
Dolmen OK         1
strict Dolmen OK  1
check-sat calls   1
Query 1
Status                       unknown
Inferred Status              sat
Size                         3948
Compressed Size              1160
Max. Term Depth              30
Asserts                      1
Declared Functions           0
Declared Constants           4
Declared Sorts               0
Defined Functions            0
Defined Recursive Functions  0
Defined Sorts                0
Constants                    0
Declared Datatypes           0

Symbols

not                     1
and                     6
let                     26
fp                      23
fp.add                  10
fp.mul                  19
fp.neg                  1
fp.leq                  10
fp.eq                   2
roundNearestTiesToEven  29

Evaluations

Evaluation     Rating      Solver              Variant                                   Result      Wallclock   CPU Time
SMT-COMP 2021  0.33 (2/3)  CVC4                CVC4-sq-final_default                     sat ✅      798.68600   798.61600
                           UltimateEliminator  UltimateEliminator+MathSAT-5.6.6_default  unknown ❌  7.19771     5.15816
                           Z3                  z3-4.8.11_default                         sat ✅      369.06300   368.97300
                                               z3-4.8.4-d6df51951f4c-wrapped-sq_default  unknown ❌  133.22300   133.21600
SMT-COMP 2024              Bitwuzla            Bitwuzla                                  sat ✅      39.20668    39.09009
                           cvc5                cvc5                                      sat ✅      32.86442    32.74751
SMT-COMP 2025              Bitwuzla            Bitwuzla                                  sat ✅      25.34060    25.21449
                           cvc5                cvc5                                      sat ✅      66.42568    66.29095
                           UltimateEliminator  UltimateEliminator+MathSAT                sat ✅      1142.79463  1146.96215