Benchmark

non-incremental/FP/20200911-Pine/1599121862359721000.smt2

These benchmarks were generated while developing the tool Pine [1], which uses
CVC4/Z3 to check inductiveness of invariants. The work is described in [2].

[1] https://github.com/izycheva/pine
[2] A. Izycheva, E. Darulova, H. Seidl. "Counterexample- and Simulation-Guided Floating-Point Loop Invariant Synthesis", SAS'20.

 Loop:
   x1' := x1 + 0.01 * x2
   x2' := -0.01 * x1 + 0.99 * x2

 Input ranges:
   x1 in [0.0,1.0]
   x2 in [0.0,1.0]

 Invariant:
   -1.0*x1 + -0.3*x2 + 0.88*x1^2 + 0.15*x1*x2 + 0.77*x2^2 <= 0.59
 and
   x1 in [-0.4,1.5]
   x2 in [-0.9,1.0]

 Query: Loop and Invariant and not Invariant'
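The benchmark asks an SMT solver whether "Loop and Invariant and not Invariant'" is satisfiable over IEEE floating point: a sat answer means there is a state inside the invariant whose successor leaves it, so the invariant is not inductive. As a rough, solver-free illustration of the same check, the sketch below searches a sample grid for such a counterexample in ordinary Python float arithmetic (the function names `step`, `invariant`, and `find_counterexample` are illustrative, not part of Pine; the real query reasons exhaustively over FP, which sampling cannot do):

```python
import itertools

# Loop body from the benchmark: x1' = x1 + 0.01*x2, x2' = -0.01*x1 + 0.99*x2
def step(x1, x2):
    return x1 + 0.01 * x2, -0.01 * x1 + 0.99 * x2

# Candidate invariant: a quadratic constraint plus box bounds on x1, x2.
def invariant(x1, x2):
    poly = (-1.0 * x1 + -0.3 * x2 + 0.88 * x1 ** 2
            + 0.15 * x1 * x2 + 0.77 * x2 ** 2)
    return poly <= 0.59 and -0.4 <= x1 <= 1.5 and -0.9 <= x2 <= 1.0

# Sample the invariant box [-0.4,1.5] x [-0.9,1.0]; report a point that
# satisfies the invariant but whose successor violates it, if one is found.
# A sat answer from the SMT solver corresponds to such a point existing.
def find_counterexample(n=200):
    xs = [-0.4 + i * 1.9 / n for i in range(n + 1)]
    ys = [-0.9 + i * 1.9 / n for i in range(n + 1)]
    for x1, x2 in itertools.product(xs, ys):
        if invariant(x1, x2) and not invariant(*step(x1, x2)):
            return (x1, x2)
    return None
```

A grid search can miss FP-specific counterexamples (rounding effects at particular bit patterns), which is exactly why Pine discharges the check with CVC4/Z3 over the FloatingPoint theory rather than by simulation alone.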
Benchmark

Size: 3948
Compressed Size: 1196
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2021-07-18
Generated By: Anastasiia Izycheva, Eva Darulova
Generated On: 2020-09-11 00:00:00
Generator: Pine (using Z3 Python API)
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1

Query 1

Status: unknown
Inferred Status: sat
Size: 3940
Compressed Size: 1207
Max. Term Depth: 30
Asserts: 1
Declared Functions: 0
Declared Constants: 4
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

not: 1, and: 6, let: 26, fp: 23, fp.add: 10, fp.mul: 19,
fp.neg: 1, fp.leq: 10, fp.eq: 2, roundNearestTiesToEven: 29

Evaluations

Evaluation     Rating      Solver              Variant                                   Result      Wallclock  CPU Time
SMT-COMP 2021  0.33 (2/3)  CVC4                CVC4-sq-final_default                     sat ✅      466.65000  466.60500
SMT-COMP 2021              UltimateEliminator  UltimateEliminator+MathSAT-5.6.6_default  unknown ❌    7.17436    4.40076
SMT-COMP 2021              Z3                  z3-4.8.11_default                         sat ✅      170.85200  170.83600
SMT-COMP 2021              Z3                  z3-4.8.4-d6df51951f4c-wrapped-sq_default  unknown ❌  141.70300  141.70600
SMT-COMP 2024              Bitwuzla            Bitwuzla                                  sat ✅       15.62869   15.52572
SMT-COMP 2024              cvc5                cvc5                                      sat ✅      889.02352  888.61869
SMT-COMP 2025              Bitwuzla            Bitwuzla                                  sat ✅       24.10294   23.96789
SMT-COMP 2025              cvc5                cvc5                                      sat ✅      865.84200  865.60814
SMT-COMP 2025              UltimateEliminator  UltimateEliminator+MathSAT                sat ✅      707.75714  711.29329