Benchmark

non-incremental/QF_NRA/20200911-Pine/1599121976975720000.smt2

These benchmarks were generated while developing the tool Pine [1], which uses
CVC4/Z3 to check inductiveness of invariants. The work is described in [2].

[1] https://github.com/izycheva/pine
[2] A. Izycheva, E. Darulova, H. Seidl, "Counterexample- and Simulation-Guided Floating-Point Loop Invariant Synthesis", SAS'20

 Loop:
   u' := u + 0.01 * v
   v' := v + 0.01 * (-0.5 * v - 9.81 * u)

 Input ranges:
   u in [0.0,0.0]
   v in [2.0,3.0]

 Invariant:
   -0.16*u - 0.03*v + 1.0*u^2 + 0.03*u*v + 0.09*v^2 <= 0.77
 and
   u in [-0.8,1.0]
   v in [-2.8,3.0]

 Query: Loop and Invariant and not Invariant'
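As a rough illustration (not part of the benchmark), the loop, invariant, and query above can be sketched in Python. The names `step` and `invariant` are chosen here for readability; the snippet only simulates trajectories from the input ranges and checks the invariant along them. The actual SMT query is stronger: it asks whether any real-valued state satisfying the invariant can violate it after one loop iteration, and the inferred status sat means such a counterexample exists even though simulated runs stay inside.

```python
def step(u, v):
    """One loop iteration from the benchmark (explicit Euler, dt = 0.01)."""
    return u + 0.01 * v, v + 0.01 * (-0.5 * v - 9.81 * u)

def invariant(u, v):
    """Polynomial invariant together with the box bounds from the benchmark."""
    poly = -0.16 * u - 0.03 * v + u * u + 0.03 * u * v + 0.09 * v * v
    return poly <= 0.77 and -0.8 <= u <= 1.0 and -2.8 <= v <= 3.0

if __name__ == "__main__":
    # Sample initial states from the input ranges u in [0.0, 0.0], v in [2.0, 3.0]
    for i in range(11):
        u, v = 0.0, 2.0 + 0.1 * i
        for _ in range(200):  # two seconds of simulated time
            assert invariant(u, v), (u, v)
            u, v = step(u, v)
    print("invariant held on all sampled trajectories")
```

This simulation-based check is in the spirit of Pine's synthesis loop [2], which combines simulations with SMT counterexamples; it cannot replace the solver call, which quantifies over all reals in the invariant region.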
Benchmark
  Size: 1927
  Compressed Size: 927
  License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
  Category: industrial
  First Occurrence: 2021-07-18
  Generated By: Anastasiia Izycheva, Eva Darulova
  Generated On: 2020-09-11 00:00:00
  Generator: Pine (using Z3 Python API)
  Dolmen OK: 1
  Strict Dolmen OK: 1
  check-sat calls: 1
Query 1
  Status: unknown
  Inferred Status: sat
  Size: 1919
  Compressed Size: 938
  Max. Term Depth: 17
  Asserts: 1
  Declared Functions: 0
  Declared Constants: 4
  Declared Sorts: 0
  Defined Functions: 0
  Defined Recursive Functions: 0
  Defined Sorts: 0
  Constants: 0
  Declared Datatypes: 0

Symbols

  not: 1    and: 6    =: 2     let: 9
  /: 18     +: 10     -: 10    *: 20
  <=: 4     >=: 6

Evaluations

Evaluation     Rating      Solver       Variant        Result      Wallclock  CPU Time
SMT-COMP 2024  0.20 (4/5)  cvc5         cvc5           sat ✅      0.27646    0.17682
                           SMTInterpol  SMTInterpol    unknown ❌  0.43558    0.46815
                           SMT-RAT      SMT-RAT        sat ✅      0.22592    0.12612
                           Yices2       Yices2         sat ✅      0.22716    0.12698
                           Z3alpha      Z3-alpha       sat ✅      0.26921    0.16966
SMT-COMP 2025  0.17 (5/6)  cvc5         cvc5           sat ✅      0.27212    0.15821
                           SMTInterpol  SMTInterpol    unknown ❌  0.44547    0.44581
                           SMT-RAT      SMT-RAT        sat ✅      0.29897    0.16937
                           Yices2       Yices2         sat ✅      0.30322    0.17488
                           Z3alpha      Z3-alpha       sat ✅      0.60716    0.48941
                           Z3           Z3-alpha-base  sat ✅      0.27691    0.15924
                                        z3siri-base    sat ✅      0.26889    0.15447