Benchmark

non-incremental/QF_NRA/20200911-Pine/1599122158826418000.smt2

These benchmarks were generated while developing the tool Pine [1], which uses
CVC4/Z3 to check inductiveness of invariants. The work is described in [2].

[1] https://github.com/izycheva/pine
[2] A. Izycheva, E. Darulova, H. Seidl: "Counterexample- and Simulation-Guided Floating-Point Loop Invariant Synthesis", SAS'20

 Loop:
   u' := u + 0.01 * v
   v' := v + 0.01 * (-0.5 * v - 9.81 * u)

 Input ranges:
   u in [0.0,0.0]
   v in [2.0,3.0]

 Invariant:
   -0.55*u + 1.0*u^2 - 0.01*u*v + 0.03*v^2 <= 0.3
 and
   u in [-0.1,0.9]
   v in [-3.1,3.0]

 Query: Loop and Invariant and not Invariant'
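The query is satisfiable exactly when the candidate invariant fails to be inductive over the reals: some state satisfies the invariant while its loop successor does not. A minimal numeric sketch of this check, using the loop and invariant above; the witness point (u, v) = (-0.1, -2.0) is hand-picked for illustration and is an assumption of this sketch, not taken from the benchmark file:

```python
def step(u, v):
    """One loop iteration: u' := u + 0.01*v; v' := v + 0.01*(-0.5*v - 9.81*u)."""
    return u + 0.01 * v, v + 0.01 * (-0.5 * v - 9.81 * u)

def invariant(u, v):
    """Polynomial constraint plus the box u in [-0.1, 0.9], v in [-3.1, 3.0]."""
    poly = -0.55 * u + 1.0 * u**2 - 0.01 * u * v + 0.03 * v**2 <= 0.3
    box = -0.1 <= u <= 0.9 and -3.1 <= v <= 3.0
    return poly and box

# Hand-picked candidate state (assumed, for illustration only):
u, v = -0.1, -2.0
assert invariant(u, v)             # the state satisfies the invariant...
assert not invariant(*step(u, v))  # ...but its successor has u' = -0.12,
                                   # which leaves the box [-0.1, 0.9]
```

This behavior is consistent with the inferred status "sat" reported below: a solver can exhibit such a state as a model of Loop and Invariant and not Invariant'.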
Benchmark

  Size: 1834
  Compressed Size: 900
  License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
  Category: industrial
  First Occurrence: 2021-07-18
  Generated By: Anastasiia Izycheva, Eva Darulova
  Generated On: 2020-09-11 00:00:00
  Generator: Pine (using Z3 Python API)
  Dolmen OK: 1
  strict Dolmen OK: 1
  check-sat calls: 1
Query 1

  Status: unknown
  Inferred Status: sat
  Size: 1826
  Compressed Size: 913
  Max. Term Depth: 14
  Asserts: 1
  Declared Functions: 0
  Declared Constants: 4
  Declared Sorts: 0
  Defined Functions: 0
  Defined Recursive Functions: 0
  Defined Sorts: 0
  Constants: 0
  Declared Datatypes: 0

Symbols (occurrence counts)

  not: 1   and: 6   =: 2    let: 7
  /: 18    +: 8     -: 10   *: 18
  <=: 4    >=: 6

Evaluations

Columns: Solver | Variant | Result | Wallclock (s) | CPU Time (s)

SMT-COMP 2021 (rating 0.50, 5/10)
  CVC4 | CVC4-sq-final_default | sat ✅ | 0.03201 | 0.03233
  cvc5 | cvc5-fixed_default | sat ✅ | 0.03452 | 0.03508
  Par4 | Par4-wrapped-sq_default | sat ✅ | 0.01844 | 0.00776
  UltimateEliminator | UltimateEliminator+MathSAT-5.6.6_default | unknown ❌ | 1200.02000 | 1203.79000
  Vampire | vampire_smt_4.6-fixed_vampire_smtcomp | unknown ❌ | 1200.05000 | 4636.40000
  YicesQS | yices-QS-2021-06-13under10_default | sat ✅ | 0.04697 | 0.04698
  Z3 | z3-4.8.11_default | sat ✅ | 0.01896 | 0.01907
  Z3 | z3-4.8.8_default | sat ✅ | 0.03456 | 0.03460

SMT-COMP 2022 (rating 0.11, 8/9)
  cvc5 | cvc5-default-2022-07-02-b15e116-wrapped_sq | sat ✅ | 0.04965 | 0.05015
  MathSAT | MathSAT-5.6.8_default | unknown ❌ | 1200.08000 | 1199.92000
  NRA-LS | NRA-LS-FINAL_default | sat ✅ | 0.04938 | 0.04942
  Par4 | Par4-wrapped-sq_default | sat ✅ | 0.02827 | 0.00715
  SMT-RAT | SMT-RAT-MCSAT_default | sat ✅ | 0.01713 | 0.01710
  veriT | veriT+raSAT+Redlog_default | sat ✅ | 0.22296 | 0.22434
  Yices2 | Yices 2.6.2 for SMTCOMP 2021_default | sat ✅ | 0.02161 | 0.02158
  Z3 | z3-4.8.17_default | sat ✅ | 0.01915 | 0.02104
  Z3++ | z3++0715_default | sat ✅ | 0.27799 | 0.27805

SMT-COMP 2023
  cvc5 | cvc5-default-2023-05-16-ea045f305_sq | sat ✅ | 0.02350 | 0.02406
  NRA-LS | cvc5-NRA-LS-sq_default | sat ✅ | 0.03995 | 0.04001
  Par4 | Par4-wrapped-sq_default | sat ✅ | 0.01748 | 0.00654
  SMT-RAT | SMT-RAT-MCSAT_default | sat ✅ | 0.02488 | 0.02486
  Yices2 | Yices 2 for SMTCOMP 2023_default | sat ✅ | 0.02015 | 0.02013
  Z3alpha | z3alpha_default | sat ✅ | 0.02338 | 0.02361
  Z3++ | z3++0715_default | sat ✅ | 0.28002 | 0.28011
  Z3++ | Z3++_sq_0526_default | sat ✅ | 0.28718 | 0.28727

SMT-COMP 2024 (rating 0.20, 4/5)
  cvc5 | cvc5 | sat ✅ | 0.22156 | 0.12206
  SMTInterpol | SMTInterpol | unknown ❌ | 0.42135 | 0.43517
  SMT-RAT | SMT-RAT | sat ✅ | 0.21798 | 0.11822
  Yices2 | Yices2 | sat ✅ | 0.22187 | 0.12212
  Z3alpha | Z3-alpha | sat ✅ | 0.29310 | 0.19369

SMT-COMP 2025 (rating 0.17, 5/6)
  cvc5 | cvc5 | sat ✅ | 0.30714 | 0.17427
  SMTInterpol | SMTInterpol | unknown ❌ | 0.44157 | 0.41752
  SMT-RAT | SMT-RAT | sat ✅ | 0.29651 | 0.16801
  Yices2 | Yices2 | sat ✅ | 0.29376 | 0.16647
  Z3alpha | Z3-alpha | sat ✅ | 0.61156 | 0.48638
  Z3 | Z3-alpha-base | sat ✅ | 0.30895 | 0.18542
  Z3 | z3siri-base | sat ✅ | 0.30308 | 0.17357