Benchmark

non-incremental/QF_BV/20210312-Bouvier/vlsat3_a32.smt2

Publications:

[1] Pierre Bouvier, Hubert Garavel, and Hernan Ponce de Leon.
    "Automatic Decomposition of Petri Nets into Automata Networks -
    A Synthetic Account". Proceedings PETRI NETS 2020, LNCS 12152,
    Springer. https://doi.org/10.1007/978-3-030-51831-8_1

[2] Hubert Garavel. "Nested-Unit Petri Nets". Journal of Logical
    and Algebraic Methods in Programming, vol. 104, Elsevier, 2019. 
    https://doi.org/10.1016/j.jlamp.2018.11.005

In [1], several methods have been proposed for decomposing an ordinary,
safe Petri net into a flat, unit-safe NUPN [2]. These methods are
implemented in a complete tool chain involving SAT solvers, SMT solvers,
and tools for graph coloring and maximal-clique finding. From a data set
of more than 12,000 NUPN models, more than 51,000 SMT formulas were
generated, from which a subset of 1200 interesting formulas was
carefully selected for use as SMT-LIB 2.6 benchmarks.
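The generated files are plain QF_BV problems. A hypothetical fragment in
the style of such an encoding (the names, bit-widths, and constraints
below are invented for illustration and are not taken from the actual
file) might look like:

```smt2
; Hypothetical sketch only: the real vlsat3_a32.smt2 encoding is not
; reproduced here; names and widths are illustrative assumptions.
(set-logic QF_BV)
(declare-const p0 (_ BitVec 19))  ; e.g. one bit per unit (19 units)
(declare-const p1 (_ BitVec 19))
; a constraint combining bit-vector equality and bvand, the two most
; frequent operators in this benchmark:
(assert (= (bvand p0 p1) (_ bv0 19)))
(check-sat)
```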

Original filename: vlsat3_a32.smt2

Specific parameters for the present benchmark:
- number of places: 87
- number of units: 19
- number of edges in the concurrency graph: 2585
- number of variables: 87
- number of uninterpreted functions: 0
- number of asserts: 2672
- total number of operators in asserts: 13330
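As a quick sanity check on these figures (the per-edge/per-place reading
of the assert count is an assumption on our part, not something the
benchmark documents):

```python
from math import comb

# Parameters stated for this benchmark:
places = 87     # declared bit-vector constants, one per place
edges = 2585    # edges in the concurrency graph
asserts = 2672  # assert commands in the file

# The concurrency graph covers roughly 69% of all place pairs:
max_edges = comb(places, 2)
print(max_edges)                    # 3741
print(round(edges / max_edges, 2))  # 0.69

# One assert per edge plus one per place would account exactly for the
# stated assert count (an assumed, undocumented reading of the encoding):
print(edges + places == asserts)    # True
```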
Benchmark

Size: 139860
Compressed Size: 9900
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2021-07-18
Generated By: Pierre Bouvier
Generated On: 2021-03-12 00:00:00
Generator:
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1

Status: unsat
Inferred Status: unsat
Size: 139852
Compressed Size: 9438
Max. Term Depth: 3
Asserts: 2672
Declared Functions: 0
Declared Constants: 87
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

= : 2585, distinct: 87, extract: 18, bvand: 2585
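The symbol counts line up with the benchmark parameters given earlier; a
small cross-check (the per-edge/per-variable interpretation is our
assumption, not documented by the benchmark):

```python
# Symbol counts reported for this benchmark:
symbols = {"=": 2585, "distinct": 87, "extract": 18, "bvand": 2585}

edges = 2585     # concurrency-graph edges
variables = 87   # declared bit-vector constants

# "=" and "bvand" each match the edge count, and "distinct" matches the
# variable count -- consistent with one constraint per edge and one per
# variable (an assumed, undocumented reading of the encoding):
print(symbols["="] == edges and symbols["bvand"] == edges)  # True
print(symbols["distinct"] == variables)                     # True
```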

Evaluations

Evaluation     Rating      Solver              Variant                                         Result      Wallclock   CPU Time
SMT-COMP 2023  0.50 (3/6)  Bitwuzla            Bitwuzla-fixed_default                          unsat ✅     124.66900   124.65000
                           cvc5                cvc5-default-2023-05-16-ea045f305_sq            unknown ❌   1200.11000  1200.01000
                           STP                 STP 2022.4_default                              unsat ✅     89.56000    89.55800
                                               STP 2022.4_default                              unsat ✅     88.93920    88.94700
                           UltimateEliminator  UltimateIntBlastingWrapper+SMTInterpol_default  unknown ❌   1200.07000  1304.23000
                           Yices2              Yices 2 for SMTCOMP 2023_default                unknown ❌   1200.03000  1199.80000
                           Z3-Owl              z3-Owl-Final_default                            unknown ❌   1200.02000  1199.88000
                                               z3-Owl-Final_default                            unsat ✅     232.64400   232.59600
SMT-COMP 2024  0.50 (3/6)  Bitwuzla            Bitwuzla                                        unsat ✅     96.10366    95.99893
                           cvc5                cvc5                                            unknown ❌   1201.72293  1201.09395
                           SMTInterpol         SMTInterpol                                     unknown ❌   1202.22046  1254.62604
                           STP                 STP                                             unsat ✅     160.56646   160.39658
                           Yices2              Yices2                                          unsat ✅     777.21758   777.08190
                           Z3alpha             Z3-alpha                                        unknown ❌   1201.71794  1201.00916
SMT-COMP 2025  0.22 (7/9)  Bitwuzla            Bitwuzla                                        unsat ✅     87.49211    87.34965
                                               Bitwuzla-MachBV-base                            unsat ✅     110.20367   110.06052
                           Bitwuzla-MachBV     Bitwuzla-MachBV                                 unsat ✅     2.00541     1.88175
                           BVDecide            bv_decide                                       unsat ✅     139.27975   139.14563
                                               bv_decide-nokernel                              unsat ✅     102.60116   102.45396
                           cvc5                cvc5                                            unsat ✅     925.91586   925.69894
                           SMTInterpol         SMTInterpol                                     unknown ❌   1201.78673  1237.04784
                           Yices2              Yices2                                          unsat ✅     2.92333     2.80546
                           Z3alpha             Z3-alpha                                        unsat ✅     583.77017   2332.73786
                           Z3                  Z3-alpha-base                                   unsat ✅     338.12259   337.90900
                                               Z3-Owl-base                                     unknown ❌   1201.28485  1200.95127
                                               z3siri-base                                     unsat ✅     321.01887   320.86135
                           Z3-Owl              Z3-Owl                                          unknown ❌   1201.75635  1200.94546