Benchmark

non-incremental/QF_BV/20210312-Bouvier/vlsat3_a23.smt2

Publications:

[1] Pierre Bouvier, Hubert Garavel, and Hernan Ponce de Leon.
    "Automatic Decomposition of Petri Nets into Automata Networks -
    A Synthetic Account". Proceedings PETRI NETS 2020, LNCS 12152,
    Springer. https://doi.org/10.1007/978-3-030-51831-8_1

[2] Hubert Garavel. "Nested-Unit Petri Nets". Journal of Logical
    and Algebraic Methods in Programming, vol. 104, Elsevier, 2019. 
    https://doi.org/10.1016/j.jlamp.2018.11.005

In [1], several methods are proposed for decomposing an ordinary, safe
Petri net into a flat, unit-safe NUPN [2]. These methods are implemented
in a complete tool chain involving SAT solvers, SMT solvers, and tools
for graph coloring and maximal-clique finding. From a data set of more
than 12,000 NUPN models, more than 51,000 SMT formulas were generated,
from which a subset of 1,200 interesting formulas was carefully selected
for use as SMT-LIB 2.6 benchmarks.
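
The actual constraints are given in the benchmark file itself, but the
symbol counts listed further below (=, distinct, extract, bvand) suggest
formulas that combine a distinctness constraint over bit-vector constants
with many pairwise bvand/= constraints. A purely hypothetical SMT-LIB 2.6
sketch of that shape (all names and bit-widths invented for illustration):

```smt2
; Hypothetical sketch only -- NOT the contents of vlsat3_a23.smt2.
; One bit-vector constant per place; all places are pairwise distinct,
; and selected pairs are constrained via a bitwise-and test.
(set-logic QF_BV)
(declare-const p1 (_ BitVec 16))
(declare-const p2 (_ BitVec 16))
(declare-const p3 (_ BitVec 16))
(assert (distinct p1 p2 p3))
(assert (= (bvand p1 p2) (_ bv0 16)))
(assert (= (bvand p1 p3) (_ bv0 16)))
(check-sat)
```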

Original filename: vlsat3_a23.smt2

Specific parameters for the present benchmark:
- number of places: 120
- number of units: 16
- number of edges in the concurrency graph: 6855
- number of variables: 120
- number of uninterpreted functions: 0
- number of asserts: 6975
- total number of operators in asserts: 34755
Benchmark
- Size: 341481
- Compressed Size: 21105
- License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
- Category: industrial
- First Occurrence: 2021-07-18
- Generated By: Pierre Bouvier
- Generated On: 2021-03-12 00:00:00
- Generator:
- Dolmen: OK 1
- strict Dolmen: OK 1
- check-sat calls: 1
Query 1
- Status: unsat
- Inferred Status: unsat
- Size: 341473
- Compressed Size: 21112
- Max. Term Depth: 3
- Asserts: 6975
- Declared Functions: 0
- Declared Constants: 120
- Declared Sorts: 0
- Defined Functions: 0
- Defined Recursive Functions: 0
- Defined Sorts: 0
- Constants: 0
- Declared Datatypes: 0

Symbols

- =: 6855
- distinct: 120
- extract: 15
- bvand: 6855

Evaluations

Times are in seconds (wallclock / CPU).

SMT-COMP 2021:
- Bitwuzla / Bitwuzla-fixed_default: unsat ✅ (176.78700 / 176.76700)
- MathSAT / mathsat-5.6.6_default: unsat ✅ (757.07400 / 756.95600)
- STP / STP 2021.0_default: unsat ✅ (224.66300 / 890.32800)
- Z3 / z3-4.8.11_default: unsat ✅ (147.56500 / 147.55800)

SMT-COMP 2022, rating 0.14 (6/7):
- Bitwuzla / Bitwuzla-fixed_default: unsat ✅ (105.39800 / 105.39200)
- Bitwuzla / Bitwuzla-wrapped_default: unsat ✅ (218.40900 / 218.35800)
- cvc5 / cvc5-default-2022-07-02-b15e116-wrapped_sq: unsat ✅ (896.61800 / 896.44400)
- MathSAT / MathSAT-5.6.8_default: unknown ❌ (1200.03000 / 1199.81000)
- STP / STP 2022.4_default: unsat ✅ (122.00100 / 121.98800)
- Yices2 / Yices 2.6.2 for SMTCOMP 2021_default: unsat ✅ (75.36540 / 75.36520)
- Z3++BV / z3++bv_0702_default: unsat ✅ (867.64000 / 867.57000)
- Z3 / z3-4.8.17_default: unsat ✅ (150.75700 / 150.71700)

SMT-COMP 2023, rating 0.33 (4/6):
- Bitwuzla / Bitwuzla-fixed_default: unsat ✅ (170.71700 / 170.62000)
- cvc5 / cvc5-default-2023-05-16-ea045f305_sq: unsat ✅ (575.80100 / 575.79000)
- STP / STP 2022.4_default: unsat ✅ (126.62600 / 126.62400)
- STP / STP 2022.4_default: unsat ✅ (126.45700 / 126.45600)
- UltimateEliminator / UltimateIntBlastingWrapper+SMTInterpol_default: unknown ❌ (1200.06000 / 1278.94000)
- Yices2 / Yices 2 for SMTCOMP 2023_default: unknown ❌ (1200.10000 / 1200.04000)
- Z3-Owl / z3-Owl-Final_default: unknown ❌ (1200.12000 / 1200.09000)
- Z3-Owl / z3-Owl-Final_default: unsat ✅ (303.03200 / 303.00900)

SMT-COMP 2025, rating 0.22 (7/9):
- Bitwuzla / Bitwuzla: unsat ✅ (142.81985 / 142.67383)
- Bitwuzla / Bitwuzla-MachBV-base: unsat ✅ (77.59944 / 77.47600)
- Bitwuzla-MachBV / Bitwuzla-MachBV: unsat ✅ (1.03268 / 0.91022)
- BVDecide / bv_decide: unsat ✅ (240.47015 / 240.30059)
- BVDecide / bv_decide-nokernel: unsat ✅ (118.56167 / 118.40919)
- cvc5 / cvc5: unsat ✅ (208.09833 / 207.95220)
- SMTInterpol / SMTInterpol: unknown ❌ (1201.88921 / 1564.27629)
- Yices2 / Yices2: unsat ✅ (1.77801 / 1.63205)
- Z3alpha / Z3-alpha: unsat ✅ (119.28371 / 475.40044)
- Z3 / Z3-alpha-base: unsat ✅ (222.99860 / 222.83539)
- Z3 / Z3-Owl-base: unsat ✅ (1190.15650 / 1189.83289)
- Z3 / z3siri-base: unsat ✅ (211.57274 / 211.43577)
- Z3-Owl / Z3-Owl: unknown ❌ (1201.75639 / 1200.87745)