Benchmark

non-incremental/QF_BV/20210312-Bouvier/vlsat3_g65.smt2

Publications:

[1] Pierre Bouvier, Hubert Garavel, and Hernan Ponce de Leon.
    "Automatic Decomposition of Petri Nets into Automata Networks -
    A Synthetic Account". Proceedings PETRI NETS 2020, LNCS 12152,
    Springer. https://doi.org/10.1007/978-3-030-51831-8_1

[2] Hubert Garavel. "Nested-Unit Petri Nets". Journal of Logical
    and Algebraic Methods in Programming, vol. 104, Elsevier, 2019. 
    https://doi.org/10.1016/j.jlamp.2018.11.005

In [1], several methods were proposed for decomposing an ordinary,
safe Petri net into a flat, unit-safe NUPN [2]. These methods are
implemented in a complete tool chain involving SAT solvers, SMT
solvers, and tools for graph coloring and maximal-clique finding.
From a data set of 12,000+ NUPN models, 51,000+ SMT formulas were
generated, from which a subset of 1200 interesting formulas was
carefully selected for use as SMT-LIB 2.6 benchmarks.
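The graph-coloring step can be illustrated with a minimal sketch (a hypothetical helper, not the actual tool chain of [1]): two places that can be marked concurrently must end up in different units, so assigning places to units amounts to properly coloring the concurrency graph. A simple greedy heuristic, under these assumptions:

```python
def greedy_unit_assignment(num_places, concurrency_edges):
    """Assign each place to a unit so that no two concurrent places
    share a unit, i.e. properly color the concurrency graph.
    Hypothetical illustration, not the algorithm used in [1]."""
    adj = {p: set() for p in range(num_places)}
    for a, b in concurrency_edges:
        adj[a].add(b)
        adj[b].add(a)
    unit_of = {}
    # Visit places in order of decreasing degree (a common heuristic).
    for p in sorted(adj, key=lambda p: -len(adj[p])):
        taken = {unit_of[q] for q in adj[p] if q in unit_of}
        unit = 0
        while unit in taken:
            unit += 1
        unit_of[p] = unit
    return unit_of

# Toy concurrency graph on 4 places; place 0 is concurrent with all others.
edges = [(0, 1), (0, 2), (0, 3), (1, 2)]
units = greedy_unit_assignment(4, edges)
```

Minimizing the number of units (colors) is the hard part in practice, which is where the SAT/SMT solvers and clique tools of the tool chain come in.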

Original filename: vlsat3_g65.smt2

Specific parameters for the present benchmark:
- number of places: 164
- number of units: 100
- number of edges in the concurrency graph: 13302
- number of variables: 164
- number of uninterpreted functions: 0
- number of asserts: 13466
- total number of operators in asserts: 67794
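Statistics like the assert and operator counts above can be computed mechanically from the SMT-LIB text. A minimal Python sketch, run here on a toy QF_BV script rather than the actual vlsat3_g65.smt2 (the exact counting convention behind the reported figures is an assumption):

```python
import re

def assert_stats(smtlib_text, operators):
    """Count (assert ...) commands and occurrences of the given
    operator symbols in an SMT-LIB script. Sketch only; the official
    statistics may use a different convention."""
    # Tokenize into parentheses and symbol/literal runs.
    tokens = re.findall(r'\(|\)|[^\s()]+', smtlib_text)
    num_asserts = sum(
        1 for i, t in enumerate(tokens[:-1])
        if t == '(' and tokens[i + 1] == 'assert'
    )
    op_counts = {op: tokens.count(op) for op in operators}
    return num_asserts, op_counts

# Toy QF_BV script using the operator kinds this benchmark relies on.
toy = """
(set-logic QF_BV)
(declare-const x (_ BitVec 8))
(declare-const y (_ BitVec 8))
(assert (distinct x y))
(assert (= (bvand x y) #x00))
(check-sat)
"""
n, ops = assert_stats(toy, ['distinct', '=', 'bvand'])
```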
Benchmark

Size: 1790038
Compressed Size: 50963
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2021-07-18
Generated By: Pierre Bouvier
Generated On: 2021-03-12 00:00:00
Generator:
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1

Status: sat
Inferred Status: sat
Size: 1790030
Compressed Size: 50971
Max. Term Depth: 3
Asserts: 13466
Declared Functions: 0
Declared Constants: 164
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

=: 13302
distinct: 164
extract: 99
bvand: 13302

Evaluations

Evaluation    | Rating     | Solver             | Variant                                        | Result     | Wallclock | CPU Time
SMT-COMP 2022 |            | Bitwuzla           | Bitwuzla-fixed_default                         | sat ✅     | 9.90548   | 9.90524
SMT-COMP 2022 |            | Bitwuzla           | Bitwuzla-wrapped_default                       | sat ✅     | 9.23542   | 9.23567
SMT-COMP 2022 |            | cvc5               | cvc5-default-2022-07-02-b15e116-wrapped_sq     | sat ✅     | 32.27000  | 32.26890
SMT-COMP 2022 |            | MathSAT            | MathSAT-5.6.8_default                          | sat ✅     | 8.98162   | 8.98066
SMT-COMP 2022 |            | STP                | STP 2022.4_default                             | sat ✅     | 10.73250  | 10.72760
SMT-COMP 2022 |            | Yices2             | Yices 2.6.2 for SMTCOMP 2021_default           | sat ✅     | 25.75600  | 25.75530
SMT-COMP 2022 |            | Z3++BV             | z3++bv_0702_default                            | sat ✅     | 9.19716   | 9.19639
SMT-COMP 2022 |            | Z3                 | z3-4.8.17_default                              | sat ✅     | 3.51029   | 3.51231
SMT-COMP 2023 | 0.17 (5/6) | Bitwuzla           | Bitwuzla-fixed_default                         | sat ✅     | 14.53120  | 14.52860
SMT-COMP 2023 | 0.17 (5/6) | cvc5               | cvc5-default-2023-05-16-ea045f305_sq           | sat ✅     | 33.00610  | 33.00220
SMT-COMP 2023 | 0.17 (5/6) | STP                | STP 2022.4_default                             | sat ✅     | 10.75930  | 10.75890
SMT-COMP 2023 | 0.17 (5/6) | STP                | STP 2022.4_default                             | sat ✅     | 10.69060  | 10.69060
SMT-COMP 2023 | 0.17 (5/6) | UltimateEliminator | UltimateIntBlastingWrapper+SMTInterpol_default | unknown ❌ | 262.35000 | 778.56500
SMT-COMP 2023 | 0.17 (5/6) | Yices2             | Yices 2 for SMTCOMP 2023_default               | sat ✅     | 145.55100 | 145.50800
SMT-COMP 2023 | 0.17 (5/6) | Z3-Owl             | z3-Owl-Final_default                           | sat ✅     | 135.51500 | 135.52400
SMT-COMP 2023 | 0.17 (5/6) | Z3-Owl             | z3-Owl-Final_default                           | sat ✅     | 140.66000 | 140.62700
SMT-COMP 2024 | 0.17 (5/6) | Bitwuzla           | Bitwuzla                                       | sat ✅     | 5.67873   | 5.57862
SMT-COMP 2024 | 0.17 (5/6) | cvc5               | cvc5                                           | sat ✅     | 19.88629  | 19.78601
SMT-COMP 2024 | 0.17 (5/6) | SMTInterpol        | SMTInterpol                                    | unknown ❌ | 118.95933 | 393.29274
SMT-COMP 2024 | 0.17 (5/6) | STP                | STP                                            | sat ✅     | 6.87652   | 6.77469
SMT-COMP 2024 | 0.17 (5/6) | Yices2             | Yices2                                         | sat ✅     | 6.75075   | 6.64770
SMT-COMP 2024 | 0.17 (5/6) | Z3alpha            | Z3-alpha                                       | sat ✅     | 366.88940 | 366.58439
SMT-COMP 2025 | 0.11 (8/9) | Bitwuzla           | Bitwuzla                                       | sat ✅     | 2.62557   | 2.50681
SMT-COMP 2025 | 0.11 (8/9) | Bitwuzla           | Bitwuzla-MachBV-base                           | sat ✅     | 5.20424   | 5.08709
SMT-COMP 2025 | 0.11 (8/9) | Bitwuzla-MachBV    | Bitwuzla-MachBV                                | sat ✅     | 3.72710   | 3.59643
SMT-COMP 2025 | 0.11 (8/9) | BVDecide           | bv_decide                                      | sat ✅     | 169.96508 | 169.87862
SMT-COMP 2025 | 0.11 (8/9) | BVDecide           | bv_decide-nokernel                             | sat ✅     | 161.10161 | 160.97143
SMT-COMP 2025 | 0.11 (8/9) | cvc5               | cvc5                                           | sat ✅     | 12.36783  | 12.24754
SMT-COMP 2025 | 0.11 (8/9) | SMTInterpol        | SMTInterpol                                    | unknown ❌ | 59.36876  | 198.02946
SMT-COMP 2025 | 0.11 (8/9) | Yices2             | Yices2                                         | sat ✅     | 2.89269   | 2.77327
SMT-COMP 2025 | 0.11 (8/9) | Z3alpha            | Z3-alpha                                       | sat ✅     | 9.69071   | 21.45614
SMT-COMP 2025 | 0.11 (8/9) | Z3                 | Z3-alpha-base                                  | sat ✅     | 2.80005   | 2.67608
SMT-COMP 2025 | 0.11 (8/9) | Z3                 | Z3-Owl-base                                    | sat ✅     | 9.49774   | 9.37358
SMT-COMP 2025 | 0.11 (8/9) | Z3                 | z3siri-base                                    | sat ✅     | 2.75630   | 2.63567
SMT-COMP 2025 | 0.11 (8/9) | Z3-Owl             | Z3-Owl                                         | sat ✅     | 57.53358  | 57.40345