Benchmark

non-incremental/QF_BV/20210312-Bouvier/vlsat3_a13.smt2

Publications:

[1] Pierre Bouvier, Hubert Garavel, and Hernán Ponce de León.
    "Automatic Decomposition of Petri Nets into Automata Networks -
    A Synthetic Account". Proceedings PETRI NETS 2020, LNCS 12152,
    Springer. https://doi.org/10.1007/978-3-030-51831-8_1

[2] Hubert Garavel. "Nested-Unit Petri Nets". Journal of Logical
    and Algebraic Methods in Programming, vol. 104, Elsevier, 2019. 
    https://doi.org/10.1016/j.jlamp.2018.11.005

In [1], several methods were proposed for decomposing an ordinary,
safe Petri net into a flat, unit-safe NUPN [2]. These methods are
implemented in a complete tool chain involving SAT solvers, SMT
solvers, and tools for graph coloring and maximal-clique finding.
From a data set of more than 12,000 NUPN models, more than 51,000
SMT formulas were generated, from which a subset of 1200 interesting
formulas was carefully selected for use as SMT-LIB 2.6 benchmarks.

Original filename: vlsat3_a13.smt2

Specific parameters for the present benchmark:
- number of places: 80
- number of units: 16
- number of edges in the concurrency graph: 2970
- number of variables: 80
- number of uninterpreted functions: 0
- number of asserts: 3050
- total number of operators in asserts: 15210
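Given these parameters, a formula of this family might look roughly as
follows. This is a hypothetical sketch only, not the actual contents of
vlsat3_a13.smt2: the constant names, bit widths, and literal values are
illustrative assumptions; only the logic (QF_BV) and the operators
(=, distinct, extract, bvand) are taken from this page.

```smt2
; Hypothetical sketch of the formula shape (illustrative names and widths).
(set-logic QF_BV)
; One bit-vector constant per place (the real file declares 80).
(declare-const p0 (_ BitVec 16))
(declare-const p1 (_ BitVec 16))
(declare-const p2 (_ BitVec 16))
; Asserts built from distinct, bvand, and extract, as in the symbol counts.
(assert (distinct p0 p1 p2))
(assert (= (bvand p0 p1) #x0000))
(assert (= ((_ extract 3 0) p2) #x5))
(check-sat)
```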
Benchmark
Size: 149840
Compressed Size: 10957
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2021-07-18
Generated By: Pierre Bouvier
Generated On: 2021-03-12 00:00:00
Generator:
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: unsat
Inferred Status: unsat
Size: 149832
Compressed Size: 10373
Max. Term Depth: 3
Asserts: 3050
Declared Functions: 0
Declared Constants: 80
Declared Sorts: 0
Defined Functions: 0
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

=: 2970
distinct: 80
extract: 15
bvand: 2970

Evaluations

SMT-COMP 2021, rating 0.25 (3/4):
- Bitwuzla (Bitwuzla-fixed_default): unsat ✅, wallclock 101.15200 s, CPU 101.10900 s
- MathSAT (mathsat-5.6.6_default): unknown ❌, wallclock 1200.04000 s, CPU 1199.82000 s
- STP (STP 2021.0_default): unsat ✅, wallclock 165.35700 s, CPU 656.15000 s
- Z3 (z3-4.8.11_default): unsat ✅, wallclock 315.32200 s, CPU 315.26900 s

SMT-COMP 2022:
- Bitwuzla (Bitwuzla-fixed_default): unsat ✅, wallclock 111.32800 s, CPU 111.30400 s
- Bitwuzla (Bitwuzla-wrapped_default): unsat ✅, wallclock 66.20920 s, CPU 66.18840 s
- cvc5 (cvc5-default-2022-07-02-b15e116-wrapped_sq): unsat ✅, wallclock 758.26600 s, CPU 758.22800 s
- MathSAT (MathSAT-5.6.8_default): unsat ✅, wallclock 487.85200 s, CPU 487.79600 s
- STP (STP 2022.4_default): unsat ✅, wallclock 158.42400 s, CPU 158.40600 s
- Yices2 (Yices 2.6.2 for SMTCOMP 2021_default): unsat ✅, wallclock 106.32000 s, CPU 106.31600 s
- Z3++BV (z3++bv_0702_default): unsat ✅, wallclock 1087.49000 s, CPU 1087.34000 s
- Z3 (z3-4.8.17_default): unsat ✅, wallclock 188.64500 s, CPU 188.64500 s

SMT-COMP 2023, rating 0.33 (4/6):
- Bitwuzla (Bitwuzla-fixed_default): unsat ✅, wallclock 66.72980 s, CPU 66.73550 s
- cvc5 (cvc5-default-2023-05-16-ea045f305_sq): unsat ✅, wallclock 1119.11000 s, CPU 1119.06000 s
- STP (STP 2022.4_default): unsat ✅, wallclock 150.88100 s, CPU 150.88200 s
- STP (STP 2022.4_default): unsat ✅, wallclock 151.33400 s, CPU 151.31200 s
- UltimateEliminator (UltimateIntBlastingWrapper+SMTInterpol_default): unknown ❌, wallclock 1200.05000 s, CPU 1326.11000 s
- Yices2 (Yices 2 for SMTCOMP 2023_default): unknown ❌, wallclock 1200.02000 s, CPU 1199.90000 s
- Z3-Owl (z3-Owl-Final_default): unknown ❌, wallclock 1200.11000 s, CPU 1200.07000 s
- Z3-Owl (z3-Owl-Final_default): unsat ✅, wallclock 141.95000 s, CPU 141.91500 s

SMT-COMP 2025, rating 0.11 (8/9):
- Bitwuzla (Bitwuzla): unsat ✅, wallclock 55.61752 s, CPU 55.47763 s
- Bitwuzla (Bitwuzla-MachBV-base): unsat ✅, wallclock 61.32515 s, CPU 61.18254 s
- Bitwuzla-MachBV (Bitwuzla-MachBV): unsat ✅, wallclock 0.96858 s, CPU 0.83624 s
- BVDecide (bv_decide): unsat ✅, wallclock 142.80925 s, CPU 142.66393 s
- BVDecide (bv_decide-nokernel): unsat ✅, wallclock 101.92080 s, CPU 101.80386 s
- cvc5 (cvc5): unsat ✅, wallclock 549.62617 s, CPU 549.42090 s
- SMTInterpol (SMTInterpol): unknown ❌, wallclock 1201.77248 s, CPU 1232.66322 s
- Yices2 (Yices2): unsat ✅, wallclock 0.94621 s, CPU 0.81791 s
- Z3alpha (Z3-alpha): unsat ✅, wallclock 335.61184 s, CPU 1340.48224 s
- Z3 (Z3-alpha-base): unsat ✅, wallclock 255.43650 s, CPU 255.29514 s
- Z3 (Z3-Owl-base): unknown ❌, wallclock 1201.25009 s, CPU 1201.05968 s
- Z3 (z3siri-base): unsat ✅, wallclock 241.45433 s, CPU 241.31539 s
- Z3-Owl (Z3-Owl): unsat ✅, wallclock 639.97153 s, CPU 639.76327 s