Benchmark

non-incremental/QF_LIA/2019-cmodelsdiff/labyrinth/0011-labyrinth-12-0.smt2

[1] Da Shen and Yuliya Lierler.
    "SMT-based Answer Set Solver CMODELS-DIFF (System Description)."
    34th International Conference on Logic Programming (ICLP 2018).
Benchmark
Size: 14608175
Compressed Size: 1545489
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2021-07-18
Generated By: Da Shen, Yuliya Lierler
Generated On: 2019-04-29 00:00:00
Generator: CMODELS-DIFF
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1
Status: unknown
Inferred Status: sat
Size: 14608167
Compressed Size: 1545497
Max. Term Depth: 4
Asserts: 222426
Declared Functions: 0
Declared Constants: 68796
Declared Sorts: 0
Defined Functions: 2
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0
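The "Status" vs. "Inferred Status" distinction above: the benchmark file's own `(set-info :status ...)` annotation says `unknown`, while every solver in the evaluations below answered `sat`. A minimal sketch (hypothetical helper, assuming the standard SMT-LIB header shape) of extracting the declared status:

```python
import re

def declared_status(smt2_text: str) -> str:
    # Hypothetical helper: read the file's own :status annotation,
    # defaulting to "unknown" when none is present.
    m = re.search(r"\(set-info :status (\w+)\)", smt2_text)
    return m.group(1) if m else "unknown"

header = "(set-logic QF_LIA)\n(set-info :status unknown)\n(check-sat)"
print(declared_status(header))  # unknown
```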

Symbols

ite: 2    not: 265233    or: 212074    =>: 4752
Int: 2    +: 4752    -: 4752    <: 2
<=: 1331    >=: 6083
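Per-symbol counts like these can be reproduced with a simple token scan over the benchmark text. A minimal sketch, assuming a naive tokenizer that strips line comments and splits on parentheses and whitespace (it does not handle string literals or quoted symbols):

```python
import re
from collections import Counter

# Symbols tabulated in the report above.
TRACKED = {"ite", "not", "or", "=>", "Int", "+", "-", "<", "<=", ">="}

def count_symbols(smt2_text: str) -> Counter:
    # Drop ;-style line comments, then split on whitespace and parens.
    no_comments = re.sub(r";[^\n]*", "", smt2_text)
    tokens = re.findall(r"[^\s()]+", no_comments)
    return Counter(t for t in tokens if t in TRACKED)

example = "(assert (=> (not p) (or q (<= x 3))))"
print(count_symbols(example))
```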

Evaluations

Evaluation | Solver | Variant | Result | Rating | Wallclock (s) | CPU Time (s)
SMT-COMP 2021 | cvc5 | cvc5-fixed_default | sat | ✅ | 26.97730 | 26.97480
SMT-COMP 2021 | MathSAT | mathsat-5.6.6_default | sat | ✅ | 7.49612 | 7.49565
SMT-COMP 2021 | OpenSMT | OpenSMT-fixed_default | sat | ✅ | 57.77460 | 57.74490
SMT-COMP 2021 | Par4 | Par4-wrapped-sq_default | sat | ✅ | 2.10353 | 5.99783
SMT-COMP 2021 | SMTInterpol | smtinterpol-2.5-823-g881e8631_default | sat | ✅ | 29.62470 | 47.30260
SMT-COMP 2021 | veriT | veriT_default | sat | ✅ | 8.85092 | 8.84986
SMT-COMP 2021 | Z3 | z3-4.8.11_default | sat | ✅ | 14.84610 | 14.84020
SMT-COMP 2024 | cvc5 | cvc5 | sat | ✅ | 14.02654 | 13.92655
SMT-COMP 2024 | OpenSMT | OpenSMT | sat | ✅ | 9.03795 | 8.93285
SMT-COMP 2024 | SMTInterpol | SMTInterpol | sat | ✅ | 280.05667 | 673.43407
SMT-COMP 2024 | Yices2 | Yices2 | sat | ✅ | 1.96319 | 1.86313
SMT-COMP 2024 | Z3alpha | Z3-alpha | sat | ✅ | 11.52902 | 11.42307
SMT-COMP 2025 | cvc5 | cvc5 | sat | ✅ | 13.26719 | 13.15538
SMT-COMP 2025 | OpenSMT | OpenSMT | sat | ✅ | 13.41682 | 13.29743
SMT-COMP 2025 | SMTInterpol | SMTInterpol | sat | ✅ | 23.05469 | 41.25682
SMT-COMP 2025 | Yices2 | Yices2 | sat | ✅ | 1.04207 | 0.92288
SMT-COMP 2025 | Z3alpha | Z3-alpha | sat | ✅ | 9.70738 | 23.63705
SMT-COMP 2025 | Z3 | Z3-alpha-base | sat | ✅ | 7.71614 | 7.59403