Benchmark

non-incremental/QF_LIA/2019-cmodelsdiff/randomNontight/n40-sat-b19.lp.smt2

[1] Da Shen and Yuliya Lierler.
    "SMT-based Answer Set Solver CMODELS-DIFF (System Description)"
    34th International Conference on Logic Programming (2018)
Benchmark

Size: 481406
Compressed Size: 50341
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2021-07-18
Generated By: Da Shen, Yuliya Lierler
Generated On: 2019-04-29 00:00:00
Generator: CMODELS-DIFF
Dolmen OK: 1
strict Dolmen OK: 1
check-sat calls: 1
Query 1

Status: unknown
Inferred Status: sat
Size: 481398
Compressed Size: 50348
Max. Term Depth: 4
Asserts: 6587
Declared Functions: 0
Declared Constants: 1947
Declared Sorts: 0
Defined Functions: 2
Defined Recursive Functions: 0
Defined Sorts: 0
Constants: 0
Declared Datatypes: 0

Symbols

Symbol  Count
ite     2
not     8748
or      5636
=>      802
Int     2
+       802
-       802
<       2
<=      40
>=      842
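The counts above tally occurrences of each SMT-LIB symbol in the query. As a hedged illustration only (not the actual benchmark content, which declares 1947 Int constants and contains 6587 asserts), a QF_LIA script using these symbols might look like:

```smt2
; Hypothetical sketch: shows how the counted symbols
; (ite, not, or, =>, Int, +, -, <, <=, >=) typically appear.
(set-logic QF_LIA)
(declare-const x Int)
(declare-const y Int)
(assert (=> (not (< x 0)) (or (<= (+ x y) 40) (>= (- x y) 2))))
(assert (>= (ite (< x y) x y) 0))
(check-sat)
```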

Evaluations

Evaluation     Solver       Variant                                     Result   Wallclock (s)  CPU Time (s)
SMT-COMP 2022  cvc5         cvc5-default-2022-07-02-b15e116-wrapped_sq  sat ✅    1.57984        1.58040
SMT-COMP 2022  MathSAT      MathSAT-5.6.8_default                       sat ✅    0.20180        0.20168
SMT-COMP 2022  Par4         Par4-wrapped-sq_default                     sat ✅    0.06225        0.05573
SMT-COMP 2022  veriT        veriT_default                               sat ✅    0.22486        0.22507
SMT-COMP 2022  Yices2       Yices 2.6.2 for SMTCOMP 2021_default        sat ✅    0.05141        0.05136
SMT-COMP 2022  Z3           z3-4.8.17_default                           sat ✅    0.24505        0.24695
SMT-COMP 2024  cvc5         cvc5                                        sat ✅    0.67928        0.57936
SMT-COMP 2024  OpenSMT      OpenSMT                                     sat ✅    0.35391        0.25402
SMT-COMP 2024  SMTInterpol  SMTInterpol                                 sat ✅    3.16450        9.16339
SMT-COMP 2024  Yices2       Yices2                                      sat ✅    0.38290        0.28308
SMT-COMP 2024  Z3alpha      Z3-alpha                                    sat ✅    0.46637        0.36679
SMT-COMP 2025  cvc5         cvc5                                        sat ✅    0.60860        0.48990
SMT-COMP 2025  OpenSMT      OpenSMT                                     sat ✅    0.36032        0.24375
SMT-COMP 2025  SMTInterpol  SMTInterpol                                 sat ✅    4.02874        10.94967
SMT-COMP 2025  Yices2       Yices2                                      sat ✅    0.34823        0.21782
SMT-COMP 2025  Z3alpha      Z3-alpha                                    sat ✅    0.65714        1.23075
SMT-COMP 2025  Z3           Z3-alpha-base                               sat ✅    0.56156        0.44059