Benchmark

non-incremental/QF_UF/2018-Goel-hwbench/QF_UF_lann.2.prop1_ab_cti_max.smt2

Generated by the tool Averroes 2 (successor of [1]), which implements safety-property
verification for hardware systems.

This SMT problem belongs to a set of SMT problems generated by applying Averroes 2
to benchmarks derived from [2-5].

A total of 412 systems (345 from [2], 19 from [3], 26 from [4], 22 from [5]) were
syntactically converted from their original formats (using [6, 7]) and given to
Averroes 2 to perform property checking with abstraction (wide bit-vectors -> terms,
wide operators -> UF) using the SMT solvers [8, 9].
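The abstraction described above replaces wide bit-vector signals with members of an uninterpreted sort and wide operators with uninterpreted functions, which is what makes the resulting query a QF_UF problem. A minimal sketch of such an abstracted query in SMT-LIB 2 (all sort, function, and constant names here are illustrative, not taken from the benchmark):

```smt2
; Hypothetical fragment: a 32-bit datapath abstracted to QF_UF.
(set-logic QF_UF)
(declare-sort W 0)              ; abstract sort standing in for (_ BitVec 32)
(declare-fun add (W W) W)       ; uninterpreted stand-in for bvadd
(declare-const x W)
(declare-const y W)
(declare-const z W)
; Only the equality/ite structure of the design survives abstraction:
(assert (= z (add x y)))
(assert (not (= z (add y x))))  ; sat in QF_UF: an uninterpreted add need not be commutative
(check-sat)
```

The second assertion illustrates the price of abstraction: properties that depend on arithmetic facts (here, commutativity of addition) can become satisfiable counterexamples at the UF level, which is why Averroes 2 pairs the abstraction with refinement.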

[1] Lee S., Sakallah K.A. (2014) Unbounded Scalable Verification Based on Approximate
Property-Directed Reachability and Datapath Abstraction. In: Biere A., Bloem R. (eds)
Computer Aided Verification. CAV 2014. Lecture Notes in Computer Science, vol 8559.
Springer, Cham
[2] http://fmv.jku.at/aiger/index.html#beem
[3] http://www.cs.cmu.edu/~modelcheck/vcegar
[4] http://www.cprover.org/hardware/v2c
[5] http://github.com/aman-goel/verilogbench
[6] http://www.clifford.at/yosys
[7] http://github.com/chengyinwu/V3
[8] http://github.com/Z3Prover/z3
[9] http://github.com/SRI-CSL/yices2

id: lann.2.prop1
query-maker: "Yices 2"
query-time: 2.776000 ms
query-class: abstract
query-category: oneshot
query-type: cti
status: sat
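Metadata such as the status above is conventionally recorded in the SMT-LIB header of the benchmark file itself. A typical preamble for a file like this one looks as follows (reconstructed from the metadata on this page, not copied from the file):

```smt2
(set-info :smt-lib-version 2.6)
(set-logic QF_UF)
(set-info :source |Generated by Averroes 2 (Aman Goel, Karem A. Sakallah)|)
(set-info :category "industrial")
(set-info :status sat)
```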
Size: 131155
Compressed Size: 14566
License: Creative Commons Attribution 4.0 International (CC-BY-4.0)
Category: industrial
First Occurrence: 2018-07-14
Generated By: Aman Goel (amangoel@umich.edu), Karem A. Sakallah (karem@umich.edu)
Generated On: 2018-04-06 00:00:00
Dolmen: OK
strict Dolmen: OK
check-sat calls: 1
Query 1
  Status: sat
  Inferred Status: sat
  Size: 131147
  Compressed Size: 14379
  Max. Term Depth: 3
  Asserts: 1130
  Declared Functions: 3
  Declared Constants: 1794
  Declared Sorts: 3
  Defined Functions: 0
  Defined Recursive Functions: 0
  Defined Sorts: 0
  Constants: 0
  Declared Datatypes: 0

Symbols

ite: 56, not: 524, and: 462, =: 1200, distinct: 4

Evaluations

Solver        Variant                                                  Result      Wallclock (s)  CPU Time (s)

SMT-COMP 2018 (rating 0.14, 6/7)
  CVC4        master-2018-06-10-b19c840-competition-default_default    sat ✅      0.15688        0.15706
  MathSAT     mathsat-5.5.2-linux-x86_64-Main_default                  sat ✅      0.03519        0.03620
  OpenSMT     opensmt2_default                                         unknown ❌  0.22339        0.22329
  SMTInterpol SMTInterpol-2.5-19-g0d39cdee_default                     sat ✅      0.56275        1.41385
  veriT       veriT_default                                            sat ✅      0.02900        0.02900
  Yices2      Yices 2.6.0_default                                      sat ✅      0.00969        0.00960
  Z3          z3-4.7.1_default                                         sat ✅      0.05872        0.05865

SMT-COMP 2021
  MathSAT     mathsat-5.6.6_default                                    sat ✅      0.03873        0.03871
  Par4        Par4-wrapped-sq_default                                  sat ✅      0.01875        0.00673
  SMTInterpol smtinterpol-2.5-823-g881e8631_default                    sat ✅      0.73490        2.01406
  veriT       veriT_default                                            sat ✅      0.03204        0.03229
  Yices2      Yices 2.6.2 bug fix_default                              sat ✅      0.01072        0.00879
  Yices2      Yices 2.6.2 for SMTCOMP2020_default                      sat ✅      0.01273        0.00873
  Z3          z3-4.8.11_default                                        sat ✅      0.04031        0.04022
  Z3          z3-4.8.8_default                                         sat ✅      0.05763        0.05753

SMT-COMP 2022
  cvc5        cvc5-default-2022-07-02-b15e116-wrapped_sq               sat ✅      0.25799        0.25856
  MathSAT     MathSAT-5.6.8_default                                    sat ✅      0.03668        0.03663
  veriT       veriT_default                                            sat ✅      0.03262        0.03283
  Yices2      Yices 2.6.2 for SMTCOMP 2021_default                     sat ✅      0.01298        0.00869
  Z3          z3-4.8.17_default                                        sat ✅      0.05072        0.05212
  Z3          z3-4.8.11_default                                        sat ✅      0.04012        0.04007

SMT-COMP 2023
  cvc5        cvc5-default-2023-05-16-ea045f305_sq                     sat ✅      0.09707        0.09744
  OpenSMT     OpenSMT a78dcf01_default                                 sat ✅      0.04898        0.04873
  SMTInterpol smtinterpol-2.5-1272-g2d6d356c_default                   sat ✅      0.79033        2.13589
  Yices2      Yices 2 for SMTCOMP 2023_default                         sat ✅      0.01046        0.00899
  Yices2      Yices 2.6.2 for SMTCOMP 2021_default                     sat ✅      0.01048        0.00872

SMT-COMP 2025
  cvc5        cvc5                                                     sat ✅      0.33648        0.20833
  OpenSMT     OpenSMT                                                  sat ✅      0.29675        0.16056
  SMTInterpol SMTInterpol                                              sat ✅      0.81562        1.80130
  Yices2      Yices2                                                   sat ✅      0.34147        0.20052