RELATED APPLICATIONS
This is a continuation of application Ser. No. 09/677,262, filed Sep. 30, 2000, which is currently pending.[0001]
FIELD OF THE INVENTION
This invention relates generally to automated design verification, and in particular to more efficient use of binary decision diagrams to perform automated symbolic model checking for very large scale integrated circuit designs and other finite state systems.[0002]
BACKGROUND OF THE INVENTION
Modern design of very large-scale integrated circuits often involves years of research and the efforts of hundreds of engineers. Automated formal verification methods are an essential part of the design effort, reducing errors, lost time and risk to financial investment. Formal verification involves building a finite model of a system as a set of states and state transitions and checking that a desired property holds in the model. An exhaustive search of all possible states of the model may be performed in order to verify a desired property.[0003]
As the size and complexity of designs increase, much effort is expended to improve the efficiency of automated formal verification methods. One technique used in symbolic model checking to improve efficiency is to employ binary decision diagrams (BDDs). A BDD is a directed acyclic graph that represents a Boolean expression. For each Boolean variable, there are two outgoing edges representing true or false assignments to the variable. The use of BDDs permits computation times that are a polynomial function of the number of expression variables, whereas alternative representations such as clauses or truth tables require execution times that are an exponential function of the number of expression variables. For this reason, BDDs have been popular in the formal verification community since the late 1980s.[0004]
BDDs, however, are not without drawbacks. The ordering of variables is critical to an efficient use of BDDs. Poor variable ordering can increase a BDD's size and cause exponential execution times. One method for symbolic model checking using BDDs comes from Carnegie Mellon University and is known as Symbolic Model Verifier (SMV).[0005]
Alternatively, SMV uses a well-known heuristic procedure named simplify_assuming, which aims to reduce BDD representations by simplifying a predicate using an invariant assumption, but it introduces a proof obligation that must also be verified. Moreover, since the assumption is static, it may be ineffective in pruning a model.[0006]
Over the years, techniques have been developed to improve the performance and capacity of BDD-based algorithms. One technique is called Cone of Influence (COI) reduction. In COI reduction, an abstraction is built for a circuit model consisting of next state functions only for variables in the dependency closure of variables of interest in the circuit specification. One drawback is that not all variables in the dependency closure necessarily influence the variables of interest in the circuit specification. A second drawback is that the abstraction that is built and used for each model-checking step may include portions that are useful in only a few of the model checking steps. Therefore, needless extra computations are potentially performed, resulting in little benefit to the circuit verification.[0007]
Some methods have attempted to improve upon COI reduction by starting from a small portion of the dependency closure and extending the portion only when model checking fails to produce a satisfactory result. But these techniques also perform unnecessary computations on portions that are not relevant to the particular model-checking step being performed.[0008]
One method called the bounded cone of influence (BCOI) was proposed by A. Biere et al. for symbolic model checking without BDDs [A. Biere, E. Clarke, R. Raimi, and Y. Zhu; Verifying safety properties of a PowerPC™ microprocessor using symbolic model checking without BDDs; CAV'99; 1999]. However, even the BCOI method potentially includes irrelevant variables in the abstraction it builds, and the technique is not applicable to improve the widely used BDD-based approaches.[0009]
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings.[0010]
FIG. 1 illustrates an example of a circuit.[0011]
FIG. 2a graphically illustrates a transition relation for a circuit.[0012]
FIG. 2b shows another transition relation built as part of a lazy pre-image computation.[0013]
FIG. 3a illustrates one embodiment of a method for performing lazy pre-image computations.[0014]
FIG. 3b illustrates one embodiment of a more detailed method for performing lazy pre-image computations.[0015]
FIG. 4a illustrates one embodiment of a method for computing a fixpoint using lazy pre-image computations.[0016]
FIG. 4b shows an example of a lazy fixpoint computation for a circuit.[0017]
FIG. 5a illustrates a parsing of a property to be evaluated.[0018]
FIG. 5b illustrates one embodiment of a method for producing and propagating assumptions from sub-properties to be evaluated.[0019]
FIG. 5c shows an example of producing and propagating assumptions from sub-properties to be evaluated.[0020]
FIG. 6a graphically illustrates the state space of a model with its transition relation built using lazy pre-image computations to evaluate sub-property 530.[0021]
FIG. 6b graphically illustrates the state space of a dynamically pruned model with its transition relation built using lazy pre-image computations to evaluate sub-property 531 under assumption 541.[0022]
FIG. 6c graphically illustrates the state space of a dynamically pruned model with its transition relation built using lazy pre-image computations to evaluate sub-property 532 under assumption 542.[0023]
FIG. 6d graphically illustrates the state space of a dynamically pruned model with its transition relation built using lazy pre-image computations to evaluate sub-property 533 under assumption 543.[0024]
FIG. 7a illustrates one embodiment of a method for dynamically pruning a model by producing and propagating assumptions from sub-properties to be evaluated and pruning the transition relation under the assumptions generated.[0025]
FIG. 7b shows an example of dynamically pruning a transition relation for a model under assumptions generated from sub-properties to be evaluated.[0026]
FIG. 8 depicts a computing system for automated lazy symbolic model checking of finite state systems using symbolic variable reduction.[0027]
DETAILED DESCRIPTION
Embodiments of the present invention may be realized in accordance with the following teachings, and it should be evident that various modifications and changes may be made in the following teachings without departing from the broader spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than restrictive sense, and the invention is to be measured only in terms of the claims.[0028]
Methods for formal verification of circuits and other finite-state systems are disclosed herein providing improved efficiency and capacity of popular binary decision diagram (BDD) based algorithms. For one embodiment of a lazy pre-image computation, a method is disclosed that builds new transition relation partitions on-demand only for relevant next internal variables of a state predicate, and conjoins only next state relations for relevant next internal variables to a pre-image including the state predicate. For one embodiment of a lazy fixpoint computation, a method is disclosed that makes iterative use of lazy pre-image computation to compute conditions that must necessarily be satisfied to produce a given set of states. For one embodiment of forward assumption propagation, a method is disclosed that generates assumptions to characterize a set of interesting states for sub-properties of the property being evaluated at one or more evaluation stages. For one embodiment of a dynamic transition relation pruning technique, a method is disclosed that improves the efficiency for symbolic model checking computations by pruning transition relations under assumptions dynamically generated from the properties being evaluated, thereby providing means to handle very large scale integrated circuits and other finite state systems of problematic complexity for prior methods. The teachings of these disclosed methods provide for symbolic model checking of circuits and other finite state systems previously too large to be completed successfully using BDD based algorithms.[0029]
FIG. 1 illustrates an example of a circuit 101 having internal state variables c, d, e and f, and input variables a and b. According to the logical combination of inputs to memory element 102, the value of internal state variable c at its next transition can be determined to be the value of the Boolean expression a AND c. From the logical combination of inputs to memory element 103, the value of internal state variable d at its next transition can be determined to be the value of the input variable b. From the logical combination of inputs to memory element 104, the value of internal state variable e at its next transition can be determined to be the value of the Boolean expression c OR e. Finally, from the logical combination of inputs to memory element 105, the value of internal state variable f at its next transition can be determined to be the value of the Boolean expression c NAND d.[0030]
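By way of an illustrative sketch only, the next state functions of circuit 101 can be captured directly in a small Python fragment; the dictionary-of-functions encoding and the step helper below are assumptions made for this illustration rather than part of any disclosed embodiment.

# Illustrative sketch of the circuit 101 model of FIG. 1 (not a disclosed
# embodiment).  A state assigns True/False to the internal variables c, d, e
# and f; the inputs a and b are free at each transition.

NEXT_STATE = {
    "c": lambda s, i: i["a"] and s["c"],        # N(c) = a AND c
    "d": lambda s, i: i["b"],                   # N(d) = b
    "e": lambda s, i: s["c"] or s["e"],         # N(e) = c OR e
    "f": lambda s, i: not (s["c"] and s["d"]),  # N(f) = c NAND d
}

def step(state, inputs):
    """Return the next state of circuit 101 for the given input assignment."""
    return {v: fn(state, inputs) for v, fn in NEXT_STATE.items()}

# Example: from c=d=e=f=False with a=b=True, the next state has d and f True.
print(step({"c": False, "d": False, "e": False, "f": False},
           {"a": True, "b": True}))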
A model of a circuit or other finite state system can be formally defined as: a nonempty finite set of Boolean variables, V = VS ∪ VI, consisting of the union of internal state variables VS with input variables VI; and a next state function N(v) for each v in VS, which is an assignment mapping of internal state variables according to Boolean (true or false) valued expressions on V called state predicates.[0031]
For one embodiment, a model may be represented as a nonempty transition relation, R, on state predicate pairs, where (S1, S2) is an element of the transition relation, R, if there exists a transition in the finite state system from a state satisfying predicate S1 to a state satisfying predicate S2. R is also referred to as a model of the finite state system.[0032]
A partitioned transition relation, R, on a partitioning of the internal state variables {V1, V2, . . . , Vk} has the implicitly conjoined form:[0033]
R(V,V′) = R1(V,V1′) AND R2(V,V2′) AND . . . AND Rk(V,Vk′)
where the ith partition is Ri(V,Vi′) = AND for all v′ in Vi′ (v′=N(v)). The assertion v′=N(v) is called the next state relation for v, and v′ is a copy of v that records the value taken on by v at the next transition.[0034]
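For example, one possible partitioning of the internal variables of circuit 101 is {V1, V2}={{c, d}, {e, f}}, which gives the partitions R1(V,V1′) = (c′=a AND c) AND (d′=b) and R2(V,V2′) = (e′=e OR c) AND (f′=c NAND d). This particular split is chosen only for illustration; any partitioning of the internal state variables yields an implicitly conjoined transition relation of the same form.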
The set of states, S, may be represented using a Boolean state predicate S(V). Operations on sets may be carried out as algebraic manipulations of state predicates. The set of states that can move to S in one transition is called the pre-image of S and written:[0035]
Pre(S(V)) = ∃V′.[AND for all v′ in VS′ (v′=N(v)) AND S(V′)].
An existential operation ∃V′.[S(V′)] represents a quantification of state predicate S(V′) over the variables in V′. Typically, in order to more efficiently use computation resources, the operation of computing the pre-image of a set of states is carried out as a relation product of state predicates using early variable quantification for partitioned transition relations, thereby permitting staged reductions of Boolean expressions, as follows:
Pre(S(V)) = ∃V1′.[R1(V,V1′) AND (
    ∃V2′.[R2(V,V2′) AND (
        . . .
        ∃Vk′.[Rk(V,Vk′) AND (
            ∃VI′.S(V′) )]
        . . . )]
    )].[0036]
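To make the definition of Pre concrete on the small example circuit, the following Python sketch computes a pre-image by explicit enumeration of assignments; it is offered only as an illustration under the circuit encoding assumed above, since an actual implementation would use BDDs and the staged relational product shown here.

from itertools import product

# Illustrative brute-force computation of Pre(S(V)) for circuit 101 (FIG. 1).
# A BDD-based implementation would instead use the partitioned relational
# product with early variable quantification.

STATE_VARS = ["c", "d", "e", "f"]
INPUT_VARS = ["a", "b"]

NEXT_STATE = {
    "c": lambda v: v["a"] and v["c"],
    "d": lambda v: v["b"],
    "e": lambda v: v["c"] or v["e"],
    "f": lambda v: not (v["c"] and v["d"]),
}

def assignments(names):
    for bits in product([False, True], repeat=len(names)):
        yield dict(zip(names, bits))

def pre_image(s_pred):
    """All assignments to V = VS ∪ VI whose successor states satisfy s_pred."""
    result = []
    for v in assignments(STATE_VARS + INPUT_VARS):
        nxt = {name: fn(v) for name, fn in NEXT_STATE.items()}
        # S(V') may mention next input variables, which are existentially
        # quantified, so try every next input assignment.
        if any(s_pred({**nxt, **i}) for i in assignments(INPUT_VARS)):
            result.append(v)
    return result

# Example: the pre-image of S(e) contains exactly the assignments where
# e OR c holds, 48 of the 64 assignments over {a, b, c, d, e, f}.
print(len(pre_image(lambda v: v["e"])))  # 48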
An alternative definition for a model can be set forth as a pair of induced transformers, Pre and Post, such that Pre(S2) includes S1 and Post(S1) includes S2 if (S1,S2) is an element of R. In other words, the Pre transformer identifies any states satisfying predicate S, for which there exists a transition to some state satisfying predicate S′. Pre is called a pre-image transformer. The Post transformer identifies any states satisfying predicate S′, for which there exists a transition from some state satisfying predicate S. Post is called a post-image transformer.[0037]
One drawback of a typical pre-image computation is that it involves the entire partitioned transition relation. But S(V) may involve only a few variables. Consequently, not all next state relations are relevant in any particular invocation of a pre-image computation.[0038]
For example, FIG. 2a graphically illustrates a possible transition relation 201 for circuit 101 having VS={c, d, e, f} and VI={a, b}. The next state function for variable c is N(c)=a AND c. Therefore, in order for the circuit to reach a state, S(c), where c=1, it must have made transition 211 from a state S(a AND c) where a=1 and c=1. The next state function for variable d is N(d)=b. Therefore, in order for the circuit to reach a state, S(d), where d=1, it must have made transition 219 from a state S(b) where b=1. The next state function for variable e is N(e)=e OR c. Therefore, in order for the circuit to reach a state, S(e), where e=1, it must have made transition 212 from a state S(c) where c=1 or it must have made transition 213 from a state S(e) where e=1. The next state function for variable f is N(f)=d NAND c. Therefore, in order for the circuit to reach a state, S(f), where f=1, it must have made transition 215 from a state S(NOT c) where c=0 or it must have made transition 218 from a state S(NOT d) where d=0.[0039]
Computing all states reachable to S(e) in two or more transitions includes the next state function for variable c, which has already been shown as N(c)=a AND c, represented by transition 211. The next state function for variable NOT c is N(NOT c)=NOT(a AND c)=(NOT a) OR (NOT c). Therefore, in order for the circuit 101 to reach a state, S(NOT c), where c=0, it must have made transition 214 from a state S(NOT a) where a=0 or it must have made transition 216 from a state S(NOT c) where c=0. The next state function for variable NOT d is N(NOT d)=NOT b. Therefore, in order for the circuit to reach a state, S(NOT d), where d=0, it must have made transition 217 from a state S(NOT b) where b=0.[0040]
For a given state predicate, an invocation of a pre-image computation that uses transition relation 201 may result in computations that are not relevant to that state predicate. For one embodiment, a lazy pre-image computation is disclosed which provides a relevant transition relation abstraction for each pre-image computation according to the state predicate of the invocation. Such a lazy pre-image computation may be performed for a state predicate S(W), where W is contained in V and WS′ is the set of next internal variables in the set of next variables W′, as follows:[0041]
Pre(S(W)) = ∃W′.[AND for all v′ in WS′ (v′=N(v)) AND S(W′)].
The approach provided by the lazy pre-image computation disclosed above differs from previous COI reduction approaches in that it is not statically derived from a model specification and then used throughout. Instead, it dynamically provides an abstraction for each pre-image computation that is relevant to the particular state predicate associated with the invocation. Accordingly, lazy pre-image computation provides for greater efficiency and capacity improvements in popular BDD-based symbolic model checking methods than previously used pre-image computation methods.[0042]
For example, the lazy pre-image of a state predicate S(e) for circuit 101 where e=1 can be computed as Pre(S(e)) = ∃e′.[(e′=e OR c) AND e′] = e OR c.[0043]
FIG. 2b graphically illustrates a possible transition relation 202 for circuit 101 built as a result of an invocation of the lazy pre-image computation Pre(S(e)) on the state predicate S(e) where e=1. The next state function for variable e is N(e)=e OR c. Therefore, in order for the circuit to reach a state, S(e), where e=1, it must have made transition 222 from a state S(c) where c=1 or it must have made transition 223 from a state S(e) where e=1. Since no other transitions are relevant to reaching state S(e), the lazy pre-image method need not build them. As seen in the above example, this lazy pre-image method potentially reduces the number of transition relation partitions involved and also the sizes of the partitions. Therefore the computations required to explicitly build a BDD for a desired function may be significantly reduced.[0044]
For one embodiment, FIG. 3a illustrates performing a lazy pre-image computation. In processing block 311, transition relation partitions are updated as needed by adding new transition relations for only the relevant next internal variables. In processing block 312, a pre-image is initialized to the next state predicate of the invocation and existentially quantified over the relevant next input variables. In processing block 313, partitions with relevant next variables are identified. Finally, in processing block 314, next state relations for relevant variables from the partitions identified in processing block 313 are conjoined to the pre-image and quantified.[0045]
The lazy pre-image method disclosed above provides for greater efficiency and capacity for symbolic model checking operations, particularly on circuits with a large number of variables. In a BDD based implementation, building transition relation partitions only as needed and only for relevant next internal variables is especially beneficial since the next state function for an internal variable is efficiently and implicitly encoded, but a BDD for the function must be explicitly built for symbolic model checking. Explicitly building BDDs unnecessarily may become computationally expensive.[0046]
FIG. 3b details one embodiment of a method for performing a lazy pre-image computation on a state predicate S(W) involving a set W of internal variables and input variables. In processing block 320, WS′ is initialized to be the set of next internal variables in W′. In processing block 321, WI′ is initialized to be the set of next input variables in W′. In processing block 322, the next internal variables are checked to identify some variable w′ that has not been evaluated. If one is identified, w′=N(w) is conjoined to the partition including w′, and flow returns to processing block 322 to look for more next variables that have not been evaluated. Thus the transition relation partitions are built as needed for the relevant next internal variables. When no more are found, flow proceeds at processing block 324. In processing block 324, the pre-image is initialized to the state predicate existentially quantified for the relevant next input variables, and partition counter i is set to k+1. In processing block 325, i is decremented. Then in processing block 326, partition counter i is tested to see if it has reached zero. If partition counter i has not reached zero, in processing block 327 partition Vi′ is checked against W′ to identify relevant variables. If no relevant variables are found, partition Vi′ is skipped and flow proceeds at processing block 325. Otherwise, in processing block 328, all next variables in Vi′ that are not in W′ are existentially quantified out from partition Vi′, and the remaining relevant variables are evaluated according to their next state relations and assigned to Ra. Then in processing block 329, Ra is conjoined with the pre-image Pre and flow proceeds with the next i at processing block 325. When i=0, indicating no more partitions remain at processing block 326, the processing terminates and pre-image Pre is complete.[0047]
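The flow of FIG. 3a may be rendered, purely as an illustrative sketch, by the following Python fragment; the per-variable cache standing in for the transition relation partitions, and the explicit support argument for S(W), are simplifications assumed for this illustration.

from itertools import product

# Illustrative sketch of a lazy pre-image computation in the spirit of
# FIG. 3a.  Next state relations are built on demand and cached, and only
# the relations for the next internal variables occurring in S(W) are used.

INTERNAL = {"c", "d", "e", "f"}
INPUTS = {"a", "b"}

NEXT_STATE = {
    "c": lambda v: v["a"] and v["c"],
    "d": lambda v: v["b"],
    "e": lambda v: v["c"] or v["e"],
    "f": lambda v: not (v["c"] and v["d"]),
}

built_partitions = {}            # next state relations built so far (block 311)

def lazy_pre_image(pred, support):
    """Pre-image of a predicate whose support W is supplied explicitly."""
    relevant = support & INTERNAL            # the relevant next internal variables
    relevant_inputs = sorted(support & INPUTS)
    for w in relevant:                       # build partitions only as needed
        built_partitions.setdefault(w, NEXT_STATE[w])

    def pre(v):
        # Conjoin only the next state relations for the relevant variables and
        # existentially quantify the relevant next input variables
        # (blocks 312 through 314).
        nxt = {w: built_partitions[w](v) for w in relevant}
        return any(pred({**nxt, **dict(zip(relevant_inputs, bits))})
                   for bits in product([False, True], repeat=len(relevant_inputs)))

    return pre

# Example: Pre(S(e)) builds only the partition for e and evaluates to e OR c.
pre_e = lazy_pre_image(lambda v: v["e"], {"e"})
print(sorted(built_partitions))             # ['e']
print(pre_e({"c": True, "e": False}))       # True, since e OR c holds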
In one embodiment, the lazy pre-image computation disclosed above provides for potential improvements in key model checking techniques. For example one embodiment of a lazy pre-image method provides an efficient general operation that may also be used effectively in performing fixpoint computations.[0048]
FIG. 4a illustrates one embodiment of a fixpoint computation method which uses lazy pre-image computations. In processing block 411, a partial fixpoint state predicate, Fix0, and an initial frontier predicate, Front0, are both set to the input predicate S(W), and counter i is initialized to 1. In processing block 412, the new frontier predicate, Fronti, is set to the lazy pre-image of the previous frontier predicate, Fronti−1, intersected with the negated partial fixpoint predicate, NOT Fixi−1, in order to exclude any states whose pre-images have already been computed. This computation is expressed symbolically as Pre(Fronti−1) AND NOT Fixi−1. In processing block 413, a new fixpoint predicate Fixi is set to the union of the new frontier predicate, Fronti, and the previous partial fixpoint predicate, Fixi−1. Counter i is then incremented. In processing block 419, Fronti is tested to see if any states from the previous iteration that need to have pre-images computed remain in the frontier. If so, processing beginning at processing block 412 repeats until Fronti is emptied of such states, in which case processing terminates at processing block 419.[0049]
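Written as an illustrative sketch over explicit sets of states rather than BDDs, the loop of FIG. 4a may be expressed as follows; the toy_pre routine at the end, which encodes only the {c, e} fragment of circuit 101, is an assumption of this example.

# Illustrative sketch of the lazy fixpoint loop of FIG. 4a over explicit sets
# of states.  `pre_image` stands for any pre-image routine, for example the
# lazy pre-image sketched earlier.

def lazy_fixpoint(initial_states, pre_image):
    fix = set(initial_states)        # block 411: Fix0 = Front0 = S(W)
    front = set(initial_states)
    while front:                     # block 419: stop when the frontier is empty
        # block 412: Front_i = Pre(Front_{i-1}) AND NOT Fix_{i-1}
        front = pre_image(front) - fix
        # block 413: Fix_i = Front_i OR Fix_{i-1}
        fix |= front
    return fix

# Toy usage on the {c, e} fragment of circuit 101: a state is a (c, e) pair
# and predecessors are taken under c' = a AND c (for some input a) and
# e' = c OR e.
def toy_pre(states):
    preds = set()
    for c in (False, True):
        for e in (False, True):
            if any((a and c, c or e) in states for a in (False, True)):
                preds.add((c, e))
    return preds

# The fixpoint of the states where e holds is the set of states where
# e OR c holds, matching the example of FIG. 4b.
print(lazy_fixpoint({(False, True), (True, True)}, toy_pre))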
FIG. 4b illustrates an example for one embodiment of performing a lazy fixpoint computation for state predicate, S(e), where e=1, on circuit 101. The fixpoint Fix0 predicate 420 for the states reachable to S(e) in zero transitions and the frontier Front0 are initially set to e. Since no pre-image computation is required, no transition relation is built. To compute the fixpoint Fix1 predicate 421 for the states reachable to S(e) in one transition, a lazy pre-image of the frontier predicate Front0 is computed and combined with NOT Fix0. Since frontier predicate Front0 only involves signal e, lazy transition relation building only computes a transition relation partition for e, as [N(e)=e OR c]. Lazy pre-image Pre(S(e)) can be computed as previously shown, and the lazy pre-image computation returns e OR c based on the partially computed transition relation. The new frontier predicate Front1 is set to (e OR c) AND NOT e in accordance with processing block 412, which reduces to c AND NOT e. Fixpoint Fix1 predicate 421 for states reachable to S(e) in one transition is set to (c AND NOT e) OR e, which becomes e OR c.[0050]
To compute the fixpoint Fix2 predicate 422 for those states reachable to S(e) in two transitions, the lazy pre-image of the frontier predicate Front1 is computed and combined with NOT Fix1. The pre-image is calculated as follows:[0051]
Predicate (c AND NOT e) requires lazy transition relation building of the transition relation partition for c, as [N(c)=c AND a]. Lazy pre-image computation returns (c AND a) AND NOT (e OR c) based on the partially computed transition relation. The new frontier predicate Front2 is set to (c AND a) AND NOT (e OR c) in accordance with processing block 412, which reduces to (c AND a AND NOT e AND NOT c)=0. Fixpoint Fix2 predicate 422 for states reachable to S(e) in two transitions becomes just (e OR c).[0052]
Since frontier predicate Front2=0, the lazy fixpoint computation terminates. The transition relations for b, d and f are not needed and therefore they are not built.[0053]
It will be appreciated that a symbolic model checking operation may benefit significantly from a reduction in the number of transition relations used to represent a model. A further reduction may be accomplished if for each property to be evaluated, only the necessary portions of the model are represented. For one embodiment of a dynamic transition relation pruning technique, a method is disclosed that improves the efficiency for symbolic model checking computations by pruning transition relations under assumptions dynamically generated from the properties being evaluated, thereby providing means to handle very large scale integrated circuits and other finite state systems of problematic complexity for prior methods.[0054]
FIG. 5a illustrates a parsing of a property 510, a → (b → X(Xf)). At the first stage the property is parsed into a root 500 representing the logical implication operation, a left sub-property 506 representing the variable, a, and a right sub-property 511 representing b → X(Xf). The operator X indicates that its predicate argument holds at the next transition.[0055]
At the second stage the sub-property 511 is parsed into a root 501 representing the logical implication operation, a left sub-property 507 representing the variable, b, and a right sub-property 512 representing X(Xf). At the third stage the property is parsed into a root 502 representing the next state operator X, and a right sub-property 513 representing Xf. Finally, at the fourth stage the sub-property 513 is parsed into a root 503 representing the next state operator X, and a right sub-property 514 representing f. Given a parsing of the property, assumptions may be generated for the sub-properties to be evaluated.[0056]
FIG. 5b illustrates one embodiment of a method for producing and propagating assumptions from sub-properties to be evaluated. In processing block 521, the assumption for iteration zero, Assum0, is initialized to the value one (true) and Node is set to the root 500 of the property to be evaluated. The iteration counter i is then incremented and processing proceeds to processing block 522. In processing block 522, the Node is tested to see if it consists of a single variable, in which case processing terminates at processing block 522. If not, processing proceeds to processing block 523. In processing block 523, the type of the Node operation is identified. If it is an implication operation, processing proceeds at processing block 524. On the other hand, if it is a next state operator X, then processing proceeds to processing block 525.[0057]
In processing block 524, the assumption for iteration i, Assumi, is set to the assumption for iteration i−1, Assumi−1, combined with the left sub-property of Node using the logical AND operation. In processing block 525, the assumption for iteration i, Assumi, is set to the post-image of the assumption for iteration i−1, Assumi−1. Processing then proceeds to processing block 526 from processing block 524 or from processing block 525, where Node is set to the right sub-property of Node. The iteration counter i is then incremented and processing proceeds to processing block 522.[0058]
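As an illustrative sketch of the walk of FIG. 5b, the following Python fragment produces the assumption sequence for a parsed property; the tuple encoding of parse nodes and the toy Post transformer used in the example are assumptions made for this illustration.

# Illustrative sketch of the assumption propagation walk of FIG. 5b.  A parse
# node is either a variable name, ("implies", left, right), or ("X", sub);
# `post_image` stands for the Post transformer of the model.

def propagate_assumptions(node, post_image):
    assumptions = ["1"]                            # block 521: Assum0 = 1 (true)
    while not isinstance(node, str):               # block 522: stop at a single variable
        prev = assumptions[-1]
        if node[0] == "implies":                   # block 524: Assum_i = Assum_{i-1} AND left
            assumptions.append(("and", prev, node[1]))
            node = node[2]                         # block 526: descend to the right sub-property
        else:                                      # next state operator X
            assumptions.append(post_image(prev))   # block 525: Assum_i = Post(Assum_{i-1})
            node = node[1]
    return assumptions, node

# Example: the property a -> (b -> X(X f)) of FIG. 5a, with a toy Post that
# maps the assumption (1 AND a) AND b to d (since N(d) = b) and d to 1,
# as in FIG. 5c.
prop = ("implies", "a", ("implies", "b", ("X", ("X", "f"))))
toy_post = {("and", ("and", "1", "a"), "b"): "d", "d": "1"}.get
print(propagate_assumptions(prop, toy_post))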
FIG. 5c shows an example of producing and propagating assumptions from sub-properties to be evaluated. In iteration zero, assumption 540 is set to the value one and sub-property 530 is set to the state predicate for the property to be evaluated, a → (b → X(Xf)). In iteration one, assumption 541 is set to a=(1 AND a) in accordance with processing block 524 and sub-property 531 is set to the right sub-property of sub-property 530, b → X(Xf), in accordance with processing block 526. In iteration two, assumption 542 is set to a AND b and sub-property 532 is set to the right sub-property of sub-property 531, X(Xf). In iteration three, assumption 543 is set to Post(a AND b), which may be evaluated as d since N(d)=b, and sub-property 533 is set to the right sub-property of sub-property 532, Xf. In iteration four, assumption 544 is set to Post(Post(a AND b)), which may be evaluated to one (true), and sub-property 534 is set to the right sub-property of sub-property 533, f.[0059]
The number of variables in a transition relation may be reduced according to a dynamically generated assumption as the transition relation is built. For instance, the next state function for a variable f, N(f)=c NAND d, may be pruned according to an assumption including (d=1) to N(f)=NOT c.[0060]
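As an illustrative sketch only, pruning a next state function under an assumption may be viewed as specializing it on the variable assignments the assumption implies; the prune helper below is assumed for this illustration, and a BDD implementation would typically use a restrict or cofactor operation instead.

# Illustrative sketch of pruning a next state function under an assumption.

def prune(next_state_fn, assumption_bindings):
    """Specialize a next state function on the variable assignments implied by
    an assumption, e.g. {"d": True} for an assumption that includes d = 1."""
    def pruned(v):
        return next_state_fn({**v, **assumption_bindings})
    return pruned

# Example from the text: N(f) = c NAND d pruned under d = 1 behaves as NOT c.
n_f = lambda v: not (v["c"] and v["d"])
n_f_pruned = prune(n_f, {"d": True})
print(n_f_pruned({"c": True}), n_f_pruned({"c": False}))   # False True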
FIG. 6a graphically illustrates the state space of a model with a transition relation that was built according to one embodiment of an iterative lazy pre-image computation method to evaluate sub-property 530 or 534. It includes five variables that may be exhaustively searched for an assignment that satisfies sub-property 530 or 534. These variables are f at block 611, c at block 613, d at block 615, a at block 617 and b at block 618. The three internal variables for which next state functions are included in the transition relation are N(f) at block 612, N(c) at block 614, and N(d) at block 616. Since assumptions 540 and 544 are trivial, no pruning may be performed on the next state functions.[0061]
FIG. 6b graphically illustrates the state space of a dynamically pruned model with a transition relation that was built according to one embodiment of a lazy pre-image computation method to evaluate sub-property 531 under assumption 541. It includes four variables that may be exhaustively searched for an assignment that satisfies sub-property 531 or 532. These variables are f at block 621, c at block 623, d at block 625, and b at block 628. The three internal variables for which next state functions are included in the transition relation are N(f) at block 622, N(c) at block 624, and N(d) at block 626. Since assumption 541 includes only the input variable, a, pruning of the transition relation may be performed only on the next state function for c, producing N(c)=c instead of N(c)=c AND a.[0062]
FIG. 6c graphically illustrates the state space of a dynamically pruned model with a transition relation that was built according to one embodiment of a lazy pre-image computation method to evaluate sub-property 532 under assumption 542. It includes three variables that may be exhaustively searched for an assignment that satisfies sub-property 532. These variables are f at block 621, c at block 623, and d at block 625. The three internal variables for which next state functions are included in the transition relation are N(f) at block 622, N(c) at block 624, and N(d) at block 626. Since assumption 542 includes the input variables, a and b, pruning of the transition relation may be performed on the next state function for c, producing N(c)=c instead of N(c)=c AND a, and on the next state function for d, producing N(d)=1 instead of N(d)=b.[0063]
FIG. 6d graphically illustrates the state space of a dynamically pruned model with a transition relation that was built according to one embodiment of a lazy pre-image computation method to evaluate sub-property 533 under assumption 543. It includes only two variables that may be exhaustively searched for an assignment that satisfies sub-property 533. These variables are f at block 631 and c at block 633. Both of these internal variables' next state functions are included in the transition relation: N(f) at block 632 and N(c) at block 634. Since assumption 543 includes the input variable, a, and the internal variable, d, pruning of the transition relation may be performed on the next state functions for c, producing N(c)=c instead of N(c)=c AND a, and for f, producing N(f)=NOT c instead of N(f)=c NAND d.[0064]
It will be appreciated that a property may be considered a sub-property of itself. It will also be appreciated that an assumption produced from a sub-property to be evaluated may be used to prune a transition relation in a variety of ways. It can be observed in FIG. 6a through FIG. 6d that a significant reduction in state storage may be achieved through dynamic model pruning.[0065]
Alternatively the computational complexity of checking the model may be reduced according to the assumption. For instance some model checking methods exhaustively search a state space in order to verify a property or sub-property. If the transition relation is pruned by one or more assignments of variables according to the assumption produced from a sub-property of the property being evaluated, the computational complexity of the search may be significantly reduced.[0066]
It will also be appreciated that one may desire to reduce the overall size of the model representation, perhaps at the cost of some additional computational complexity. Alternatively, the number of variables in a model or the complexity of evaluating a sub-property may be reduced according to the assumption as the sub-property is being evaluated on the transition relation.[0067]
For example, a lazy pre-image of a state predicate S(Xf) AND Assum3 for circuit 101 can be computed: under the assumption Assum3=d, the next state relation f′=c NAND d prunes to f′=NOT c, and the lazy pre-image evaluates to NOT c.[0068]
In general each sub-property may need to be evaluated according to its corresponding assumption. This operation may be performed in such a way as to substantially reduce the size of transition relations by pruning next state functions as the transition relations are dynamically built by a lazy pre-image calculation.[0069]
FIG. 7a illustrates one embodiment of a method for dynamically pruning a model by producing and propagating assumptions as shown in FIG. 5b from sub-properties to be evaluated and then pruning the transition relation under the assumptions generated. Processing block 711 is entered when a singular Node is identified in processing block 522. In processing block 711, the iteration counter i is decremented and then the sub-property for iteration i, SPi, is computed from the singular Node variable under the assumption for iteration i, Assumi. Processing then proceeds to processing block 712 where the iteration counter is tested. Processing beginning in processing block 713 is repeated until the iteration counter is equal to zero in processing block 712, in which case processing terminates and returns the predicate SPi. In processing block 713, the Node is set to its parent Node and the iteration counter, i, is decremented. Processing then proceeds in processing block 714.[0070]
In processing block 714, the type of the Node operation is identified. If it is an implication operation, processing proceeds at processing block 715. On the other hand, if it is a next state operator X, then processing proceeds to processing block 716.[0071]
In processing block 715, the state predicate for iteration i, SPi, is set to the state predicate for iteration i+1, SPi+1, combined with the negated left sub-property of Node using the logical OR operation and evaluated under the assumption Assumi. In processing block 716, the state predicate for iteration i, SPi, is set to the lazy pre-image of the state predicate for iteration i+1, SPi+1, and evaluated under the assumption Assumi. Processing then proceeds to processing block 712 from processing block 715 or from processing block 716.[0072]
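The backward evaluation of FIG. 7a may be rendered, again purely as an illustrative sketch, by the following Python skeleton; the helper parameters (evaluate_under, lazy_pre_image_under, negate and disjoin), which a caller would supply, are assumptions of this illustration rather than any particular disclosed implementation.

# Illustrative skeleton of the backward evaluation of FIG. 7a.  `nodes` is the
# chain of parse nodes from the root to the singular leaf, `assumptions` is
# the list produced by the forward walk of FIG. 5b, and the remaining
# parameters are caller-supplied helpers: `evaluate_under` simplifies a
# predicate under an assumption, and `lazy_pre_image_under` takes a lazy
# pre-image with its next state relations pruned under an assumption.

def evaluate_under_assumptions(nodes, assumptions, evaluate_under,
                               lazy_pre_image_under, negate, disjoin):
    i = len(nodes) - 1
    sp = evaluate_under(nodes[i], assumptions[i])       # block 711: SP_i at the leaf
    while i > 0:                                        # block 712: stop at the root
        node = nodes[i - 1]                             # block 713: move to the parent
        i -= 1
        if node[0] == "implies":                        # block 715: SP_i = NOT left OR SP_{i+1}
            sp = evaluate_under(disjoin(negate(node[1]), sp), assumptions[i])
        else:                                           # block 716: SP_i = Pre(SP_{i+1})
            sp = lazy_pre_image_under(sp, assumptions[i])
    return sp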
FIG. 7b shows an example of dynamically pruning a transition relation as it is built in a lazy pre-image computation according to assumptions generated from sub-properties to be evaluated. In iteration 4, state predicate 724 is set to the singular Node variable, f, and evaluated under the assumption 1 (true), which leaves the predicate unchanged. In iteration 3, state predicate 723 is set to the lazy pre-image of the predicate f and evaluated under the assumption, d, which, in accordance with processing block 716, reduces the next state function N(f)=c NAND d to N(f)=NOT c. In iteration 2, state predicate 722 is set to the lazy pre-image of the predicate, NOT c, and evaluated under the assumption, a AND b, which, in accordance with processing block 716, reduces the next state function N(c)=a AND c to N(c)=c. In iteration 1, state predicate 721 is set to the logical OR combination of the predicate, NOT c, and the negated left sub-property, b, resulting in (NOT b OR NOT c), and then evaluated under the assumption, a, in accordance with processing block 715, which leaves the predicate unchanged. In iteration 0, state predicate 720 is set to the logical OR combination of the predicate, (NOT b OR NOT c), and the negated left sub-property, a, resulting in (NOT a OR NOT b OR NOT c), and then evaluated under the assumption, 1 (true), in accordance with processing block 715, which leaves the predicate unchanged.[0073]
In general, each sub-property may need to be evaluated according to its corresponding assumption. If the lazy pre-image computation Pre(SPi+1) under the assumption Assumi is equal to 1, then no succeeding evaluations are needed, as each will trivially be satisfied in processing block 715 or processing block 716.[0074]
It will be appreciated that the methods herein disclosed, or methods substantially similar to those herein disclosed, may be implemented in any of many programming languages to perform, on high-speed computing devices, automated computations including but not limited to lazy pre-image computations, dynamic production and propagation of assumptions, dynamic pruning of transition relations, lazy fixpoint computations using dynamic model pruning, and lazy model checking using dynamic model pruning.[0075]
For example, FIG. 8 illustrates a computer system to perform such computations for one such embodiment. Processing device 822 is connectable with various recordable storage media, transmission media and I/O devices to receive data structures and programmed methods. Representative data structures 801 may include circuit descriptions 811, transition relations 812, and finite state models 813. Representative programmed methods 802 may include assumption propagation programs 814, lazy pre-image programs 815, transition relation pruning programs 816, and model checking programs 817. Components of either or both of the data structures and programmed methods may be stored or transmitted on recordable media such as removable storage disks 825, which may be accessed through an access device 826 in processing device 822 or in a storage serving system 821. Storage serving system 821 or processing device 822 may also include other removable storage media or non-removable storage media suitable for storing or transmitting data structures 801 or programmed methods 802. Component data structures and programmed methods may also be stored or transmitted on transmission media such as network 824 for access by processing device 822 or entered by users through I/O device 823. It will be appreciated that systems such as the one illustrated are commonly available and widely used in the art of designing finite state hardware and software systems. It will also be appreciated that the complexity, capabilities, and physical forms of such design systems improve and change rapidly, and it is therefore understood that the design system illustrated is presented by way of example and not limitation.[0076]
The above description is intended to illustrate preferred embodiments of the present invention. From the discussion above it should also be apparent that the invention can be modified in arrangement and detail by those skilled in the art without departing from the principles of the present invention within the scope of the accompanying claims.[0077]