The DFS and BFS strategies for Or trees and graphs can be adapted for And-Or trees. The main difference lies in the way termination conditions are determined, since all goals following an And node must be realized, whereas a single goal node following an Or node will do.
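For illustration, the adapted termination test can be sketched as a depth-first search over an And-Or tree. This is a minimal sketch, not from the original text, and its representation is an assumption: a hypothetical `decompositions` map takes a goal to its alternative decompositions (the Or choices), each decomposition being a list of subgoals that must all be solved (the And requirement), and `primitives` is the set of directly solvable terminal goals.

```python
def and_or_dfs(goal, decompositions, primitives):
    """Return True if `goal` is solvable in the And-Or tree.

    decompositions: dict goal -> list of alternatives (Or branches);
                    each alternative is a list of subgoals (And branches).
    primitives:     set of goals that are directly solvable (terminal).
    """
    if goal in primitives:                    # terminal goal node: solved
        return True
    for alternative in decompositions.get(goal, []):   # Or: one alternative suffices
        if all(and_or_dfs(sub, decompositions, primitives)
               for sub in alternative):                # And: every subgoal required
            return True
    return False
```

Note how the termination logic differs from plain Or-tree search: success at an Or node needs only one solvable alternative (`for` loop with early `return True`), while an And branch succeeds only when `all` of its subgoals succeed.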

A more general optimal strategy is the AO* (O for ordered) algorithm.

As in the case of the A* algorithm, we use the open list to hold nodes that have been generated but not expanded and the closed list to hold nodes that have been expanded.

The algorithm is a variation of the original given by Nilsson. It requires that nodes traversed in the tree be labeled as solved or unsolved during the solution process, to account for And node solutions, which require solutions to all successor nodes.

A solution is found when the start node is labeled as solved.

The AO* algorithm:

**Step 1:** Place the start node s on open.

**Step 2:** Using the search tree constructed thus far, compute the most promising solution tree T0.
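The "most promising solution tree" of step 2 can be traced by following, from the start node down, the connector (hyperarc) whose estimated cost is currently lowest. A minimal sketch, under assumed names: `successors` maps a node to its hyperarcs, each a pair of an arc cost and the list of child nodes that arc leads to (more than one child for an And connector), and `cost` holds the current h* estimates.

```python
def best_solution_tree(node, successors, cost):
    """Trace the current most promising solution tree T0, following at
    each node the hyperarc of minimum estimated cost.

    Returns a dict mapping each node of T0 to its chosen children
    (an empty list for leaves of T0).
    """
    arcs = successors.get(node, [])
    if not arcs:                       # unexpanded or terminal node
        return {node: []}
    # Pick the hyperarc minimizing arc cost plus summed child estimates.
    _, children = min(arcs, key=lambda a: a[0] + sum(cost[c] for c in a[1]))
    tree = {node: children}
    for ch in children:
        tree.update(best_solution_tree(ch, successors, cost))
    return tree
```

The node n selected in step 3 is then any open node appearing in this traced tree.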

**Step 3:** Select a node n that is both on open and a part of T0. Remove n from open and place it on closed.

**Step 4:** If n is a terminal goal node, label n as solved. If the solution of n results in any of n's ancestors being solved, label all such ancestors as solved. If the start node s is solved, exit with success, where T0 is the solution tree. Remove from open all nodes with a solved ancestor.

**Step 5:** If n is not a solvable node, label n as unsolvable. If the start node is labeled as unsolvable, exit with failure. If any of n's ancestors become unsolvable because n is, label them unsolvable as well. Remove from open all nodes with unsolvable ancestors.

**Step 6:** Otherwise, expand node n, generating all of its successors. For each such successor node that contains more than one subproblem, generate their successors to give individual subproblems. Attach to each newly generated node a back pointer to its predecessor. Compute the cost estimate h* for each newly generated node and place all such nodes that do not yet have descendants on open. Next, recompute the values of h* at n and at each ancestor of n.

**Step 7:** Return to step 2.

It can be shown that AO* will always find a minimum-cost solution tree if one exists, provided only that h*(n) ≤ h(n) for all n and that all arc costs are positive. Like A*, the efficiency of AO* depends on how closely h* approximates h.
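The loop of steps 2 through 7 can be sketched compactly. The code below is an illustrative reading of the algorithm, not a production implementation: the names `ao_star` and `successors` and the hyperarc encoding are assumptions; it handles finite, acyclic And-Or graphs, marks terminal non-goal nodes unsolvable by assigning them infinite cost, and folds the open/closed bookkeeping into a recursive descent of the current best partial tree.

```python
import math

def ao_star(start, successors, h):
    """Minimum estimated solution cost from `start`, or None if unsolvable.

    successors: node -> list of hyperarcs (arc_cost, [child, ...]).
    h:          heuristic estimate for every node; a node with no
                successors is a terminal goal iff h[node] == 0.
    """
    cost = dict(h)      # revised cost estimates (step 6 recomputation)
    solved = set()      # nodes labeled solved (step 4)
    best = {}           # best hyperarc currently chosen at each node

    def revise(n):
        # Recompute cost[n]; label n solved when its best hyperarc leads
        # only to solved children, or unsolvable via infinite cost (step 5).
        arcs = successors.get(n, [])
        if not arcs:
            if cost[n] == 0:
                solved.add(n)           # terminal goal node: solved
            else:
                cost[n] = math.inf      # terminal non-goal: unsolvable
            return
        best_c, best_arc = math.inf, None
        for arc_cost, children in arcs:
            c = arc_cost + sum(cost[ch] for ch in children)
            if c < best_c:
                best_c, best_arc = c, (arc_cost, children)
        cost[n], best[n] = best_c, best_arc
        if best_arc is not None and all(ch in solved for ch in best_arc[1]):
            solved.add(n)

    def expand(n):
        # Descend the current most promising tree T0 (steps 2-3), expand
        # its frontier, and propagate revised costs back up (step 6).
        revise(n)
        if n in solved or n not in successors or cost[n] == math.inf:
            return
        for ch in best[n][1]:
            if ch not in solved:
                expand(ch)
        revise(n)

    while start not in solved:
        expand(start)
        if cost[start] == math.inf:
            return None                 # start unsolvable: exit with failure
    return cost[start]
```

For example, on `{"S": [(3, ["A"]), (10, ["B", "C"])]}` with `h = {"S": 0, "A": 1, "B": 0, "C": 0}`, the cheaper arc through A is tried first, A proves to be a terminal non-goal, and the And arc through B and C is adopted at cost 10, illustrating how cost revision redirects the search.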