The Garsia–Wachs algorithm is an efficient method for computers to construct optimal binary search trees and alphabetic Huffman codes, in linearithmic time. It is named after Adriano Garsia and Michelle L. Wachs.
The input to the problem, for an integer n, consists of a sequence of n + 1 non-negative weights w0, w1, ..., wn. The output is a rooted binary tree with n internal nodes, each having exactly two children. Such a tree has exactly n + 1 leaf nodes, which can be identified (in the order given by the binary tree) with the n + 1 input weights. The goal of the problem is to find a tree, among all of the possible trees with n internal nodes, that minimizes the weighted sum of the external path lengths. These path lengths are the numbers of steps from the root to each leaf. They are multiplied by the weight of the leaf and then summed to give the quality of the overall tree.[1]
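As an illustration of the quantity being minimized, the following Python sketch computes the weighted sum of external path lengths for a tree represented as nested pairs; the nested-tuple representation and the function name are illustrative assumptions, not part of the problem statement.

```python
def weighted_path_length(tree, depth=0):
    """Sum of depth * weight over all leaves of a tree given as nested pairs."""
    if isinstance(tree, tuple):                 # internal node with two children
        left, right = tree
        return (weighted_path_length(left, depth + 1)
                + weighted_path_length(right, depth + 1))
    return depth * tree                         # leaf: its weight times its depth

# For the weights 1, 2, 3, 4, the tree ((1, 2), (3, 4)) has cost
# 2*1 + 2*2 + 2*3 + 2*4 = 20, while (((1, 2), 3), 4) has cost 3 + 6 + 6 + 4 = 19.
```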
This problem can be interpreted as a problem of constructing a binary search tree for n ordered keys, with the assumption that the tree will be used only to search for values that are not already in the tree. In this case, the n keys partition the space of search values into n + 1 intervals, and the weight of one of these intervals can be taken as the probability of searching for a value that lands in that interval. The weighted sum of external path lengths controls the expected time for searching the tree.[1]
Alternatively, the output of the problem can be used as a Huffman code, a method for encoding given values unambiguously by using variable-length sequences of binary values. In this interpretation, the code for a value is given by the sequence of left and right steps from a parent to the child on the path from the root to a leaf in the tree (e.g. with 0 for left and 1 for right). Unlike standard Huffman codes, the ones constructed in this way are alphabetical, meaning that the sorted order of these binary codes is the same as the input ordering of the values. If the weight of a value is its frequency in a message to be encoded, then the output of the Garsia–Wachs algorithm is the alphabetical Huffman code that compresses the message to the shortest possible length.[1]
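The correspondence between tree paths and codewords can be sketched as follows, again using the nested-tuple tree representation assumed above; the function name is illustrative.

```python
def codewords(tree, prefix=""):
    """List the binary codeword of each leaf, in left-to-right leaf order."""
    if isinstance(tree, tuple):
        left, right = tree
        return codewords(left, prefix + "0") + codewords(right, prefix + "1")
    return [prefix]                              # one codeword per leaf

# For the tree (((1, 2), 3), 4) the codes are ['000', '001', '01', '1'],
# which are in increasing lexicographic order, matching the leaf order.
```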
Overall, the algorithm consists of three phases:[1]

1. Build a binary tree whose leaves are the given weights, but possibly not in the given left-to-right order.
2. Compute the distance of each leaf from the root of this tree.
3. Build another binary tree whose leaves are the given weights in the correct order, with each leaf at the same distance from the root as in the first tree.
The first phase of the algorithm is easier to describe if the input is augmented with two sentinel values, ∞ (or any sufficiently large finite value), at the start and end of the sequence.[2]
The first phase maintains a forest of trees, initially a single-node tree for each non-sentinel input weight, which will eventually be combined into the binary tree that it constructs. Each tree is associated with a value, the sum of the weights of its leaves. The algorithm maintains a sequence of these values, with the two sentinel values at each end. The initial sequence is just the order in which the leaf weights were given as input. It then repeatedly performs the following steps, each of which reduces the length of the sequence, until there is only one tree containing all the leaves:[1]

1. Find the first (leftmost) three consecutive values x, y, and z in the sequence for which x ≤ z. Such a triple always exists, because the final sentinel value is at least as large as any other value in the sequence.
2. Remove the trees with values x and y from the sequence, and make a new tree in which they are the two children of a new parent node. The value of this new tree is x + y.
3. Reinsert the new value x + y into the sequence, immediately after (to the right of) the closest earlier value that is greater than or equal to x + y. Such a position always exists, because of the sentinel value at the start of the sequence.
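A direct but quadratic-time rendering of this phase, using the nested-tuple trees assumed above, might look like the following sketch (the faster balanced-tree implementation is discussed next); the function and variable names are illustrative.

```python
import math

def phase_one(weights):
    """Combine the input weights into a single tree, possibly out of order."""
    INF = math.inf
    # (value, tree) pairs for each weight, with infinite sentinels at both ends
    seq = [(INF, None)] + [(w, w) for w in weights] + [(INF, None)]
    while len(seq) > 3:                    # more than one non-sentinel tree left
        # find the leftmost triple of consecutive values x, y, z with x <= z
        i = 1
        while seq[i - 1][0] > seq[i + 1][0]:
            i += 1
        (x, tx), (y, ty) = seq[i - 1], seq[i]
        node = (x + y, (tx, ty))           # new parent of the trees for x and y
        del seq[i - 1:i + 1]               # remove x and y from the sequence
        # reinsert x + y just after the closest earlier value that is >= x + y
        j = i - 2
        while seq[j][0] < x + y:
            j -= 1
        seq.insert(j + 1, node)
    return seq[1][1]                       # the single remaining tree
```

For the weights 1, 2, 3, 4 this already yields (((1, 2), 3), 4); in general, however, the leaves of the phase-one tree can appear out of their input order, which is what the remaining two phases repair.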
To implement this phase efficiently, the algorithm can maintain its current sequence of values in any self-balancing binary search tree structure. Such a structure allows the removal of x and y, and the reinsertion of their new parent (with value x + y), in logarithmic time. In each step, the weights before y in the even positions of the array form a decreasing sequence, and the weights in the odd positions form another decreasing sequence. Therefore, the position to reinsert x + y may be found in logarithmic time by using the balanced tree to perform two binary searches, one for each of these two decreasing sequences. The search for the first position for which x ≤ z can be performed in linear total time by using a sequential search that begins at the position of the previous triple.[1]
It is nontrivial to prove that, in the third phase of the algorithm, another tree with the same distances exists and that this tree provides the optimal solution to the problem. But assuming this to be true, the second and third phases of the algorithm are straightforward to implement in linear time. Therefore, the total time for the algorithm, on an input of length n, is O(n log n).
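Continuing the sketch above, the second and third phases can be rendered as follows: read off each leaf's depth in the phase-one tree, then rebuild a tree whose leaves appear in the input order at the same depths, relying on the property just stated that such a tree exists. For simplicity this sketch assumes the weights are distinct (so a depth can be keyed by weight); a full implementation would instead tag each leaf with its position in the input.

```python
def leaf_depths(tree, depth=0, depths=None):
    """Phase 2: record the depth of each leaf of the phase-one tree."""
    if depths is None:
        depths = {}
    if isinstance(tree, tuple):
        leaf_depths(tree[0], depth + 1, depths)
        leaf_depths(tree[1], depth + 1, depths)
    else:
        depths[tree] = depth                   # keyed by weight (assumed distinct)
    return depths

def rebuild_in_order(weights, depths):
    """Phase 3: build a tree with the leaves in input order at the given depths."""
    stack = []                                 # (tree, depth) pairs, left to right
    for w in weights:
        stack.append((w, depths[w]))
        # two adjacent subtrees at equal depth become siblings one level up
        while len(stack) >= 2 and stack[-1][1] == stack[-2][1]:
            (right, d), (left, _) = stack.pop(), stack.pop()
            stack.append(((left, right), d - 1))
    return stack[0][0]                         # one tree of depth 0 remains

def garsia_wachs(weights):
    tree = phase_one(weights)                                # phase 1
    return rebuild_in_order(weights, leaf_depths(tree))      # phases 2 and 3

# Example: garsia_wachs([1, 2, 3, 4]) returns (((1, 2), 3), 4), whose weighted
# external path length is 3*1 + 3*2 + 2*3 + 1*4 = 19.
```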
The Garsia–Wachs algorithm is named after Adriano Garsia and Michelle L. Wachs, who published it in 1977.[1][3] Their algorithm simplified an earlier method of T. C. Hu and Alan Tucker,[1][4] and (although it is different in internal details) it ends up making the same comparisons in the same order as the Hu–Tucker algorithm.[5] The original proof of correctness of the Garsia–Wachs algorithm was complicated, and was later simplified by Kingston (1988)[1][2] and Karpinski, Larmore & Rytter (1997).[6]