Wagner–Fischer algorithm

From Wikipedia, the free encyclopedia

In computer science, the Wagner–Fischer algorithm is a dynamic programming algorithm that computes the edit distance between two strings of characters.

History


The Wagner–Fischer algorithm has a history of multiple invention. Navarro lists several independent inventors of it, with dates of publication, and acknowledges that the list is incomplete.[1]: 43

Calculating distance


The Wagner–Fischer algorithm computes edit distance based on the observation that if we reserve a matrix to hold the edit distances between all prefixes of the first string and all prefixes of the second, then we can compute the values in the matrix by flood filling it, and thus find the distance between the two full strings as the last value computed.

A straightforward implementation, as pseudocode for a function Distance that takes two strings, s of length m and t of length n, and returns the Levenshtein distance between them, looks as follows. The input strings are one-indexed, while the matrix d is zero-indexed, and [i..k] is a closed range.

function Distance(char s[1..m], char t[1..n]):
    // for all i and j, d[i,j] will hold the distance between
    // the first i characters of s and the first j characters of t
    // note that d has (m+1)*(n+1) values
    declare int d[0..m, 0..n]
    set each element in d to zero

    // source prefixes can be transformed into empty string by
    // dropping all characters
    for i from 1 to m:
        d[i, 0] := i

    // target prefixes can be reached from empty source prefix
    // by inserting every character
    for j from 1 to n:
        d[0, j] := j

    for j from 1 to n:
        for i from 1 to m:
            if s[i] = t[j]:
                substitutionCost := 0
            else:
                substitutionCost := 1

            d[i, j] := minimum(d[i-1, j] + 1,                   // deletion
                               d[i, j-1] + 1,                   // insertion
                               d[i-1, j-1] + substitutionCost)  // substitution

    return d[m, n]
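The pseudocode above translates almost directly into Python. A minimal sketch (not part of the original article; Python strings are 0-indexed, so the character comparisons shift by one while the matrix layout stays the same):

```python
def distance(s: str, t: str) -> int:
    """Levenshtein distance via the full (m+1) x (n+1) Wagner-Fischer matrix."""
    m, n = len(s), len(t)
    # d[i][j] holds the distance between the first i characters of s
    # and the first j characters of t
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i  # drop all i characters to reach the empty string
    for j in range(1, n + 1):
        d[0][j] = j  # insert all j characters starting from the empty string
    for j in range(1, n + 1):
        for i in range(1, m + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]
```

For the example pairs below, `distance("sitting", "kitten")` and `distance("Sunday", "Saturday")` both evaluate to 3, matching the bottom-right cells of the matrices.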

Two examples of the resulting matrix, for the pairs (sitting, kitten) and (Sunday, Saturday):

        k  i  t  t  e  n
     0  1  2  3  4  5  6
  s  1  1  2  3  4  5  6
  i  2  2  1  2  3  4  5
  t  3  3  2  1  2  3  4
  t  4  4  3  2  1  2  3
  i  5  5  4  3  2  2  3
  n  6  6  5  4  3  3  2
  g  7  7  6  5  4  4  3

        S  a  t  u  r  d  a  y
     0  1  2  3  4  5  6  7  8
  S  1  0  1  2  3  4  5  6  7
  u  2  1  1  2  2  3  4  5  6
  n  3  2  2  2  3  3  4  5  6
  d  4  3  3  3  3  4  3  4  5
  a  5  4  3  4  4  4  4  3  4
  y  6  5  4  4  5  5  5  4  3

The invariant maintained throughout the algorithm is that we can transform the initial segment s[1..i] into t[1..j] using a minimum of d[i,j] operations. At the end, the bottom-right element of the matrix contains the answer.

Proof of correctness


As mentioned earlier, the invariant is that we can transform the initial segment s[1..i] into t[1..j] using a minimum of d[i,j] operations. This invariant holds since:

  • It is initially true on row and column 0, because s[1..i] can be transformed into the empty string t[1..0] by simply dropping all i characters. Similarly, we can transform s[1..0] into t[1..j] by simply adding all j characters.
  • If s[i] = t[j], and we can transform s[1..i-1] into t[1..j-1] in k operations, then we can do the same to s[1..i] and just leave the last character alone, giving k operations.
  • Otherwise, the distance is the minimum of the three possible ways to do the transformation:
    • If we can transform s[1..i] into t[1..j-1] in k operations, then we can simply add t[j] afterwards to get t[1..j] in k+1 operations (insertion).
    • If we can transform s[1..i-1] into t[1..j] in k operations, then we can remove s[i] and then do the same transformation, for a total of k+1 operations (deletion).
    • If we can transform s[1..i-1] into t[1..j-1] in k operations, then we can do the same to s[1..i], and exchange the original s[i] for t[j] afterwards, for a total of k+1 operations (substitution).
  • The number of operations required to transform s[1..m] into t[1..n] is of course the number required to transform all of s into all of t, and so d[m,n] holds our result.

This proof fails to validate that the number placed in d[i,j] is in fact minimal; that is more difficult to show, and involves an argument by contradiction in which we assume d[i,j] is smaller than the minimum of the three, and use this to show that one of the three is not minimal.

Possible modifications


Possible modifications to this algorithm include:

  • We can adapt the algorithm to use less space, O(m) instead of O(mn), since it only requires that the previous column and current column be stored at any one time.
  • We can store the number of insertions, deletions, and substitutions separately, or even the positions at which they occur, which is always j.
  • We can normalize the distance to the interval [0, 1].
  • If we are only interested in the distance when it is smaller than a threshold k, then it suffices to compute a diagonal stripe of width 2k+1 in the matrix. In this way, the algorithm can be run in O(kl) time, where l is the length of the shorter string.[2]
  • We can give different penalty costs to insertion, deletion, and substitution. We can also give penalty costs that depend on which characters are inserted, deleted, or substituted.
  • This algorithm parallelizes poorly, due to a large number of data dependencies. However, all the cost values can be computed in parallel, and the algorithm can be adapted to perform the minimum function in phases to eliminate dependencies.
  • By examining diagonals instead of rows, and by using lazy evaluation, we can find the Levenshtein distance in O(m (1 + d)) time (where d is the Levenshtein distance), which is much faster than the regular dynamic programming algorithm if the distance is small.[3]
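The first modification above can be sketched as follows. This illustrative Python version (not from the original article) keeps only two rows rather than the two columns described, giving O(n) space; the per-cell recurrence is unchanged:

```python
def distance_two_rows(s: str, t: str) -> int:
    """Levenshtein distance storing only the previous and current rows,
    O(n) space instead of O(mn)."""
    m, n = len(s), len(t)
    prev = list(range(n + 1))  # row 0: distances from "" to prefixes of t
    for i in range(1, m + 1):
        curr = [i] + [0] * n   # curr[0] = i: drop the first i characters of s
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            curr[j] = min(prev[j] + 1,          # deletion
                          curr[j - 1] + 1,      # insertion
                          prev[j - 1] + cost)   # substitution
        prev = curr
    return prev[n]
```

Because each cell depends only on its left, upper, and upper-left neighbours, discarding older rows never loses information the recurrence still needs.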

Sellers' variant for string search


By initializing the first row of the matrix with zeros, we obtain a variant of the Wagner–Fischer algorithm that can be used for fuzzy string search of a string in a text.[1] This modification gives the end-position of matching substrings of the text. To determine the start-position of the matching substrings, the number of insertions and deletions can be stored separately and used to compute the start-position from the end-position.[4]
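A minimal sketch of this variant (the function name, the 1-indexed end positions, and the threshold parameter k are illustrative choices, not from the original paper). Setting the first row to zero means a match may begin at any position in the text; whenever the last cell of the current column is within k, that text position is reported as the end of an approximate match:

```python
def fuzzy_search(pattern: str, text: str, k: int):
    """Report (end_position, distance) for every position in text where
    some substring ending there matches pattern within distance k."""
    m = len(pattern)
    # col[i]: distance from pattern[:i] to the best substring ending here
    col = list(range(m + 1))
    matches = []
    for j, c in enumerate(text, start=1):
        prev = col
        col = [0] * (m + 1)  # col[0] = 0: a match may start at any position
        for i in range(1, m + 1):
            cost = 0 if pattern[i - 1] == c else 1
            col[i] = min(prev[i] + 1,          # deletion
                         col[i - 1] + 1,       # insertion
                         prev[i - 1] + cost)   # substitution
        if col[m] <= k:
            matches.append((j, col[m]))
    return matches
```

For example, `fuzzy_search("abc", "xabcx", 0)` reports the exact occurrence ending at position 4.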

The resulting algorithm is by no means efficient, but it was, at the time of its publication (1980), one of the first algorithms to perform approximate search.[1]

References

  1. ^ a b c Navarro, Gonzalo (2001). "A guided tour to approximate string matching" (PDF). ACM Computing Surveys. 33 (1): 31–88. CiteSeerX 10.1.1.452.6317. doi:10.1145/375360.375365. S2CID 207551224.
  2. ^ Gusfield, Dan (1997). Algorithms on Strings, Trees, and Sequences: Computer Science and Computational Biology. Cambridge, UK: Cambridge University Press. ISBN 978-0-521-58519-4.
  3. ^ Allison, L. (September 1992). "Lazy Dynamic-Programming can be Eager". Information Processing Letters. 43 (4): 207–212. doi:10.1016/0020-0190(92)90202-7.
  4. ^ Woltzenlogel Paleo, Bruno (2007). An approximate gazetteer for GATE based on Levenshtein distance. Archived 2013-05-08 at the Wayback Machine. Student Section of the European Summer School in Logic, Language and Information (ESSLLI).
Retrieved from "https://en.wikipedia.org/w/index.php?title=Wagner–Fischer_algorithm&oldid=1313513399"