Bernard Widrow | |
---|---|
Widrow demonstrating the "Knobby Adaline" device (1963) | |
Born | December 24, 1929 (age 95) |
Nationality | American |
Alma mater | Massachusetts Institute of Technology[1] |
Scientific career | |
Fields | Electrical engineering |
Institutions | Stanford University |
Doctoral advisor | William Linvill |
Doctoral students | Ted Hoff |
Bernard Widrow (born December 24, 1929) is a U.S. professor of electrical engineering at Stanford University.[1] He is the co-inventor, with his then doctoral student Ted Hoff, of the Widrow–Hoff least mean squares (LMS) adaptive filter algorithm.[2] The LMS algorithm led to the ADALINE and MADALINE artificial neural networks and to the backpropagation technique. He made other fundamental contributions to the development of signal processing in the fields of geophysics, adaptive antennas, and adaptive filtering. A summary of his work is available.[3]
He is the namesake of "Uncle Bernie's Rule": the training sample size should be 10 times the number of weights in a network.[4][5]
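As a back-of-the-envelope illustration of the rule (the network size below is purely hypothetical, not an example from Widrow's work):

```python
def uncle_bernies_rule(n_weights: int) -> int:
    """Heuristic training-set size: ten examples per trainable weight."""
    return 10 * n_weights

# A hypothetical single-layer network with 100 inputs plus a bias has
# 101 trainable weights, so the rule suggests about 1,010 training examples.
print(uncle_bernies_rule(101))  # 1010
```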
This section is based on several published sources.[6][7][8]
He was born in Norwich, Connecticut. While young, he was interested in electronics. During WWII, he found an entry on "Radios" in the World Book Encyclopedia and built a one-tube radio.
He entered MIT in 1947, studied electrical engineering and electronics, and graduated in 1951. After that, he obtained a research assistantship in the MIT Digital Computer Laboratory, in the magnetic core memory group. The DCL was a division of the Servomechanisms Laboratory,[9] which was building the Whirlwind I computer. The experience of building magnetic core memory shaped his understanding of computers into a "memory's eye view", that is, to "look for the memory and see what you have to connect around it".
For his master's thesis (1953, advised by William Linvill), he worked on raising the signal-to-noise ratio of the sensing signal of magnetic core memory. At the time, the hysteresis loops of magnetic cores were not square enough, which made the sensing signal noisy.
For his PhD (1956, advised by William Linvill), he worked on the statistical theory of quantization noise,[10] inspired by work by William Linvill and David Middleton.[11]
During his PhD, he learned about the Wiener filter from Lee Yuk-wing. To design a Wiener filter, one must know the statistics of the noiseless signal that one wants to recover; if those statistics are unknown, the filter cannot be designed. Widrow therefore designed an adaptive filter that uses gradient descent to minimize the mean square error, as sketched below. He also attended the Dartmouth workshop in 1956 and was inspired to work on AI.
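A minimal sketch of such an adaptive filter, in the spirit of the LMS algorithm, follows; the filter length, step size, and the toy system-identification signals are illustrative choices, not values from Widrow's work.

```python
import numpy as np

def lms_filter(x, d, n_taps=8, mu=0.01):
    """Adapt an FIR filter so its output y tracks the desired signal d.

    x : reference input signal (1-D array)
    d : desired signal (1-D array, same length as x)
    mu: step size of the gradient descent on the squared error
    """
    w = np.zeros(n_taps)                      # adaptive tap weights
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]     # tapped delay line: x[n], x[n-1], ...
        y[n] = w @ u                          # filter output
        e[n] = d[n] - y[n]                    # instantaneous error
        w += 2 * mu * e[n] * u                # stochastic-gradient (LMS) update
    return y, e

# Toy system-identification example: estimate an unknown 3-tap filter.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
d = np.convolve(x, [0.5, -0.3, 0.1], mode="full")[:len(x)]
y, e = lms_filter(x, d, n_taps=3, mu=0.01)    # the error e shrinks toward zero
```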
In 1959, he got his first graduate student, Ted Hoff. Together they improved the earlier adaptive filter so that it performs a gradient-descent step for each datapoint, resulting in the delta rule and the ADALINE. To avoid having to hand-tune the weights of ADALINE, they invented the memistor, whose conductance (an ADALINE weight) is controlled by the thickness of the copper plated onto graphite.
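Below is a minimal sketch of an ADALINE trained with the per-datapoint delta rule, assuming bipolar targets and an illustrative learning rate; it is not Widrow and Hoff's original implementation.

```python
import numpy as np

def train_adaline(X, t, lr=0.01, epochs=20, seed=0):
    """Train an ADALINE with the delta (Widrow-Hoff) rule, one datapoint at a time.

    X : (n_samples, n_features) input patterns
    t : (n_samples,) targets in {-1, +1}
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.01, size=X.shape[1])   # adaptive weights
    b = 0.0                                       # bias weight
    for _ in range(epochs):
        for x, target in zip(X, t):
            y = w @ x + b                         # linear output, before thresholding
            err = target - y                      # delta rule uses the *linear* error
            w += lr * err * x                     # gradient-descent step for this sample
            b += lr * err
    return w, b

def adaline_predict(X, w, b):
    """Threshold the linear output to get a bipolar class label."""
    return np.where(X @ w + b >= 0.0, 1, -1)
```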
During a meeting with Frank Rosenblatt, Widrow argued that the S-units in the perceptron machine should not be connected randomly to the A-units. Instead, the S-units should be removed, so that the photocell inputs would be fed directly into the A-units. Rosenblatt objected that "the human retina is built that way".
Despite many attempts, they never succeeded in developing a training algorithm for a multilayered neural network. The furthest they got was Madaline Rule I (1962), which had two layers of weights: the first was trainable, but the second was fixed. Widrow stated that their problem would have been solved by the backpropagation algorithm: "This was long before Paul Werbos. Backprop to me is almost miraculous."
Unable to train multilayered neural networks, Widrow turned to adaptive filtering and adaptive signal processing, using techniques based on the LMS filter for applications such as adaptive antennas,[12] adaptive noise cancelling,[13] and applications in medicine.[14]
At a 1985 conference in Snowbird, Utah, he noticed that neural network research was reviving, and he also learned of the backpropagation algorithm. After that, he returned to neural network research.
He served on the Board of Governors of the International Neural Network Society (INNS) in 2003.
Awards | |
---|---|
IEEE Alexander Graham Bell Medal | 1986 |