
Frank Rosenblatt

From Wikipedia, the free encyclopedia
American psychologist (1928–1971)

Frank Rosenblatt (photo: Rosenblatt in 1950)
Born: Frank Rosenblatt, July 11, 1928
Died: July 11, 1971 (aged 43)
Known for: Perceptron
Alma mater: Cornell University
Thesis: The k-Coefficient: Design and Trial Application of a New Technique for Multivariate Analysis (1956)
Influences: Walter Pitts, Warren Sturgis McCulloch, Donald O. Hebb, Friedrich Hayek, Karl Lashley
Doctoral students: George Nagy (1962)

Frank Rosenblatt (July 11, 1928 – July 11, 1971) was an American psychologist notable in the field of artificial intelligence. He is sometimes called the father of deep learning[1] for his pioneering work on artificial neural networks.

Life and career


Rosenblatt was born into a Jewish family in New Rochelle, New York, as the son of Dr. Frank and Katherine Rosenblatt.[2]

After graduating from The Bronx High School of Science in 1946, he attended Cornell University, where he obtained his A.B. in 1950 and his Ph.D. in 1956.[3]

For his PhD thesis he built a custom-made computer, the Electronic Profile Analyzing Computer (EPAC), to perform multidimensional analysis for psychometrics. He used it between 1951 and 1953 to analyze the thesis data, which were collected from a paid 600-item survey of more than 200 Cornell undergraduates. The total computational cost was 2.5 million arithmetic operations, necessitating the use of an IBM CPC as well.[4] Fifteen minutes' worth of data processing reportedly took just two seconds.[5]: 32

He subsequently moved to Cornell Aeronautical Laboratory in Buffalo, New York, where he was successively a research psychologist, senior psychologist, and head of the cognitive systems section. It was there that he also conducted the early work on perceptrons, which culminated in the development and hardware construction in 1960 of the Mark I Perceptron,[2] essentially the first computer that could learn new skills by trial and error, using a type of neural network that simulates human thought processes.

Rosenblatt's research interests were exceptionally broad. In 1959 he went to Cornell's Ithaca campus as director of the Cognitive Systems Research Program and lecturer in the Psychology Department. In 1966 he joined the Section of Neurobiology and Behavior within the newly formed Division of Biological Sciences, as associate professor.[2] Also in 1966, he became fascinated with the transfer of learned behavior from trained to naive rats by the injection of brain extracts, a subject on which he would publish extensively in later years.[3]

In 1970 he became field representative for the Graduate Field of Neurobiology and Behavior, and in 1971 he shared the acting chairmanship of the Section of Neurobiology and Behavior. Frank Rosenblatt died in July 1971 on his 43rd birthday, in a boating accident in Chesapeake Bay.[3] He was eulogized on the floor of the House of Representatives, among others by former Senator Eugene McCarthy.[4]

Academic interests


Perceptron


Rosenblatt is best known for the Perceptron, an electronic device which was constructed in accordance with biological principles and showed an ability to learn. Rosenblatt's perceptrons were initially simulated on an IBM 704 computer at Cornell Aeronautical Laboratory in 1957.[6] When a triangle was held before the perceptron's eye, it would pick up the image and convey it along a random succession of lines to the response units, where the image was registered.[7]

He developed and extended this approach in numerous papers and a book called Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms, published by Spartan Books in 1962.[8] He received international recognition for the Perceptron. The New York Times billed it as a revolution, with the headline "New Navy Device Learns By Doing",[9] and The New Yorker similarly admired the technological advancement.[7]

An elementary Rosenblatt perceptron. The A-units are linear threshold elements with fixed input weights. The R-unit is also a linear threshold element, but one that learns according to Rosenblatt's learning rule. Redrawn in [10] from Rosenblatt's original book.[11]

Rosenblatt proved four main theorems. The first states that an elementary perceptron can solve any classification problem if there are no discrepancies in the training set (and sufficiently many independent A-elements). The fourth states that the learning algorithm converges whenever the given realisation of the elementary perceptron can solve the problem. His work was done in collaboration with colleagues, especially H. D. Block.[12]
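The architecture and learning rule described above can be illustrated with a minimal sketch (a modern illustrative reconstruction, not Rosenblatt's original code or notation): a fixed, untrained A-unit layer feeds a single trainable R-unit, which is trained with the error-correction rule until it converges, as the fourth theorem guarantees for a solvable problem. The toy "retina" task and the choice of 16 A-units are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# A-units: linear threshold elements with fixed (untrained) input weights.
def a_units(x, w_fixed):
    return (x @ w_fixed > 0).astype(float)

# R-unit: a linear threshold element trained with the error-correction rule:
# on each mistake, move the weights toward (or away from) the A-unit activity.
def train_r_unit(A, y, epochs=200):
    w, b = np.zeros(A.shape[1]), 0.0
    for _ in range(epochs):
        mistakes = 0
        for a, t in zip(A, y):
            pred = 1.0 if a @ w + b > 0 else 0.0
            if pred != t:
                w += (t - pred) * a
                b += (t - pred)
                mistakes += 1
        if mistakes == 0:        # training set classified perfectly: converged
            break
    return w, b

# Toy 4-pixel "retina"; task: is the left half brighter than the right half?
X = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 1, 0, 0],
              [0, 0, 1, 1], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
y = np.array([1, 1, 1, 0, 0, 0], dtype=float)

# 16 A-units: 4 "identity" units plus 12 with random +/-1 retinal connections.
w_fixed = np.hstack([np.eye(4), rng.choice([-1.0, 1.0], size=(4, 12))])
A = a_units(X, w_fixed)
w, b = train_r_unit(A, y)
preds = (A @ w + b > 0).astype(float)
print(np.array_equal(preds, y))   # True: the separable training set is learned exactly
```

Because the identity A-units pass the raw pixels through, the task stays linearly separable in A-space, so the perceptron convergence theorem applies and training reaches zero errors in a bounded number of updates.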

Rosenblatt also studied the problem of generalization: Given a model that has learned to recognize a pattern, the model should still recognize the pattern under translation or rotation, or some other transformation. He studied both the case where the generalization is hardwired, and the case where it is learned.[13][12]

Research on comparable devices was also being conducted in other places such as SRI, and many researchers had high expectations of what they could do. The initial excitement became somewhat reduced, however, when in 1969 Marvin Minsky and Seymour Papert published the book Perceptrons. Minsky and Papert considered elementary perceptrons with restrictions on the neural inputs: a bounded number of connections, or a relatively small diameter of the A-units' receptive fields. They proved that under these constraints, an elementary perceptron cannot solve some problems, such as the connectivity of input images or the parity of pixels in them. Thus, Rosenblatt proved the universality of unrestricted elementary perceptrons, whereas Minsky and Papert demonstrated that the abilities of restricted perceptrons are limited. These results are not contradictory, but the Minsky and Papert book was widely (and wrongly) cited as a proof of strong limitations of perceptrons. (For a detailed elementary discussion of Rosenblatt's first theorem and its relation to the work of Minsky and Papert, see a recent note.[10])
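The gap between the two sets of results is visible already in the smallest parity problem. The sketch below (illustrative modern Python, not anything from the period) trains a single linear threshold unit with the error-correction rule: on raw two-pixel parity (XOR) the errors never reach zero, because the problem is not linearly separable; adding one A-unit that sees both pixels (their conjunction) restores separability, and the same rule then converges.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)        # parity of the two pixels (XOR)

def final_errors(X, y, epochs=500):
    """Train a threshold unit with the error-correction rule; return the
    number of training errors in the last epoch (0 means it converged)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        errors = 0
        for x, t in zip(X, y):
            pred = 1.0 if x @ w + b > 0 else 0.0
            if pred != t:
                w += (t - pred) * x
                b += (t - pred)
                errors += 1
        if errors == 0:
            break
    return errors

# Raw pixels only: XOR is not linearly separable, so errors never reach zero.
print(final_errors(X, y) > 0)                  # True

# One extra A-unit covering both pixels (x1 AND x2) restores separability.
X_aug = np.hstack([X, X[:, :1] * X[:, 1:]])
print(final_errors(X_aug, y) == 0)             # True
```

This mirrors the larger point: the limitation Minsky and Papert identified is a property of the restricted front end, not of the learning rule, and disappears once an A-unit of sufficient "diameter" is available.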

After research on neural networks returned to the mainstream in the 1980s, new researchers started to study Rosenblatt's work again. This new wave of neural network research is interpreted by some researchers as contradicting the hypotheses presented in the book Perceptrons and confirming Rosenblatt's expectations.

The Mark I Perceptron, which is generally recognized as a forerunner to artificial intelligence, currently resides in the Smithsonian Institution in Washington, D.C.[3] The Mark I was able to learn, recognize letters, and solve quite complex problems.

Isometric view of Tobermory, Phase I.[14]

Tobermory, a scaled-up perceptron machine, was built between 1961 and 1967, with a focus on speech recognition.[15] It occupied an entire room.[16] It was a four-layer neural network with 12,000 weights implemented by toroidal magnetic cores. By the time of its completion, simulation on digital computers had become faster than purpose-built perceptron machines.[17]

George Nagy received a PhD in 1962 under Rosenblatt, primarily for work on Tobermory.[18]

Principles of Neurodynamics (1962)

The neuron model employed is a direct descendant of that originally proposed by McCulloch and Pitts. The basic philosophical approach has been heavily influenced by the theories of Hebb and Hayek and the experimental findings of Lashley. The probabilistic approach is shared with theorists such as Ashby, Uttley, Minsky, MacKay, and von Neumann.

— Frank Rosenblatt, Principles Of Neurodynamics, page 5

Rosenblatt's book Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms, published by Spartan Books in 1962, summarized his work on perceptrons at the time.[11] The book was previously issued as unclassified report No. 1196-G-8 on March 15, 1961, through the Defense Technical Information Center.[19]

The book is divided into four parts. The first gives a historical review of alternative approaches to brain modeling, the physiological and psychological considerations, and the basic definitions and concepts of the perceptron approach. The second covers three-layer series-coupled perceptrons: the mathematical underpinnings, performance results in psychological experiments, and a variety of perceptron variations. The third covers multi-layer and cross-coupled perceptrons, and the fourth covers back-coupled perceptrons and problems for future study.

The cross-coupled perceptron machines are currently known as Hopfield networks. Rosenblatt proved some conditions under which such a network settles into an equilibrium.[12]
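A minimal sketch of that settling behavior (using today's Hopfield formulation and Hebbian learning rather than Rosenblatt's cross-coupled notation; the 8-unit pattern is an assumption for the example): store one ±1 pattern with outer-product weights, then let asynchronous threshold updates drive a corrupted state back to equilibrium.

```python
import numpy as np

# Store one +/-1 pattern with a Hebbian (outer-product) weight matrix.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)                      # no self-connections

def settle(state, W, max_sweeps=20):
    """Asynchronous threshold updates until no unit changes (equilibrium)."""
    state = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(state)):           # update one unit at a time
            new = 1 if W[i] @ state >= 0 else -1
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:                       # equilibrium reached
            break
    return state

# Start from a corrupted copy (two flipped units); the network settles back.
noisy = pattern.copy()
noisy[0] *= -1
noisy[3] *= -1
print(np.array_equal(settle(noisy, W), pattern))   # True
```

With a single stored pattern, each asynchronous update can only decrease the network's energy, so the corrupted state is pulled back to the stored pattern as its equilibrium.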

Rosenblatt used the book to teach an interdisciplinary course entitled "Theory of Brain Mechanisms" that drew students from Cornell's Engineering and Liberal Arts colleges.[1]

Rat brain experiments


Beginning in 1963, Rosenblatt's interest turned to the theoretical explanation of memories, such as the question of how a human memory works over a lifetime. He designed and mathematically analyzed some neural network models of memory, but no convincing simulation experiment was conducted.[20][12]

Around the late 1960s, inspired by James V. McConnell's experiments with memory transfer in planarians, Rosenblatt began experiments within the Cornell Department of Entomology on the transfer of learned behavior via rat brain extracts. Rats were taught discrimination tasks such as the Y-maze and the two-lever Skinner box. Their brains were then extracted, and the extracts and their antibodies were injected into untrained rats that were subsequently tested in the discrimination tasks to determine whether or not there was behavior transfer from the trained to the untrained rats.[21] Rosenblatt spent his last several years on this problem and showed convincingly that the initial reports of larger effects were wrong and that any memory transfer was at most very small.[3][22] He also supervised some PhD students who investigated the role of DNA in memory.[12]

Other interests


Astronomy


Rosenblatt also had a serious research interest in astronomy and proposed a new technique to detect the presence of stellar satellites.[23] He built an observatory on a hilltop behind his house in Brooktondale, about 6 miles east of Ithaca. The possibility of extraterrestrial intelligence fascinated him; he joined the search for extraterrestrial intelligence from his observatory.[3] He also studied photometry and developed a technique for "detecting low-level laser signals against a relatively intense background of non-coherent light".[21]

Politics


Rosenblatt was very active in liberal politics. He worked in the Eugene McCarthy presidential primary campaigns in New Hampshire and California in 1968 and in a series of Vietnam protest activities in Washington.[24]

IEEE Frank Rosenblatt Award


The Institute of Electrical and Electronics Engineers (IEEE), the world's largest professional association dedicated to advancing technological innovation, annually presents the IEEE Frank Rosenblatt Award in his honor.


References

  1. ^ab Tappert, Charles C. (2019). "Who is the Father of Deep Learning?". 2019 International Conference on Computational Science and Computational Intelligence (CSCI). IEEE. pp. 343–348. doi:10.1109/CSCI49370.2019.00067. ISBN 978-1-7281-5584-5. S2CID 216043128.
  2. ^abc Carey, Hugh L. (1971). "Tribute to Dr. Frank Rosenblatt" (PDF). Congressional Record: Proceedings and Debates of the 92d Congress, First Session. US Government Printing Office. pp. 1–7. Archived from the original (PDF) on 26 February 2014. Retrieved 24 Dec 2021.
  3. ^abcdef Emlen, Stephen T.; Howland, Howard C.; O'Brien, Richard D. "Frank Rosenblatt, July 11, 1928 – July 11, 1971" (PDF). Cornell University. Retrieved 24 Dec 2021.
  4. ^ab Penn, Jonathan (2021-01-11). Inventing Intelligence: On the History of Complex Information Processing and Artificial Intelligence in the United States in the Mid-Twentieth Century (Thesis). doi:10.17863/cam.63087.
  5. ^ "Editor Miscellany", American Scientist 42, no. 1 (January 1954): 32.
  6. ^ "Hyping Artificial Intelligence, Yet Again". newyorker.com. 31 December 2013.
  7. ^ab Mason, Harding; Stewart, D.; Gill, Brendan (28 November 1958). "Rival". The New Yorker.
  8. ^ Preprint issued as a military report on 1961-03-15 as Report #1196-0-8.
  9. ^ "New Navy Device Learns By Doing". The New York Times. 8 July 1958.
  10. ^ab Kirdin A, Sidorov S, Zolotykh N (2022). "Rosenblatt's First Theorem and Frugality of Deep Learning". Entropy. 24 (11): 1635. arXiv:2208.13778. Bibcode:2022Entrp..24.1635K. doi:10.3390/e24111635. PMC 9689667. PMID 36359726.
  11. ^ab Rosenblatt, F. Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Washington, DC: Spartan Books, 1962.
  12. ^abcde Nagy, George (March 1991). "Neural networks—then and now" (PDF). IEEE Transactions on Neural Networks. 2 (2): 316–318. doi:10.1109/72.80343. ISSN 1941-0093.
  13. ^ Rosenblatt, Frank (1960). "Perceptual generalization over transformation groups". In Yovits; Cameron (eds.). Self-Organizing Systems. New York: Pergamon Press. pp. 63–96.
  14. ^ Nagy, George (1963). System and circuit designs for the Tobermory perceptron. Technical report number 5, Cognitive Systems Research Program, Cornell University, Ithaca, New York.
  15. ^ Rosenblatt, Frank (1962). "A Description of the Tobermory Perceptron." Cognitive Research Program, Report No. 4. Collected Technical Papers, Vol. 2. Edited by Frank Rosenblatt. Ithaca, NY: Cornell University.
  16. ^ Nagy, George (1963). System and circuit designs for the Tobermory perceptron. Technical report number 5, Cognitive Systems Research Program, Cornell University, Ithaca, New York.
  17. ^ Nagy, George. "Neural networks—then and now." IEEE Transactions on Neural Networks 2.2 (1991): 316–318.
  18. ^ "George Nagy's Homepage". sites.ecse.rpi.edu. Retrieved 2025-08-09.
  19. ^ Defense Technical Information Center (1961-03-15). DTIC AD0256582: Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms.
  20. ^ Rosenblatt, Frank (1964). "A model for experiential storage in neural networks". In Tou, J.T.; Wilcox, R.H. (eds.). Computer and Information Sciences. Washington, D.C.: Spartan Books. pp. 16–66.
  21. ^ab Rosenblatt, Frank. Cognitive Systems Research Program. Technical report, Cornell University, 72, 1971.
  22. ^ Rosenblatt, F. (1967). "Recent work on theoretical models of biological memory". In Tou, J.T. (ed.). Computer and Information Sciences-II. Second Symposium on Computer and Information Sciences. New York: Academic Press.
  23. ^ "Frank Rosenblatt – July 11, 1928 – July 11, 1971" (PDF). dspace.library.cornell.edu.
  24. ^ "Frank Rosenblatt – July 11, 1928 – July 11, 1971" (PDF). dspace.library.cornell.edu.
  • Mason, Harding; Stewart, D.; Gill, Brendan (November 29, 1958). "Rival". The New Yorker. An interview with Frank Rosenblatt and Marshall C. Yovits of the Office of Naval Research.