Christopher David Manning (born September 18, 1965) is a computer scientist and applied linguist whose research in the areas of natural language processing, artificial intelligence and machine learning is considered highly influential. He is the current Director of the Stanford Artificial Intelligence Laboratory (SAIL).
Manning has been described as “the leading researcher in natural language processing”,[1] well known for co-developing GloVe word vectors; the bilinear or multiplicative form of attention, now widely used in artificial neural networks including the transformer; tree-structured recursive neural networks; and approaches to and systems for textual entailment. His main educational contributions are his textbooks Foundations of Statistical Natural Language Processing (1999) and Introduction to Information Retrieval (2008), and his course CS224N Natural Language Processing with Deep Learning, which is available online. Manning also pioneered the development of well-maintained open-source computational linguistics software packages, including CoreNLP, Stanza, and GloVe.[2][3][4][5]
Manning is the Thomas M. Siebel Professor in Machine Learning and a professor of Linguistics and Computer Science at Stanford University. He received a BA (Hons) degree majoring in mathematics, computer science, and linguistics from the Australian National University (1989) and a PhD in linguistics from Stanford (1994), under the guidance of Joan Bresnan.[6][7] He was an assistant professor at Carnegie Mellon University (1994–96) and a lecturer at the University of Sydney (1996–99) before returning to Stanford as an assistant professor. At Stanford, he was promoted to associate professor in 2006 and to full professor in 2012. He was elected an AAAI Fellow in 2010.[8] He was previously President of the Association for Computational Linguistics (2015), and he has received an honorary doctorate from the University of Amsterdam (2023). Manning was awarded the IEEE John von Neumann Medal “for advances in computational representation and analysis of natural language” in 2024.[9][1]
Manning's linguistic work includes his dissertation Ergativity: Argument Structure and Grammatical Relations (1996), a monograph Complex Predicates and Information Spreading in LFG (1999),[10] and his work developing Universal Dependencies,[11] from which he is the namesake of Manning's Law.
Manning's PhD students include Dan Klein, Sepandar Kamvar, Richard Socher, and Danqi Chen.[7] In 2021, he joined AIX Ventures[12] as an Investing Partner. AIX Ventures is a venture capital fund that invests in artificial intelligence startups.