The history of computing extends beyond the history of computing hardware and modern computing technology to include earlier methods that relied on pen and paper or chalk and slate, with or without the aid of tables.
Digital computing is intimately tied to the representation of numbers.[1] But long before abstractions like the number arose, there were mathematical concepts to serve the purposes of civilization. These concepts are implicit in concrete practices such as:
Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song mnemonics to teach sequences to others. All known human languages, except the Pirahã language, have words for at least the numerals "one" and "two", and even some animals like the blackbird can distinguish a surprising number of items.[5]
Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. In time, these operations were formalized, and concepts about them became understood well enough to be stated formally, and even proven. See, for example, Euclid's algorithm for finding the greatest common divisor of two numbers.
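For illustration, a minimal sketch of Euclid's algorithm in modern Python; the function name and the test values are illustrative, not taken from any historical source:

```python
# A minimal sketch of Euclid's algorithm for the greatest common divisor.
def gcd(a: int, b: int) -> int:
    """Return the greatest common divisor of two non-negative integers."""
    while b != 0:
        a, b = b, a % b   # replace (a, b) with (b, a mod b) until the remainder is zero
    return a

print(gcd(48, 18))  # prints 6
```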
By the High Middle Ages, the positional Hindu–Arabic numeral system had reached Europe, which allowed for the systematic computation of numbers. During this period, the representation of a calculation on paper allowed the calculation of mathematical expressions, and the tabulation of mathematical functions such as the square root and the common logarithm (for use in multiplication and division), and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in our present time, researchers like Enrico Fermi would cover random scraps of paper with calculations, to satisfy their curiosity about an equation.[6] Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute any steps that overflowed the memory of the calculators, by hand, just to learn the answer; by 1976 Feynman had purchased an HP-25 calculator with a 49 program-step capacity; if a differential equation required more than 49 steps to solve, he could just continue his computation by hand.[7]
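As an illustration of how such tables were used, a common logarithm reduces multiplication to the addition of tabulated values; the specific numbers below are only an example, with table values rounded to four places:

```latex
% Multiplication via a table of common logarithms (table values rounded to four places):
\log_{10}(ab) = \log_{10} a + \log_{10} b
\qquad\text{e.g.}\qquad
\log_{10} 37 \approx 1.5682,\quad
\log_{10} 52 \approx 1.7160,\quad
1.5682 + 1.7160 = 3.2842,\quad
10^{3.2842} \approx 1924 = 37 \times 52.
```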
Mathematical statements need not be abstract only; when a statement can be illustrated with actual numbers, the numbers can be communicated and a community can arise. This allows the repeatable, verifiable statements which are the hallmark of mathematics and science. These kinds of statements have existed for thousands of years, and across multiple civilizations, as shown below:
The earliest known tool used for computation is the Sumerian abacus, believed to have been invented in Babylon c. 2700–2300 BC. Its original style of usage was by lines drawn in sand with pebbles.[citation needed]
In c. 1050–771 BC, the south-pointing chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus, known as the Chinese abacus, from around the 2nd century BC.[citation needed]
In the 3rd century BC, Archimedes used the mechanical principle of balance (see Archimedes Palimpsest § The Method of Mechanical Theorems) to calculate mathematical problems, such as the number of grains of sand in the universe (The Sand Reckoner), which also required a recursive notation for numbers (e.g., the myriad myriad).
The Antikythera mechanism is believed to be the earliest known geared computing device. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.[8]
According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[9][10] Programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers.[11]
During the Middle Ages, several European philosophers made attempts to produce analog computer devices. Influenced by the Arabs and Scholasticism, Majorcan philosopher Ramon Llull (1232–1315) devoted a great part of his life to defining and designing several logical machines that, by combining simple and undeniable philosophical truths, could produce all possible knowledge. These machines were never actually built, as they were more of a thought experiment to produce new knowledge in systematic ways; although they could perform simple logical operations, they still needed a human being for the interpretation of results. Moreover, they lacked a versatile architecture, each machine serving only very concrete purposes. Despite this, Llull's work had a strong influence on Gottfried Leibniz (early 18th century), who developed his ideas further and built several calculating tools using them.
The apex of this early era of mechanical computing can be seen in the Difference Engine and its successor the Analytical Engine, both by Charles Babbage. Babbage never completed constructing either engine, but in 2002 Doron Swade and a group of other engineers at the Science Museum in London completed Babbage's Difference Engine using only materials that would have been available in the 1840s.[12] By following Babbage's detailed design they were able to build a functioning engine, allowing historians to say, with some confidence, that if Babbage had been able to complete his Difference Engine it would have worked.[13] The more advanced Analytical Engine combined concepts from his previous work and that of others to create a device that, if constructed as designed, would have possessed many properties of a modern electronic computer, such as an internal "scratch memory" equivalent to RAM, multiple forms of output including a bell, a graph-plotter, and a simple printer, and a programmable input-output "hard" memory of punch cards which it could modify as well as read. The key advancement that Babbage's devices possessed beyond those created before him was that each component of the device was independent of the rest of the machine, much like the components of a modern electronic computer. This was a fundamental shift in thought; previous computational devices served only a single purpose, or at best had to be disassembled and reconfigured to solve a new problem. Babbage's devices could be reprogrammed to solve new problems by the entry of new data, and could act upon previous calculations within the same series of instructions. Ada Lovelace took this concept one step further, by creating a program for the Analytical Engine to calculate Bernoulli numbers, a complex calculation requiring a recursive algorithm. This is considered to be the first example of a true computer program, a series of instructions that act upon data not known in full until the program is run.
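As an illustration of the kind of recursive calculation involved, here is a minimal sketch in modern Python (not Lovelace's actual program or notation) of the standard recurrence for Bernoulli numbers, assuming the convention B₁ = −1/2:

```python
# A minimal sketch of the Bernoulli-number recurrence; modern Python, illustrative only.
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> Fraction:
    """Return B_n using the recurrence sum_{k=0}^{m} C(m+1, k) B_k = 0 for m >= 1."""
    B = [Fraction(1)]                        # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-acc / (m + 1))             # B_m = -acc / (m + 1)
    return B[n]

print([bernoulli(i) for i in range(7)])      # 1, -1/2, 1/6, 0, -1/30, 0, 1/42
```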
Following Babbage, although unaware of his earlier work, Percy Ludgate[14][15] in 1909 published the second of only two designs for mechanical analytical engines in history.[16] Two other inventors, Leonardo Torres Quevedo[17] and Vannevar Bush,[18] also did follow-on research based on Babbage's work. In his Essays on Automatics (1914) Torres presented the design of an electromechanical calculating machine and introduced the idea of floating-point arithmetic.[19][20] In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, an arithmetic unit connected to a remote typewriter, on which commands could be typed and the results printed automatically.[21][22] Bush's paper Instrumental Analysis (1936) discussed using existing IBM punch card machines to implement Babbage's design. In the same year, he started the Rapid Arithmetical Machine project to investigate the problems of constructing an electronic digital computer.
Several examples of analog computation survived into recent times. A planimeter is a device that computes integrals, using distance as the analog quantity. Until the 1980s, HVAC systems used air both as the analog quantity and the controlling element. Unlike modern digital computers, analog computers are not very flexible and need to be reconfigured (i.e., reprogrammed) manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems using behavioral analogues, while the earliest attempts at digital computers were quite limited.

Since computers were rare in this era, the solutions were often hard-coded into paper forms such as nomograms,[23] which could then produce analog solutions to these problems, such as the distribution of pressures and temperatures in a heating system.
The "brain" [computer] may one day come down to our level [of the common people] and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far.
— British newspaper The Star in a June 1949 news article about the EDSAC computer, long before the era of personal computers.[24]
In an 1886 letter, Charles Sanders Peirce described how logical operations could be carried out by electrical switching circuits.[25] During 1880–81 he showed that NOR gates alone (or NAND gates alone) can be used to reproduce the functions of all the other logic gates, but this work remained unpublished until 1933.[26] The first published proof was by Henry M. Sheffer in 1913, so the NAND logical operation is sometimes called the Sheffer stroke; the logical NOR is sometimes called Peirce's arrow.[27] Consequently, these gates are sometimes called universal logic gates.[28]
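For illustration, a minimal Python sketch of the universality property described above, building NOT, AND, and OR from NAND alone; the function names are illustrative:

```python
# A minimal sketch, assuming Boolean inputs, showing that NAND alone can
# reproduce the other basic logic gates (the "universal gate" property).
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)               # NOT x = x NAND x

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))         # AND = NOT(NAND)

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))   # OR = NAND of the negated inputs (De Morgan)

# Exhaustive check against Python's built-in operators:
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```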
Eventually, vacuum tubes replaced relays for logic operations. Lee De Forest's modification, in 1907, of the Fleming valve could be used as a logic gate. Ludwig Wittgenstein introduced a version of the 16-row truth table as proposition 5.101 of Tractatus Logico-Philosophicus (1921). Walther Bothe, inventor of the coincidence circuit, received part of the 1954 Nobel Prize in Physics for the first modern electronic AND gate in 1924. Konrad Zuse designed and built electromechanical logic gates for his computer Z1 (from 1935 to 1938).
The first recorded idea of using digital electronics for computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.[29] From 1934 to 1936, NEC engineer Akira Nakashima, Claude Shannon, and Victor Shestakov published papers introducing switching circuit theory, using digital electronics for Boolean algebraic operations.[30][31][32][33]
In 1936, Alan Turing published his seminal paper On Computable Numbers, with an Application to the Entscheidungsproblem,[34] in which he modeled computation in terms of a one-dimensional storage tape, leading to the idea of the universal Turing machine and Turing-complete systems.[citation needed]
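For illustration, a minimal Python sketch of a one-tape machine in the spirit of Turing's model; the transition-table encoding and the sample machine (a unary incrementer) are simplifications for this example, not taken from the paper:

```python
# A minimal sketch of a one-tape Turing machine; illustrative only.
def run(tape, transitions, state="start", blank="_", max_steps=1000):
    """Simulate a Turing machine; transitions maps (state, symbol) ->
    (new_state, written_symbol, move), with move in {-1, 0, +1}."""
    tape = dict(enumerate(tape))   # sparse tape indexed by integer positions
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, tape[head], move = transitions[(state, symbol)]
        head += move
    return "".join(tape[i] for i in sorted(tape))

# Example machine: append one '1' to a block of 1s (unary increment).
transitions = {
    ("start", "1"): ("start", "1", +1),   # skip over existing 1s
    ("start", "_"): ("halt", "1", 0),     # write a 1 on the first blank cell and halt
}
print(run("111", transitions))  # prints 1111
```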
The first digital electronic computer was developed between April 1936 and June 1939 at the IBM Patent Department in Endicott, New York, by Arthur Halsey Dickinson.[35][36][37] In this computer, IBM introduced a calculating device with a keyboard, processor, and electronic output (display). Its competitor was the digital electronic computer NCR3566, developed at NCR in Dayton, Ohio, by Joseph Desch and Robert Mumma between April and August 1939.[38][39] The IBM and NCR machines were decimal, executing addition and subtraction in binary position code.
In December 1939, John Atanasoff and Clifford Berry completed their experimental model to prove the concept of the Atanasoff–Berry computer (ABC), which began development in 1937.[40] This experimental model was binary, executed addition and subtraction in octal binary code, and was the first binary digital electronic computing device. The Atanasoff–Berry computer was intended to solve systems of linear equations, though it was not programmable. It was never truly completed due to Atanasoff's departure from Iowa State University in 1942 to work for the United States Navy.[41][42] Many people credit the ABC with many of the ideas used in later developments during the age of early electronic computing.[43]
The Z3 computer, built by German inventor Konrad Zuse in 1941, was the first programmable, fully automatic computing machine, but it was not electronic.
During World War II, ballistics computing was done by women, who were hired as "computers." The term "computer" remained one that referred mostly to women (now seen as "operators") until 1945, after which it took on the modern definition of machinery it presently holds.[44]
The ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general-purpose computer, announced to the public in 1946. It was Turing-complete,[45] digital, and capable of being reprogrammed to solve a full range of computing problems. Women implemented the programming for machines like the ENIAC, and men created the hardware.[44]
The Manchester Baby was the first electronic stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[46]
William Shockley, John Bardeen and Walter Brattain at Bell Labs invented the first working transistor, the point-contact transistor, in 1947, followed by the bipolar junction transistor in 1948.[47][48] At the University of Manchester in 1953, a team under the leadership of Tom Kilburn designed and built the first transistorized computer, called the Transistor Computer, a machine using the newly developed transistors instead of valves.[49] The first stored-program transistor computer was the ETL Mark III, developed by Japan's Electrotechnical Laboratory[50][51][52] from 1954[53] to 1956.[51] However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications.[54]
In 1954, 95% of computers in service were being used for engineering and scientific purposes.[55]
The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS transistor, was invented at Bell Labs between 1955 and 1960.[56][57][58][59][60][61][62] It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[54] The MOSFET made it possible to build high-density integrated circuit chips.[63][64] The MOSFET is the most widely used transistor in computers,[65][66] and is the fundamental building block of digital electronics.[67]
The silicon-gate MOS integrated circuit was developed by Federico Faggin at Fairchild Semiconductor in 1968.[68] This led to the development of the first single-chip microprocessor, the Intel 4004.[69] The Intel 4004 was developed from 1969 to 1970, led by Intel's Federico Faggin, Marcian Hoff, and Stanley Mazor, and Busicom's Masatoshi Shima.[70] The chip was mainly designed and realized by Faggin, with his silicon-gate MOS technology.[69] The microprocessor led to the microcomputer revolution, with the development of the microcomputer, which would later be called the personal computer (PC).
Most early microprocessors, such as the Intel 8008 and Intel 8080, were 8-bit. Texas Instruments released the first fully 16-bit microprocessor, the TMS9900 processor, in June 1976.[71] They used the microprocessor in the TI-99/4 and TI-99/4A computers.
The 1980s brought significant advances in microprocessors that greatly impacted the fields of engineering and other sciences. The Motorola 68000 microprocessor had a processing speed far superior to the other microprocessors in use at the time, and this newer, faster microprocessor allowed the microcomputers that followed to be far more capable. This was evident in the 1983 release of the Apple Lisa, one of the first personal computers with a graphical user interface (GUI) that was sold commercially. It ran on the Motorola 68000 CPU and used both dual floppy disk drives and a 5 MB hard drive for storage. The machine also had 1 MB of RAM, used for running software from disk without repeatedly rereading the disk.[72] After the Lisa's commercial failure, Apple released its first Macintosh computer, still running on the Motorola 68000 microprocessor but, to lower the price, with only 128 KB of RAM, one floppy drive, and no hard drive.
In the late 1980s and early 1990s, computers became more useful for personal and work purposes, such as word processing.[73] In 1989, Apple released the Macintosh Portable; it weighed 7.3 kg (16 lb) and was extremely expensive, costing US$7,300. At launch it was one of the most powerful laptops available, but due to the price and weight it was not met with great success and was discontinued only two years later. That same year, Intel introduced the Touchstone Delta supercomputer, which had 512 microprocessors. This technological advancement was very significant, as it served as a model for some of the fastest multi-processor systems in the world. Caltech researchers used it for projects such as real-time processing of satellite images and simulating molecular models for various fields of research.
In terms of supercomputing, the first widely acknowledged supercomputer was the Control Data Corporation (CDC) 6600,[74] built in 1964 by Seymour Cray. Its maximum speed was 40 MHz, or 3 million floating point operations per second (FLOPS). The CDC 6600 was replaced by the CDC 7600 in 1969;[75] although its normal clock speed was not faster than the 6600's, the 7600 was still faster due to its peak clock speed, which was approximately 30 times faster than that of the 6600. Although CDC was a leader in supercomputers, their relationship with Seymour Cray, which had already been deteriorating, completely collapsed. In 1972, Cray left CDC and began his own company, Cray Research Inc.[76] With support from Wall Street investors, an industry fueled by the Cold War, and without the restrictions he had within CDC, he created the Cray-1 supercomputer. With a clock speed of 80 MHz, or 136 megaFLOPS, Cray made a name for himself in the computing world. By 1982, Cray Research had produced the Cray X-MP, equipped with multiprocessing, and in 1985 released the Cray-2, which continued the trend of multiprocessing and clocked at 1.9 gigaFLOPS. Cray Research developed the Cray Y-MP in 1988 but afterward struggled to continue producing supercomputers, largely because the Cold War had ended: the demand for cutting-edge computing by colleges and the government declined drastically, while the demand for microprocessing units increased.
In 1998, David Bader developed the first Linux supercomputer using commodity parts.[77] While at the University of New Mexico, Bader sought to build a supercomputer running Linux using consumer off-the-shelf parts and a high-speed low-latency interconnection network. The prototype utilized an Alta Technologies "AltaCluster" of eight dual, 333 MHz, Intel Pentium II computers running a modified Linux kernel. Bader ported a significant amount of software to provide Linux support for necessary components as well as code from members of the National Computational Science Alliance (NCSA) to ensure interoperability, as none of it had been run on Linux previously.[78] Using the successful prototype design, he led the development of "RoadRunner," the first Linux supercomputer for open use by the national science and engineering community via the National Science Foundation's National Technology Grid. RoadRunner was put into production use in April 1999. At the time of its deployment, it was considered one of the 100 fastest supercomputers in the world.[78][79] Though Linux-based clusters using consumer-grade parts, such as Beowulf, existed before the development of Bader's prototype and RoadRunner, they lacked the scalability, bandwidth, and parallel computing capabilities to be considered "true" supercomputers.[78]
Today, supercomputers are still used by the governments of the world and educational institutions for computations such as simulations of natural disasters, genetic variant searches within a population relating to disease, and more. As of November 2024, the fastest supercomputer is El Capitan.
Starting with known special cases, the calculation of logarithms and trigonometric functions can be performed by looking up numbers in a mathematical table, and interpolating between known cases. For small enough differences, this linear operation was accurate enough for use in navigation and astronomy in the Age of Exploration. The uses of interpolation have thrived in the past 500 years: by the twentieth century Leslie Comrie and W.J. Eckert systematized the use of interpolation in tables of numbers for punch card calculation.
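For illustration, a minimal sketch of linear interpolation in a table of common logarithms; the rounded four-place table values and the function name are illustrative:

```python
# A minimal sketch of linear interpolation between tabulated values of log10(x).
table = {2.0: 0.3010, 2.1: 0.3222, 2.2: 0.3424}   # x -> log10(x), rounded to four places

def interpolate(x, lo, hi):
    """Linearly interpolate the tabulated function between two arguments lo < hi."""
    frac = (x - lo) / (hi - lo)
    return table[lo] + frac * (table[hi] - table[lo])

print(interpolate(2.05, 2.0, 2.1))   # about 0.3116; the true log10(2.05) is about 0.3118
```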
The numerical solution of differential equations, notably the Navier–Stokes equations, was an important stimulus to computing, with Lewis Fry Richardson's numerical approach to solving them. The first computerized weather forecast was performed in 1950 by a team composed of American meteorologists Jule Charney, Philip Duncan Thompson, and Larry Gates, Norwegian meteorologist Ragnar Fjørtoft, applied mathematician John von Neumann, and ENIAC programmer Klara Dan von Neumann.[80][81][82] To this day, some of the most powerful computer systems on Earth are used for weather forecasts.[83]
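For illustration, a minimal sketch of the finite-difference idea behind such numerical approaches, applied here to a simple one-dimensional diffusion equation rather than Richardson's actual forecasting scheme; the grid, step sizes, and initial data are arbitrary:

```python
# A minimal sketch of an explicit finite-difference step for the 1-D diffusion
# equation u_t = u_xx on a small grid; illustrative only.
dx, dt = 0.1, 0.004                  # grid spacing and time step (dt <= dx**2 / 2 for stability)
u = [0.0, 0.0, 1.0, 0.0, 0.0]        # initial profile with a spike in the middle

for _ in range(10):                  # advance ten time steps
    u = [u[i] if i in (0, len(u) - 1)                     # hold the boundaries fixed
         else u[i] + dt / dx**2 * (u[i-1] - 2*u[i] + u[i+1])
         for i in range(len(u))]

print([round(v, 3) for v in u])      # the initial spike spreads out and flattens
```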
By the late 1960s, computer systems could perform symbolic algebraic manipulations well enough to pass college-level calculus courses.[citation needed]
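For illustration, a modern example of the kind of symbolic calculus manipulation described, using the SymPy library rather than the 1960s systems themselves:

```python
# A modern illustration of symbolic differentiation and integration with SymPy.
import sympy as sp

x = sp.symbols("x")
expr = x**2 * sp.sin(x)

print(sp.diff(expr, x))       # derivative: x**2*cos(x) + 2*x*sin(x)
print(sp.integrate(expr, x))  # antiderivative: -x**2*cos(x) + 2*x*sin(x) + 2*cos(x)
```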
Women are often underrepresented in STEM fields when compared to their male counterparts.[84] In the modern era before the 1960s, computing was widely seen as "women's work", since it was associated with the operation of tabulating machines and other mechanical office work.[85][86] The accuracy of this association varied from place to place. In America, Margaret Hamilton recalled an environment dominated by men,[87] while Elsie Shutt recalled her surprise that even half of the computer operators at Raytheon were men.[88] Machine operators in Britain were mostly women into the early 1970s.[89] As these perceptions changed and computing became a high-status career, the field became more dominated by men.[90][91][92] Professor Janet Abbate, in her book Recoding Gender, writes:
Yet women were a significant presence in the early decades of computing. They made up the majority of the first computer programmers during World War II; they held positions of responsibility and influence in the early computer industry; and they were employed in numbers that, while a small minority of the total, compared favorably with women's representation in many other areas of science and engineering. Some female programmers of the 1950s and 1960s would have scoffed at the notion that programming would ever be considered a masculine occupation, yet these women’s experiences and contributions were forgotten all too quickly.[93]
Some notable examples of women in the history of computing are: