History of computing hardware

From Wikipedia, the free encyclopedia
Developments from devices for simple calculations to complex analog and digital computers

The history of computing hardware spans developments from early devices used for simple calculations to today's complex computers, encompassing advances in both analog and digital technology.

The first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result. In later stages, computing devices began representing numbers in continuous forms, such as by distance along a scale, rotation of a shaft, or a specific voltage level. Numbers could also be represented in the form of digits, automatically manipulated by a mechanism. Although this approach generally required more complex mechanisms, it greatly increased the precision of results. The development of transistor technology, followed by the invention of integrated circuit chips, led to revolutionary breakthroughs.

Transistor-based computers and, later, integrated circuit-based computers enabled digital systems to gradually replace analog systems, increasing both efficiency and processing power. Metal-oxide-semiconductor (MOS) large-scale integration (LSI) then enabled semiconductor memory and the microprocessor, leading to another key breakthrough, the miniaturized personal computer (PC), in the 1970s. The cost of computers gradually fell so low that personal computers became ubiquitous by the 1990s, followed by mobile computers (smartphones and tablets) in the 2000s.

Early devices

See also: Timeline of computing hardware before 1950

Ancient and medieval

The Ishango bone is thought to be a Paleolithic tally stick.[a]
Suanpan (The number represented on this abacus is 6,302,715,408.)

Devices have been used to aid computation for thousands of years, often using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. The Lebombo bone from the mountains between Eswatini and South Africa may be the oldest known mathematical artifact.[2] It dates from 35,000 BCE and consists of 29 distinct notches that were deliberately cut into a baboon's fibula.[3][4] Later record-keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers.[b][6][c] The use of counting rods is one example. The abacus was used early for arithmetic tasks. What is now called the Roman abacus was used in Babylonia as early as c. 2700–2300 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.

Several analog computers were constructed in ancient and medieval times to perform astronomical calculations. These included the astrolabe and Antikythera mechanism from the Hellenistic world (c. 150–100 BC).[8]

A Greek bronze combination lock from the Augustan or Hadrianic period operated on a primitive form of mechanical logic: the central bolt was physically blocked from retracting until the notches of two independent rotary dials were correctly aligned.[9] In Roman Egypt, Hero of Alexandria (c. 10–70 AD) made mechanical devices including automata and a programmable cart.[10] The steam-powered automatic flute described in the Book of Ingenious Devices (850) by the Persian-Baghdadi Banū Mūsā brothers may have been the first programmable device.[11]

Other early mechanical devices used to perform one or another type of calculation include the planisphere and other mechanical computing devices invented by Al-Biruni (c. AD 1000); the equatorium and universal latitude-independent astrolabe by Al-Zarqali (c. AD 1015); the astronomical analog computers of other medieval Muslim astronomers and engineers; and the astronomical clock tower of Su Song (1094) during the Song dynasty. The castle clock, a hydropowered mechanical astronomical clock invented by Ismail al-Jazari in 1206, has been described as the first programmable analog computer, though this claim is disputed.[12][13][14] Ramon Llull invented the Lullian Circle: a notional machine for calculating answers to philosophical questions (in this case, to do with Christianity) via logical combinatorics. This idea was taken up by Leibniz centuries later, and is thus one of the founding elements in computing and information science.

Renaissance calculating tools


Scottish mathematician and physicist John Napier discovered that the multiplication and division of numbers could be performed by the addition and subtraction, respectively, of the logarithms of those numbers. While producing the first logarithmic tables, Napier needed to perform many tedious multiplications. It was at this point that he designed his 'Napier's bones', an abacus-like device that greatly simplified calculations involving multiplication and division.[d]
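
Napier's reduction of multiplication to the addition of logarithms can be stated in a few lines of modern arithmetic (an illustration of the principle, not of Napier's own tables):

```python
import math

# Napier's insight: multiplying two numbers reduces to adding their
# logarithms, then converting the sum back with the exponential.
a, b = 347.0, 29.0
product_via_logs = math.exp(math.log(a) + math.log(b))
direct_product = a * b
# The two results agree to floating-point precision.
```

A table of logarithms turned every multiplication into two look-ups and one addition, which is exactly the saving Napier was after.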

A modern slide rule

Since real numbers can be represented as distances or intervals on a line, the slide rule was invented in the 1620s, shortly after Napier's work, to allow multiplication and division operations to be carried out significantly faster than was previously possible.[15] Edmund Gunter built a calculating device with a single logarithmic scale at the University of Oxford. His device greatly simplified arithmetic calculations, including multiplication and division. William Oughtred greatly improved this in 1630 with his circular slide rule. He followed this up with the modern slide rule in 1632, essentially a combination of two Gunter rules, held together with the hands. Slide rules were used by generations of engineers and other mathematically involved professional workers, until the invention of the pocket calculator.[16]

Mechanical calculators


In 1609, Guidobaldo del Monte made a mechanical multiplier to calculate fractions of a degree. Based on a system of four gears, the rotation of an index on one quadrant corresponded to 60 rotations of another index on an opposite quadrant.[17] Thanks to this machine, errors in the calculation of first, second, third and quarter degrees could be avoided. Guidobaldo was the first to document the use of gears for mechanical calculation.

Wilhelm Schickard, a German polymath, designed a calculating machine in 1623 which combined a mechanized form of Napier's rods with the world's first mechanical adding machine built into the base. Because it made use of a single-tooth gear, there were circumstances in which its carry mechanism would jam.[18] A fire destroyed at least one of the machines in 1624, and it is believed Schickard was too disheartened to build another.

View through the back of Pascal's calculator. Pascal invented his machine in 1642.

In 1642, while still a teenager, Blaise Pascal started some pioneering work on calculating machines and after three years of effort and 50 prototypes[19] he invented a mechanical calculator.[20][21] He built twenty of these machines (called Pascal's calculator or Pascaline) in the following ten years.[22] Nine Pascalines have survived, most of which are on display in European museums.[e] A continuing debate exists over whether Schickard or Pascal should be regarded as the "inventor of the mechanical calculator", and the range of issues to be considered is discussed elsewhere.[23]

A set of John Napier's calculating tables from around 1680

Gottfried Wilhelm von Leibniz invented the stepped reckoner and his famous stepped drum mechanism around 1672. He attempted to create a machine that could be used not only for addition and subtraction but would use a moveable carriage to enable multiplication and division. Leibniz once said "It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used."[24] However, Leibniz did not incorporate a fully successful carry mechanism. Leibniz also described the binary numeral system,[25] a central ingredient of all modern computers. However, up to the 1940s, many subsequent designs (including Charles Babbage's machines of 1822 and even ENIAC of 1945) were based on the decimal system.[f]
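
Leibniz's binary system writes every number using only the symbols 0 and 1, each position weighted by a power of two; a brief modern illustration (the value 1945 is an arbitrary example):

```python
# Convert a decimal number to binary and back: each bit position k
# carries the weight 2**k, which is all the system needs.
n = 1945
bits = bin(n)[2:]          # binary digit string for n
round_trip = int(bits, 2)  # summing the powers of two recovers n
```

The same round trip works for any non-negative integer, which is why two-state electrical components suffice to represent arbitrary numbers.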

Detail of an arithmometer built before 1851. The one-digit multiplier cursor (ivory top) is the leftmost cursor.

Around 1820, Charles Xavier Thomas de Colmar created what would over the rest of the century become the first successful, mass-produced mechanical calculator, the Thomas Arithmometer. It could be used to add and subtract, and with a moveable carriage the operator could also multiply and divide by a process of long multiplication and long division.[26] It utilised a stepped drum similar in conception to that invented by Leibniz. Mechanical calculators remained in use until the 1970s.

Punched-card data processing


In 1804, French weaver Joseph Marie Jacquard developed a loom in which the pattern being woven was controlled by a paper tape constructed from punched cards. The paper tape could be changed without altering the mechanical design of the loom. This was a landmark achievement in programmability. His machine was an improvement over similar weaving looms. Punched cards were preceded by punch bands, as in the machine proposed by Basile Bouchon. These bands would inspire information recording for automatic pianos and, more recently, numerical control machine tools.

IBM punched-card accounting machines, 1936

In the late 1880s, the American Herman Hollerith invented data storage on punched cards that could then be read by a machine.[27] To process these punched cards, he invented the tabulator and the keypunch machine. His machines used electromechanical relays and counters.[28] Hollerith's method was used in the 1890 United States census. That census was processed two years faster than the prior census had been.[29] Hollerith's company eventually became the core of IBM.

By 1920, electromechanical tabulating machines could add, subtract, and print accumulated totals.[30] Machine functions were directed by inserting dozens of wire jumpers into removable control panels. When the United States instituted Social Security in 1935, IBM punched-card systems were used to process records of 26 million workers.[31] Punched cards became ubiquitous in industry and government for accounting and administration.

Leslie Comrie's articles on punched-card methods[32] and W. J. Eckert's publication of Punched Card Methods in Scientific Computation in 1940 described punched-card techniques sufficiently advanced to solve some differential equations or perform multiplication and division using floating-point representations, all on punched cards and unit record machines.[33] Such machines were used during World War II for cryptographic statistical processing,[34] as well as a vast number of administrative uses. The Astronomical Computing Bureau of Columbia University performed astronomical calculations representing the state of the art in computing.[35][36]

Calculators

Main article: Calculator
The Curta calculator could also do multiplication and division.

By the 20th century, earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position as the representation for the state of a variable. The word "computer" was a job title assigned to primarily women who used these calculators to perform mathematical calculations.[37] By the 1920s, British scientist Lewis Fry Richardson's interest in weather prediction led him to propose human computers and numerical analysis to model the weather; to this day, the most powerful computers on Earth are needed to adequately model its weather using the Navier–Stokes equations.[38]

Companies like Friden, Marchant Calculator and Monroe made desktop mechanical calculators from the 1930s that could add, subtract, multiply and divide.[39] In 1948, the Curta was introduced by Austrian inventor Curt Herzstark. It was a small, hand-cranked mechanical calculator and, as such, a descendant of Gottfried Leibniz's stepped reckoner and Thomas's Arithmometer.

The world's first all-electronic desktop calculator was the British Bell Punch ANITA, released in 1961.[40][41] It used vacuum tubes, cold-cathode tubes and Dekatrons in its circuits, with 12 cold-cathode "Nixie" tubes for its display. The ANITA sold well since it was the only electronic desktop calculator available, and was silent and quick. The tube technology was superseded in June 1963 by the U.S.-manufactured Friden EC-130, which had an all-transistor design, a stack of four 13-digit numbers displayed on a 5-inch (13 cm) CRT, and introduced reverse Polish notation (RPN).
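
Reverse Polish notation dispenses with parentheses by placing each operator after its operands, so a simple stack suffices to evaluate any expression. The sketch below illustrates the idea in modern code (it models the notation, not the EC-130's actual circuitry):

```python
def eval_rpn(tokens):
    """Evaluate a reverse Polish notation expression with a stack:
    operands are pushed; each operator pops two values and pushes
    the result."""
    stack = []
    ops = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
           '*': lambda a, b: a * b, '/': lambda a, b: a / b}
    for tok in tokens:
        if tok in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# (3 + 4) * 2 written in RPN:
result = eval_rpn("3 4 + 2 *".split())
```

Because intermediate results simply wait on the stack, an RPN machine needs no parenthesis handling at all, one reason the scheme suited early calculator hardware.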

First proposed general-purpose computing device

Main article: Analytical Engine
A portion of Babbage's Difference Engine
Trial model of a part of the Analytical Engine, built by Babbage, as displayed at the Science Museum, London

The Industrial Revolution (late 18th to early 19th century) had a significant impact on the evolution of computing hardware, as the era's rapid advancements in machinery and manufacturing laid the groundwork for mechanized and automated computing. Industrial needs for precise, large-scale calculations—especially in fields such as navigation, engineering, and finance—prompted innovations in both design and function, setting the stage for devices like Charles Babbage's difference engine (1822).[42][43] This mechanical device was intended to automate the calculation of polynomial functions and represented one of the earliest applications of computational logic.[44]

Babbage, often regarded as the "father of the computer," envisioned a fully mechanical system of gears and wheels, powered by steam, capable of handling complex calculations that previously required intensive manual labor.[45] His difference engine, designed to aid navigational calculations, ultimately led him to conceive the analytical engine in 1833.[46] This concept, far more advanced than his difference engine, included an arithmetic logic unit, control flow through conditional branching and loops, and integrated memory.[47] Babbage's plans made his analytical engine the first general-purpose design that could be described as Turing-complete in modern terms.[48][49]

The analytical engine was programmed using punched cards, a method adapted from the Jacquard loom invented by Joseph Marie Jacquard in 1804, which controlled textile patterns with a sequence of punched cards.[50] These cards became foundational in later computing systems as well.[51] Babbage's machine would have featured multiple output devices, including a printer, a curve plotter, and even a bell, demonstrating his ambition for versatile computational applications beyond simple arithmetic.[52]

Ada Lovelace expanded on Babbage's vision by conceptualizing algorithms that could be executed by his machine.[53] Her notes on the analytical engine, written in the 1840s, are now recognized as the earliest examples of computer programming.[54] Lovelace saw potential in computers to go beyond numerical calculations, predicting that they might one day generate complex musical compositions or perform tasks like language processing.[55]

Although Babbage's designs were never fully realized due to technical and financial challenges, they influenced a range of subsequent developments in computing hardware. Notably, in the 1890s, Herman Hollerith adapted the idea of punched cards for automated data processing, which was utilized in the U.S. Census and sped up data tabulation significantly, bridging industrial machinery with data processing.[56]

The Industrial Revolution's advancements in mechanical systems demonstrated the potential for machines to conduct complex calculations, influencing engineers like Leonardo Torres Quevedo and Vannevar Bush in the early 20th century. Torres Quevedo designed an electromechanical machine with floating-point arithmetic,[57] while Bush's later work explored electronic digital computing.[58] By the mid-20th century, these innovations paved the way for the first fully electronic computers.[59]

Analog computers

Main article: Analog computer
Further information: Mechanical computer
Sir William Thomson's third tide-predicting machine design, 1879–81

In the first half of the 20th century, analog computers were considered by many to be the future of computing. These devices used the continuously changeable aspects of physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved, in contrast to digital computers, which represented varying quantities symbolically as their numerical values changed. As an analog computer does not use discrete values but rather continuous values, processes cannot be reliably repeated with exact equivalence, as they can with Turing machines.[60]

The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson, later Lord Kelvin, in 1872. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location and was of great utility to navigation in shallow waters. His device was the foundation for further developments in analog computing.[61]

The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the brother of the more famous Lord Kelvin. He explored the possible construction of such calculators but was stymied by the limited output torque of the ball-and-disk integrators.[62] In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output.
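
That chaining of integrators can be mimicked numerically: to solve y'' = -y, a first integrator turns y'' into y' and feeds a second, which produces y. The stepwise sketch below (simple numerical integration standing in for the wheel-and-disc mechanisms) follows the same wiring:

```python
import math

# Two chained "integrators" solving y'' = -y with y(0)=1, y'(0)=0.
# Each loop step feeds the output of the first integrator (dy)
# straight into the second, just as the analyser did mechanically.
dt = 0.001
y, dy = 1.0, 0.0
for _ in range(int(math.pi / dt)):  # integrate from t = 0 to ~pi
    dy += -y * dt                   # first integrator: y'' -> y'
    y += dy * dt                    # second integrator: y' -> y
# y now approximates cos(pi) = -1
```

The exact solution is y = cos(t), and the numerical result after integrating to t ≈ π is close to -1, showing how a purely mechanical chain of integrators could trace trigonometric solutions.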

A notable series of analog calculating machines was developed by Leonardo Torres Quevedo beginning in 1895, including one that was able to compute the roots of arbitrary polynomials of order eight, including the complex ones, with a precision down to thousandths.[63][64][65]

A Mk. I Drift Sight. The lever just in front of the bomb aimer's fingertips sets the altitude, the wheels near his knuckles set the wind and airspeed.

An important advance in analog computing was the development of the first fire-control systems for long-range ship gunlaying. When gunnery ranges increased dramatically in the late 19th century, it was no longer a simple matter of calculating the proper aim point, given the flight times of the shells. Various spotters on board the ship would relay distance measures and observations to a central plotting station. There the fire direction teams fed in the location, speed and direction of the ship and its target, as well as various adjustments for the Coriolis effect, weather effects on the air, and other factors; the computer would then output a firing solution, which would be fed to the turrets for laying. In 1912, British engineer Arthur Pollen developed the first electrically powered mechanical analogue computer (called at the time the Argo Clock).[citation needed] It was used by the Imperial Russian Navy in World War I.[citation needed] The alternative Dreyer Table fire control system was fitted to British capital ships by mid-1916.

Mechanical devices were also used to aid the accuracy of aerial bombing. Drift Sight was the first such aid, developed by Harry Wimperis in 1916 for the Royal Naval Air Service; it measured the wind speed from the air, and used that measurement to calculate the wind's effects on the trajectory of the bombs. The system was later improved with the Course Setting Bomb Sight, and reached a climax with World War II bomb sights: the Mark XIV bomb sight (RAF Bomber Command) and the Norden[66] (United States Army Air Forces).

The art of mechanical analog computing reached its zenith with the differential analyzer,[67] built by H. L. Hazen and Vannevar Bush at MIT starting in 1927, which built on the mechanical integrators of James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious; the most powerful was constructed at the University of Pennsylvania's Moore School of Electrical Engineering, where the ENIAC was built.

A fully electronic analog computer was built by Helmut Hölzer in 1942 at Peenemünde Army Research Center.[68][69][70]

By the 1950s the success of digital electronic computers had spelled the end for most analog computing machines, but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950s and 1960s, and later in some specialized applications.

Advent of the digital computer

Parts from four early computers, 1962. From left to right: ENIAC board, EDVAC board, ORDVAC board, and BRLESC-I board, showing the trend toward miniaturization.

The principle of the modern computer was first described by computer scientist Alan Turing, who set out the idea in his seminal 1936 paper,[71] On Computable Numbers. Turing reformulated Kurt Gödel's 1931 results on the limits of proof and computation, replacing Gödel's universal arithmetic-based formal language with the formal and simple hypothetical devices that became known as Turing machines. He proved that some such machine would be capable of performing any conceivable mathematical computation if it were representable as an algorithm. He went on to prove that there was no solution to the Entscheidungsproblem by first showing that the halting problem for Turing machines is undecidable: in general, it is not possible to decide algorithmically whether a given Turing machine will ever halt.

He also introduced the notion of a "universal machine" (now known as a universal Turing machine), with the idea that such a machine could perform the tasks of any other machine; in other words, it is provably capable of computing anything that is computable by executing a program stored on tape, allowing the machine to be programmable. John von Neumann acknowledged that the central concept of the modern computer was due to this paper.[72] Turing machines are to this day a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
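
The machine Turing described is simple enough to imitate in a few lines: a table of (state, symbol) rules drives a read/write head over a tape. The toy simulator below (an illustration of the model, not Turing's own formalism; all names are invented for the example) runs a machine that inverts a string of bits:

```python
def run_tm(tape, rules, state="start", pos=0, max_steps=1000):
    """Minimal Turing-machine simulator. `rules` maps (state, symbol)
    to (new_symbol, move, new_state); '_' is the blank symbol and the
    machine stops on reaching state 'halt'."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        sym = cells.get(pos, "_")
        new_sym, move, state = rules[(state, sym)]
        cells[pos] = new_sym
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# A machine that flips every bit, then halts at the first blank.
flip = {("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt")}
inverted = run_tm("1011", flip)  # bits flipped, trailing blank kept
```

Swapping in a different rule table changes what the machine computes, which is precisely the programmability Turing's universal machine generalizes.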

Electromechanical computers

Further information: Mechanical computer § Electro-mechanical computers

The era of modern computing began with a flurry of development before and during World War II. Most digital computers built in this period were electromechanical: electric switches drove mechanical relays to perform the calculation. These mechanical components had a low operating speed and were eventually superseded by much faster all-electric components, originally using vacuum tubes and later transistors.

The Z2 was one of the earliest examples of an electrically operated digital computer built with electromechanical relays, and was created by civil engineer Konrad Zuse in 1940 in Germany. It was an improvement on his earlier, mechanical Z1; although it used the same mechanical memory, it replaced the arithmetic and control logic with electrical relay circuits.[73]

In the same year, electro-mechanical devices called bombes were built by British cryptologists to help decipher German Enigma-machine-encrypted secret messages during World War II. The bombe's initial design was created in 1939 at the UK Government Code and Cypher School at Bletchley Park by Alan Turing,[74] with an important refinement devised in 1940 by Gordon Welchman.[75] The engineering design and construction were the work of Harold Keen of the British Tabulating Machine Company. It was a substantial development from a device that had been designed in 1938 by Polish Cipher Bureau cryptologist Marian Rejewski, known as the "cryptologic bomb" (Polish: bomba kryptologiczna).

Replica of Zuse's Z3, the first fully automatic, digital (electromechanical) computer

In 1941, Zuse followed his earlier machine up with the Z3,[73] the world's first working electromechanical, programmable, fully automatic digital computer.[76] The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz.[77] Program code and data were stored on punched film. It was similar to modern machines in several respects, pioneering numerous advances such as floating-point numbers. Replacement of the hard-to-implement decimal system (used in Charles Babbage's earlier design) by the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.[78] Despite lacking explicit conditional execution, the Z3 was proven to be a theoretically Turing-complete machine in 1998 by Raúl Rojas.[79] In two 1936 patent applications, Zuse also anticipated that machine instructions could be stored in the same storage used for data, the key insight of what became known as the von Neumann architecture, first implemented in 1948 in America in the electromechanical IBM SSEC and in Britain in the fully electronic Manchester Baby.[80]

Zuse suffered setbacks during World War II when some of his machines were destroyed in the course of Allied bombing campaigns. His work remained largely unknown to engineers in the UK and US until much later, although at least IBM was aware of it, as it financed his post-war startup company in 1946 in return for an option on Zuse's patents.

In 1944, the Harvard Mark I was constructed at IBM's Endicott laboratories.[81] It was a general-purpose electro-mechanical computer similar to the Z3, but was not quite Turing-complete.

Digital computation


The term digital was first suggested by George Robert Stibitz and refers to a signal, such as a voltage, being used not to directly represent a value (as it would be in an analog computer), but to encode it. In November 1937, Stibitz, then working at Bell Labs (1930–1941),[82] completed a relay-based calculator he later dubbed the "Model K" (for "kitchen table", on which he had assembled it), which became the first binary adder.[83] Typically signals have two states: low (usually representing 0) and high (usually representing 1), but sometimes three-valued logic is used, especially in high-density memory. Modern computers generally use binary logic, but many early machines were decimal computers. In these machines, the basic unit of data was the decimal digit, encoded in one of several schemes, including binary-coded decimal (BCD), bi-quinary, excess-3, and two-out-of-five code.
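
Binary-coded decimal, the most common of these schemes, stores each decimal digit in its own four-bit group rather than converting the whole number to binary. A small sketch (the helper name is invented for illustration) makes the encoding concrete:

```python
def to_bcd(n):
    """Encode a non-negative integer as binary-coded decimal:
    each decimal digit becomes an independent 4-bit group."""
    return " ".join(format(int(d), "04b") for d in str(n))

encoded = to_bcd(1937)  # one 4-bit group per decimal digit
```

Note the contrast with pure binary: in BCD, 1937 keeps its four decimal digits visible as four separate nibbles, which suited machines built around the decimal digit as the basic unit of data.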

The mathematical basis of digital computing is Boolean algebra, developed by the British mathematician George Boole in his work The Laws of Thought, published in 1854. His Boolean algebra was further refined in the 1860s by William Jevons and Charles Sanders Peirce, and was first presented systematically by Ernst Schröder and A. N. Whitehead.[84] In 1879 Gottlob Frege developed the formal approach to logic and proposed the first logic language for logical equations.[85]

In the 1930s, working independently, American electronic engineer Claude Shannon and Soviet logician Victor Shestakov both showed a one-to-one correspondence between the concepts of Boolean logic and certain electrical circuits, now called logic gates, which are now ubiquitous in digital computers.[86] They showed that electronic relays and switches can realize the expressions of Boolean algebra.[87] This thesis essentially founded practical digital circuit design. In addition, Shannon's paper gives a correct circuit diagram for a 4-bit digital binary adder.[88]
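
That correspondence can be seen directly in code: the full adder below is built purely from the Boolean operations XOR, AND and OR, and chaining four of them gives the kind of 4-bit binary adder Shannon's paper diagrammed (a modern sketch of the standard ripple-carry construction, not Shannon's exact circuit):

```python
def full_adder(a, b, cin):
    """One-bit full adder built only from Boolean gates:
    sum = a XOR b XOR carry-in; carry-out from AND/OR terms."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def add4(x, y):
    """4-bit ripple-carry adder: four full adders chained so each
    stage's carry-out feeds the next stage's carry-in, LSB first."""
    carry, total = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        total |= s << i
    return total, carry  # 4-bit sum and the final carry-out

# 11 + 6 = 17 overflows 4 bits: sum 0001 with carry-out 1.
result = add4(0b1011, 0b0110)
```

Every step here is a relay-or-switch-realizable Boolean expression, which is exactly the bridge between Boole's algebra and physical circuits that Shannon and Shestakov established.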

Electronic data processing

Atanasoff–Berry Computer replica on the first floor of the Durham Center, Iowa State University

Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. Machines such as the Z3, the Atanasoff–Berry Computer, the Colossus computers, and the ENIAC were built by hand, using circuits containing relays or valves (vacuum tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium.[89]

Engineer Tommy Flowers joined the telecommunications branch of the General Post Office in 1926. While working at the research station in Dollis Hill in the 1930s, he began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes.[61]

In the US, in 1940, Arthur Dickinson (IBM) invented the first digital electronic computer.[90] This calculating device was fully electronic in its control, calculations and output (the first electronic display).[91] John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed the Atanasoff–Berry Computer (ABC) in 1942,[92] the first binary electronic digital calculating device.[93] This design was semi-electronic (electro-mechanical control and electronic calculations), and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. However, its paper card writer/reader was unreliable and the regenerative drum contact system was mechanical. The machine's special-purpose nature and lack of a changeable, stored program distinguish it from modern computers.[94]

Computers whose logic was primarily built using vacuum tubes are now known as first-generation computers.

The electronic programmable computer

Main articles: Colossus computer and ENIAC
Colossus was the first electronic digital programmable computing device, and was used to break German ciphers during World War II. It remained unknown, as a military secret, well into the 1970s.

During World War II, British codebreakers at Bletchley Park, 40 miles (64 km) north of London, achieved a number of successes at breaking encrypted enemy military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes.[95] They ruled out possible Enigma settings by performing chains of logical deductions implemented electrically. Most possibilities led to a contradiction, and the few remaining could be tested by hand.

The Germans also developed a series of teleprinter encryption systems, quite different from Enigma. The Lorenz SZ 40/42 machine was used for high-level Army communications, code-named "Tunny" by the British. The first intercepts of Lorenz messages began in 1941. As part of an attack on Tunny, Max Newman and his colleagues developed the Heath Robinson, a fixed-function machine to aid in code breaking.[96] Tommy Flowers, a senior engineer at the Post Office Research Station,[97] was recommended to Max Newman by Alan Turing[98] and spent eleven months from early February 1943 designing and building the more flexible Colossus computer (which superseded the Heath Robinson).[99][100] After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944[101] and attacked its first message on 5 February.[102] By the time Germany surrendered in May 1945, there were ten Colossi working at Bletchley Park.[103]

Wartime photo of Colossus No. 10

Colossus was the world's first electronic digital programmable computer.[61] It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of Boolean logical operations on its data,[104] but it was not Turing-complete. Data input to Colossus was by photoelectric reading of a paper tape transcription of the enciphered intercepted message. This was arranged in a continuous loop so that it could be read and re-read multiple times, there being no internal store for the data. The reading mechanism ran at 5,000 characters per second with the paper tape moving at 40 ft/s (12.2 m/s; 27.3 mph). Colossus Mark 1 contained 1500 thermionic valves (tubes), but Mark 2, with 2400 valves and five processors in parallel, was both five times faster and simpler to operate than Mark 1, greatly speeding the decoding process. Mark 2 was designed while Mark 1 was being constructed. Allen Coombs took over leadership of the Colossus Mark 2 project when Tommy Flowers moved on to other projects.[105] The first Mark 2 Colossus became operational on 1 June 1944, just in time for the Allied Invasion of Normandy on D-Day.

Most of the use of Colossus was in determining the start positions of the Tunny rotors for a message, which was called "wheel setting". Colossus included the first-ever use of shift registers and systolic arrays, enabling five simultaneous tests, each involving up to 100 Boolean calculations. This enabled five different possible start positions to be examined for one transit of the paper tape.[106] As well as wheel setting, some later Colossi included mechanisms intended to help determine pin patterns, known as "wheel breaking". Both models were programmable using switches and plug panels in a way their predecessors had not been.
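Wheel setting can be pictured with a toy model. In the sketch below, the single-bit "wheel", the scoring condition, and every parameter are invented for illustration; the actual Tunny attack used five-bit teleprinter code and far more involved statistics. The point it demonstrates is the one described above: several candidate start positions are scored against the looped tape in a single transit, and the true start produces the standout count.

```python
# Toy model of Colossus-style wheel setting (illustrative only; not the
# real Tunny attack). Several candidate wheel start positions are scored
# in one pass over the looped tape, mimicking the five parallel tests.
def score_positions(tape, wheel, starts):
    """Count, per candidate start, how often a simple Boolean condition
    holds between a tape character and the simulated wheel stream."""
    counts = {s: 0 for s in starts}
    for i, z in enumerate(tape):
        for s in starts:                      # Colossus ran five such tests in parallel
            k = wheel[(i + s) % len(wheel)]
            if z ^ k == 0:                    # Boolean test, tallied by a counter
                counts[s] += 1
    return counts

# Invented 41-pin wheel pattern and a "ciphertext" made with start position 7.
wheel = [1 if i % 3 == 0 or i % 5 == 1 else 0 for i in range(41)]
tape = [wheel[(i + 7) % 41] for i in range(200)]
scores = score_positions(tape, wheel, [5, 6, 7, 8, 9])
# The correct start (7) matches on every character, so it scores highest.
```

On the real machine the five counters ran in hardware against thresholds set on switches, so one pass of the tape loop disposed of five candidate settings at once.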

ENIAC was the first Turing-complete electronic device, and performed ballistics trajectory calculations for the United States Army.[107]

Without the use of these machines, the Allies would have been deprived of the very valuable intelligence obtained from reading the vast quantity of enciphered high-level telegraphic messages between the German High Command (OKW) and their army commands throughout occupied Europe. Details of their existence, design, and use were kept secret well into the 1970s. Winston Churchill personally issued an order for their destruction into pieces no larger than a man's hand, to keep secret that the British were capable of cracking Lorenz SZ cyphers (from German rotor stream cipher machines) during the oncoming Cold War. Two of the machines were transferred to the newly formed GCHQ and the others were destroyed. As a result, the machines were not included in many histories of computing.[g] A reconstructed working copy of one of the Colossus machines is now on display at Bletchley Park.

The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the US. Although the ENIAC used technology similar to that of the Colossi, it was much faster, more flexible, and Turing-complete. Like the Colossi, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored-program electronic machines that came later. Once a program was ready to be run, it had to be set into the machine manually, by resetting plugs and switches. The programmers of the ENIAC were women who had been trained as mathematicians.[108]

It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5,000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and take square roots. High-speed memory was limited to 20 words (equivalent to about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons and using 200 kilowatts of electric power, and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.[109] One of its major engineering feats was to minimize the effects of tube burnout, a common reliability problem at that time. The machine was in almost constant use for the next ten years.

Stored-program computer

[edit]
Main article: Stored-program computer
Further information: List of vacuum-tube computers
Design of the von Neumann architecture, 1947

The theoretical basis for the stored-program computer was proposed by Alan Turing in his 1936 paper On Computable Numbers.[71] Whilst Turing was at Princeton University working on his PhD, John von Neumann got to know him and became intrigued by his concept of a universal computing machine.[110]

Early computing machines executed a fixed sequence of steps, known as a 'program', that could be altered by changing electrical connections using switches or a patch panel (or plugboard). However, this process of 'reprogramming' was often difficult and time-consuming, requiring engineers to create flowcharts and physically re-wire the machines.[111] Stored-program computers, by contrast, were designed to store a set of instructions (a program) in memory, typically the same memory as stored data.

ENIAC inventors John Mauchly and J. Presper Eckert proposed, in August 1944, the construction of a machine called the Electronic Discrete Variable Automatic Computer (EDVAC), and design work for it commenced at the University of Pennsylvania's Moore School of Electrical Engineering before the ENIAC was fully operational. The design implemented a number of important architectural and logical improvements conceived during the ENIAC's construction, as well as a high-speed serial-access memory.[112] However, Eckert and Mauchly left the project and its construction foundered.

In 1945, von Neumann visited the Moore School and wrote notes on what he saw, which he sent to the project. The U.S. Army liaison there had them typed and circulated as the First Draft of a Report on the EDVAC. The draft did not mention Eckert and Mauchly and, despite its incomplete nature and questionable lack of attribution of the sources of some of the ideas,[61] the computer architecture it outlined became known as the 'von Neumann architecture'.

In 1945, Turing joined the UK National Physical Laboratory and began work on developing an electronic stored-program digital computer. His late-1945 report 'Proposed Electronic Calculator' was the first reasonably detailed specification for such a device. Turing presented a more detailed paper to the National Physical Laboratory (NPL) Executive Committee in March 1946, giving the first substantially complete design of a stored-program computer, a device that was called the Automatic Computing Engine (ACE).

Turing considered the speed and the size of computer memory to be crucial elements,[113]: p.4 so he proposed a high-speed memory of what would today be called 25 KB, accessed at a speed of 1 MHz. The ACE implemented subroutine calls, whereas the EDVAC did not, and the ACE also used Abbreviated Computer Instructions, an early form of programming language.

Manchester Baby

[edit]
Main article: Manchester Baby
Three tall racks containing electronic circuit boards
A section of the rebuilt Manchester Baby, the first electronic stored-program computer

The Manchester Baby (Small Scale Experimental Machine, SSEM) was the world's first electronic stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[114]

The machine was not intended to be a practical computer, but was instead designed as a testbed for the Williams tube, the first random-access digital storage device.[115] Invented by Freddie Williams and Tom Kilburn[116][117] at the University of Manchester in 1946 and 1947, the Williams tube was a cathode-ray tube that used an effect called secondary emission to temporarily store electronic binary data, and it was used successfully in several early computers.

Described as small and primitive in a 1998 retrospective, the Baby was the first working machine to contain all of the elements essential to a modern electronic computer.[118] As soon as it had demonstrated the feasibility of its design, a project was initiated at the university to develop it into a more usable computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer.[119]

The Baby had a 32-bit word length and a memory of 32 words. As it was designed to be the simplest possible stored-program computer, the only arithmetic operations implemented in hardware were subtraction and negation; other arithmetic operations were implemented in software. The first of three programs written for the machine found the highest proper divisor of 2^18 (262,144), a calculation that was known would take a long time to run (and so prove the computer's reliability) by testing every integer from 2^18 − 1 downwards, as division was implemented by repeated subtraction of the divisor. The program consisted of 17 instructions and ran for 52 minutes before producing the correct answer of 131,072, after the Baby had performed 3.5 million operations (for an effective CPU speed of 1.1 kIPS). The successive approximations to the answer were displayed as a pattern of dots on the output CRT, which mirrored the pattern held on the Williams tube used for storage.
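The algorithm of the Baby's first program can be sketched in modern terms. The code below is an illustrative reconstruction of the procedure described above (trial divisors tested downwards, divisibility checked by repeated subtraction, since the hardware had only subtraction and negation); it is not a transcription of Kilburn's 17 instructions.

```python
def highest_proper_divisor(n):
    """Find the highest proper divisor of n by trying every candidate from
    n - 1 downwards, testing divisibility by repeated subtraction (the Baby
    had no divide or multiply hardware, only subtraction and negation)."""
    for candidate in range(n - 1, 0, -1):
        remainder = n
        while remainder > 0:          # "division" by repeated subtraction
            remainder -= candidate
        if remainder == 0:            # exact division: candidate divides n
            return candidate

print(highest_proper_divisor(2 ** 18))  # prints 131072, as the Baby found
```

For 2^18, the first candidate that divides exactly is 2^17 = 131,072, matching the answer the Baby displayed after 3.5 million operations.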

Manchester Mark 1

[edit]

The SSEM led to the development of the Manchester Mark 1 at the University of Manchester.[120] Work began in August 1948, and the first version was operational by April 1949; a program written to search for Mersenne primes ran error-free for nine hours on the night of 16/17 June 1949. The machine's successful operation was widely reported in the British press, which used the phrase "electronic brain" in describing it to their readers.

The computer is especially historically significant because of its pioneering inclusion of index registers, an innovation which made it easier for a program to read sequentially through an array of words in memory. Thirty-four patents resulted from the machine's development, and many of the ideas behind its design were incorporated in subsequent commercial products such as the IBM 701 and 702 as well as the Ferranti Mark 1. The chief designers, Frederic C. Williams and Tom Kilburn, concluded from their experiences with the Mark 1 that computers would be used more in scientific roles than in pure mathematics. In 1951 they started development work on Meg, the Mark 1's successor, which would include a floating-point unit.

EDSAC

[edit]
EDSAC

The other contender for being the first recognizably modern digital stored-program computer[121] was the EDSAC,[122] designed and constructed by Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England in 1949. The machine was inspired by John von Neumann's seminal First Draft of a Report on the EDVAC and was one of the first usefully operational electronic digital stored-program computers.[h]

EDSAC ran its first programs on 6 May 1949, when it calculated a table of squares[125] and a list of prime numbers. The EDSAC also served as the basis for the first commercially applied computer, the LEO I, used by the food manufacturing company J. Lyons & Co. Ltd. EDSAC 1 was finally shut down on 11 July 1958, having been superseded by EDSAC 2, which stayed in use until 1965.[126]

The "brain" [computer] may one day come down to our level [of the common people] and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far.

— British newspaper The Star in a June 1949 news article about the EDSAC computer, long before the era of personal computers.[127]

EDVAC

[edit]
EDVAC

ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's construction in August 1944, and design work for the EDVAC commenced at the University of Pennsylvania's Moore School of Electrical Engineering before the ENIAC was fully operational. The design implemented a number of important architectural and logical improvements conceived during the ENIAC's construction, as well as a high-speed serial-access memory.[112] However, Eckert and Mauchly left the project and its construction foundered.

It was finally delivered to the U.S. Army's Ballistics Research Laboratory at the Aberdeen Proving Ground in August 1949, but due to a number of problems, the computer only began operation in 1951, and then only on a limited basis.

Commercial computers

[edit]

The first commercial electronic computer was the Ferranti Mark 1, built by Ferranti and delivered to the University of Manchester in February 1951. It was based on the Manchester Mark 1. The main improvements over the Manchester Mark 1 were in the size of the primary storage (using random-access Williams tubes), secondary storage (using a magnetic drum), a faster multiplier, and additional instructions. The basic cycle time was 1.2 milliseconds, and a multiplication could be completed in about 2.16 milliseconds. The multiplier used almost a quarter of the machine's 4,050 vacuum tubes (valves).[128] A second machine was purchased by the University of Toronto, before the design was revised into the Mark 1 Star. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[129]

In October 1947, the directors of J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. The LEO I computer (Lyons Electronic Office) became operational in April 1951[130] and ran the world's first regular routine office computer job. On 17 November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO – the first business application to go live on a stored-program computer.[i]

In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than US$1 million each ($12.1 million as of 2024).[131] UNIVAC was the first "mass-produced" computer. It used 5,200 vacuum tubes and consumed 125 kW of power. Its primary storage was serial-access mercury delay lines capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words).

In 1952, Compagnie des Machines Bull released the Gamma 3 computer, which became a large success in Europe, eventually selling more than 1,200 units and becoming the first computer produced in more than 1,000 units.[132] The Gamma 3 had innovative features for its time, including a dual-mode, software-switchable BCD and binary ALU, as well as a hardwired floating-point library for scientific computing.[132] In its E.T configuration, the Gamma 3 drum memory could fit about 50,000 instructions for a capacity of 16,384 words (around 100 kB), a large amount for the time.[132]

Front panel of the IBM 650

Compared to the UNIVAC, IBM introduced a smaller, more affordable computer in 1954 that proved very popular.[j][134] The IBM 650 weighed over 900 kg, the attached power supply weighed around 1,350 kg, and both were held in separate cabinets of roughly 1.5 × 0.9 × 1.8 m. The system cost US$500,000[135] ($5.85 million as of 2024) or could be leased for US$3,500 a month ($40,000 as of 2024).[131] Its drum memory was originally 2,000 ten-digit words, later expanded to 4,000 words. Memory limitations such as this were to dominate programming for decades afterward. The program instructions were fetched from the spinning drum as the code ran. Efficient execution using drum memory was provided by a combination of hardware architecture – the instruction format included the address of the next instruction – and software: the Symbolic Optimal Assembly Program, SOAP,[136] assigned instructions to the optimal addresses (to the extent possible by static analysis of the source program). Thus many instructions were located, when needed, in the next row of the drum to pass under the read head, reducing the additional wait for drum rotation.
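The benefit of SOAP-style address assignment can be illustrated with a toy model of rotational latency. The drum parameters and the one-word-per-word-time accounting below are simplifications for illustration, not the 650's actual timing: the idea is only that placing the next instruction where the head will be when the CPU finishes the current one eliminates nearly a full revolution of waiting.

```python
# Simplified model of drum rotational latency (illustrative; not actual
# IBM 650 timing). Addresses advance under the read head one word per
# word-time as the drum spins.
DRUM_WORDS = 50   # words in one drum band

def rotational_wait(fetch_addr, head_pos):
    """Word-times the drum must rotate, from head_pos, before fetch_addr
    passes under the read head."""
    return (fetch_addr - head_pos) % DRUM_WORDS

# After executing the instruction at address a for exec_words word-times,
# the head has advanced to a + 1 + exec_words. Placing the next
# instruction there (as SOAP aimed to) costs no rotational wait; naive
# sequential placement at a + 1 costs almost a full revolution.
a, exec_words = 10, 4
head = (a + 1 + exec_words) % DRUM_WORDS
sequential_wait = rotational_wait((a + 1) % DRUM_WORDS, head)  # 46 word-times
optimized_wait = rotational_wait(head, head)                   # 0 word-times
```

This is why the 650's instruction format carried the address of the next instruction: it freed the assembler to scatter instructions around the drum at rotationally optimal positions instead of storing them sequentially.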

Microprogramming

[edit]

In 1951, British scientist Maurice Wilkes developed the concept of microprogramming from the realisation that the central processing unit of a computer could be controlled by a miniature, highly specialized computer program in high-speed ROM. Microprogramming allows the base instruction set to be defined or extended by built-in programs (now called firmware or microcode).[137] This concept greatly simplified CPU development. He first described this at the University of Manchester Computer Inaugural Conference in 1951, then published in expanded form in IEEE Spectrum in 1955.[citation needed]
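The idea can be sketched with a toy microprogrammed control unit. Everything below (the two instruction names, the micro-operation strings, the ROM layout) is invented for illustration and bears no relation to any real machine's microcode; it shows only the principle Wilkes identified: each machine instruction is interpreted as a sequence of simpler register-transfer steps read from a control store.

```python
# Toy microprogrammed control unit (illustrative only). Each machine
# instruction expands into micro-operations held in a read-only control
# store; changing the ROM redefines or extends the instruction set.
MICROCODE_ROM = {
    "ADD": ["MAR<-operand", "MDR<-mem[MAR]", "ACC<-ACC+MDR"],
    "STO": ["MAR<-operand", "mem[MAR]<-ACC"],
}

def run(program, mem):
    acc = mar = mdr = 0
    for op, operand in program:
        for micro in MICROCODE_ROM[op]:        # control store drives the datapath
            if micro == "MAR<-operand":
                mar = operand
            elif micro == "MDR<-mem[MAR]":
                mdr = mem[mar]
            elif micro == "ACC<-ACC+MDR":
                acc += mdr
            elif micro == "mem[MAR]<-ACC":
                mem[mar] = acc
    return acc, mem

acc, mem = run([("ADD", 0), ("ADD", 1), ("STO", 2)], [3, 4, 0])
# acc == 7 and mem == [3, 4, 7]: the sum of the first two cells stored in the third
```

The simplification Wilkes saw is visible even here: the datapath only needs to implement a handful of micro-operations, while the instruction set lives entirely in the ROM table.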

Microprogramming was widely used in the CPUs and floating-point units of mainframe and other computers; it was implemented for the first time in EDSAC 2,[138] which also used multiple identical "bit slices" to simplify design. Interchangeable, replaceable tube assemblies were used for each bit of the processor.[k]

Magnetic memory

[edit]
Diagram of a 4×4 plane of magnetic-core memory in an X/Y line coincident-current setup. X and Y are drive lines, S is sense, Z is inhibit. Arrows indicate the direction of current for writing.

Magnetic drum memories were developed for the US Navy during World War II, with the work continuing at Engineering Research Associates (ERA) in 1946 and 1947. ERA, by then a part of Univac, included a drum memory in its 1103, announced in February 1953. The first mass-produced computer, the IBM 650, also announced in 1953, had about 8.5 kilobytes of drum memory.

Magnetic-core memory was patented in 1949,[140] and its first usage was demonstrated on the Whirlwind computer in August 1953.[141] Commercialization followed quickly. Magnetic core was used in peripherals of the IBM 702 delivered in July 1955, and later in the 702 itself. The IBM 704 (1955) and the Ferranti Mercury (1957) used magnetic-core memory. It went on to dominate the field into the 1970s, when it was replaced with semiconductor memory. Magnetic core peaked in volume about 1975 and declined in usage and market share thereafter.[142]

As late as 1980, PDP-11/45 machines using magnetic-core main memory and drums for swapping were still in use at many of the original UNIX sites.
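The coincident-current selection shown in the core-plane diagram above can be modelled in a few lines. This is a deliberately simplified sketch: the sense and inhibit wires are omitted, and the threshold arithmetic is idealized. The point is that each X and Y drive line carries only half the current needed to switch a core, so the one core at the energised intersection receives full current and flips, while every other core on either line sees only a harmless half-select.

```python
# Idealized model of coincident-current core selection (sense and inhibit
# lines omitted). Each drive line carries half the switching current, so
# only the core where the energised X and Y lines cross receives enough
# current to change state.
HALF = 0.5   # half-select current, as a fraction of the switching threshold

def write_one(plane, x, y):
    """Drive X line x and Y line y; flip to 1 any core seeing full current."""
    for i in range(len(plane)):
        for j in range(len(plane[i])):
            current = (HALF if i == x else 0) + (HALF if j == y else 0)
            if current >= 1.0:        # full current only at the intersection
                plane[i][j] = 1
    return plane

plane = [[0] * 4 for _ in range(4)]
write_one(plane, 1, 2)
# only plane[1][2] has switched; half-selected cores are undisturbed
```

This addressing trick is what let a plane of n² cores be selected with only 2n drive lines, which made core memory economical at scale.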

Early digital computer characteristics

[edit]
Further information: Analytical Engine § Comparison to other early computers
Defining characteristics of some early digital computers of the 1940s (in the history of computing hardware)

Name | Country | First operational | Numeral system | Computing mechanism | Programming | Turing-complete
Arthur H. Dickinson IBM | USA | Jan 1940 | Decimal | Electronic | Not programmable | No
Joseph Desch NCR | USA | March 1940 | Decimal | Electronic | Not programmable | No
Zuse Z3 | Germany | May 1941 | Binary floating point | Electro-mechanical | Program-controlled by punched 35 mm film stock (but no conditional branch) | In theory (1998)
Atanasoff–Berry Computer | USA | 1942 | Binary | Electronic | Not programmable – single purpose | No
Colossus Mark 1 | UK | Feb 1944 | Binary | Electronic | Program-controlled by patch cables and switches | No
Harvard Mark I – IBM ASCC | USA | May 1944 | Decimal | Electro-mechanical | Program-controlled by 24-channel punched paper tape (but no conditional branch) | Debatable
Colossus Mark 2 | UK | June 1944 | Binary | Electronic | Program-controlled by patch cables and switches | Conjectured[143]
Zuse Z4 | Germany | March 1945 | Binary floating point | Electro-mechanical | Program-controlled by punched 35 mm film stock | In 1950
ENIAC | USA | December 1945 | Decimal | Electronic | Program-controlled by patch cables and switches | Yes
Modified ENIAC | USA | April 1948 | Decimal | Electronic | Read-only stored-programming mechanism using the Function Tables as program ROM | Yes
ARC2 (SEC) | UK | May 1948 | Binary | Electronic | Stored-program in rotating drum memory | Yes
Manchester Baby | UK | June 1948 | Binary | Electronic | Stored-program in Williams cathode-ray tube memory | Yes
Manchester Mark 1 | UK | April 1949 | Binary | Electronic | Stored-program in Williams cathode-ray tube memory and magnetic drum memory | Yes
EDSAC | UK | May 1949 | Binary | Electronic | Stored-program in mercury delay-line memory | Yes
CSIRAC | Australia | Nov 1949 | Binary | Electronic | Stored-program in mercury delay-line memory | Yes

Transistor computers

[edit]
Main article: Transistor computer
Further information: List of transistorized computers
A bipolar junction transistor

The bipolar transistor was invented in 1947. From 1955 onward transistors replaced vacuum tubes in computer designs,[144] giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power than vacuum tubes, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and operating cost. Typically, second-generation computers were composed of large numbers of printed circuit boards such as the IBM Standard Modular System,[145] each carrying one to four logic gates or flip-flops.

At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves (vacuum tubes). Initially the only devices available were germanium point-contact transistors, less reliable than the valves they replaced but which consumed far less power.[146] Their first transistorized computer, and the first in the world, was operational by 1953,[147] and a second version was completed there in April 1955.[147] The 1955 version used 200 transistors, 1,300 solid-state diodes, and had a power consumption of 150 watts. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer.

That distinction goes to the Harwell CADET of 1955,[148] built by the electronics division of the Atomic Energy Research Establishment at Harwell. The design featured a 64-kilobyte magnetic drum memory store with multiple moving heads that had been designed at the National Physical Laboratory, UK. By 1953 this team had transistor circuits operating to read and write on a smaller magnetic drum from the Royal Radar Establishment. The machine used a low clock speed of only 58 kHz to avoid having to use any valves to generate the clock waveforms.[149][148]

CADET used 324 point-contact transistors provided by the UK company Standard Telephones and Cables; 76 junction transistors were used for the first-stage amplifiers for data read from the drum, since point-contact transistors were too noisy. From August 1956, CADET was offering a regular computing service, during which it often executed continuous computing runs of 80 hours or more.[150][151] Problems with the reliability of early batches of point-contact and alloyed junction transistors meant that the machine's mean time between failures was about 90 minutes, but this improved once the more reliable bipolar junction transistors became available.[152]

The Manchester University Transistor Computer's design was adopted by the local engineering firm of Metropolitan-Vickers in their Metrovick 950, the first commercial transistor computer anywhere.[153] Six Metrovick 950s were built, the first completed in 1956. They were successfully deployed within various departments of the company and were in use for about five years.[147] A second-generation computer, the IBM 1401, captured about one third of the world market. IBM installed more than ten thousand 1401s between 1960 and 1964.

Transistor peripherals

[edit]

Transistorized electronics improved not only the CPU (central processing unit) but also the peripheral devices. Second-generation disk data storage units were able to store tens of millions of letters and digits. Next to the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk pack could easily be exchanged with another pack in a few seconds. Even though removable disks had a smaller capacity than fixed disks, their interchangeability guaranteed a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk.

Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions. One data bus would bear data between the main CPU and core memory at the CPU's fetch-execute cycle rate, and other data buses would typically serve the peripheral devices. On the PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second) because most operations took at least two memory cycles: one for the instruction, one for the operand data fetch.

During the second generation, remote terminal units (often in the form of teleprinters like a Friden Flexowriter) saw greatly increased use.[l] Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers of separation between remote terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected network of networks – the Internet.[m]

Transistor supercomputers

[edit]
The University of Manchester Atlas in January 1963

The early 1960s saw the advent of supercomputing. The Atlas was a joint development between the University of Manchester, Ferranti, and Plessey, and was first installed at Manchester University and officially commissioned in 1962 as one of the world's first supercomputers – considered to be the most powerful computer in the world at that time.[156] It was said that whenever Atlas went offline half of the United Kingdom's computer capacity was lost.[157] It was a second-generation machine, using discrete germanium transistors. Atlas also pioneered the Atlas Supervisor, "considered by many to be the first recognisable modern operating system".[158]

In the US, a series of computers at Control Data Corporation (CDC) were designed by Seymour Cray to use innovative designs and parallelism to achieve superior computational peak performance.[159] The CDC 6600, released in 1964, is generally considered the first supercomputer.[160][161] The CDC 6600 outperformed its predecessor, the IBM 7030 Stretch, by about a factor of 3. With performance of about 1 megaFLOPS, the CDC 6600 was the world's fastest computer from 1964 to 1969, when it relinquished that status to its successor, the CDC 7600.

Integrated circuit computers

[edit]
Main article: History of computing hardware (1960s–present) § Third generation

The "third-generation" of digital electronic computers usedintegrated circuit (IC) chips as the basis of their logic.

The idea of an integrated circuit was conceived by Geoffrey W. A. Dummer, a radar scientist working for the Royal Radar Establishment of the Ministry of Defence.

The first working integrated circuits were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[162] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, and successfully demonstrated the first working example on 12 September 1958.[163] Kilby's invention was a hybrid integrated circuit (hybrid IC).[164] It had external wire connections, which made it difficult to mass-produce.[165]

Noyce came up with his own idea of an integrated circuit half a year after Kilby.[166] Noyce's invention was a monolithic integrated circuit (IC) chip.[167][165] His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. The basis for Noyce's monolithic IC was Fairchild's planar process, which allowed integrated circuits to be laid out using the same principles as those of printed circuits. The planar process was developed by Noyce's colleague Jean Hoerni in early 1959, based on Mohamed M. Atalla's work on semiconductor surface passivation by silicon dioxide at Bell Labs in the late 1950s.[168][169][170]

Third-generation (integrated circuit) computers first appeared in the early 1960s in computers developed for government purposes, and then in commercial computers beginning in the mid-1960s. The first silicon IC computer was the Apollo Guidance Computer, or AGC.[171] Although not the most powerful computer of its time, the extreme constraints on size, mass, and power of the Apollo spacecraft required the AGC to be much smaller and denser than any prior computer, weighing in at only 70 pounds (32 kg). Each lunar landing mission carried two AGCs, one each in the command and lunar ascent modules.

Semiconductor memory

[edit]
Main article: Semiconductor memory

The MOSFET (metal–oxide–semiconductor field-effect transistor, or MOS transistor) was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959.[172] In addition to data processing, the MOSFET enabled the practical use of MOS transistors as memory cell storage elements, a function previously served by magnetic cores. Semiconductor memory, also known as MOS memory, was cheaper and consumed less power than magnetic-core memory.[173] MOS random-access memory (RAM), in the form of static RAM (SRAM), was developed by John Schmidt at Fairchild Semiconductor in 1964.[173][174] In 1966, Robert Dennard at the IBM Thomas J. Watson Research Center developed MOS dynamic RAM (DRAM).[175] In 1967, Dawon Kahng and Simon Sze at Bell Labs developed the floating-gate MOSFET, the basis for MOS non-volatile memory such as EPROM, EEPROM and flash memory.[176][177]

Microprocessor computers

[edit]
Main article: History of computing hardware (1960s–present) § Fourth generation

The "fourth-generation" of digital electronic computers usedmicroprocessors as the basis of their logic. The microprocessor has origins in theMOS integrated circuit (MOS IC) chip.[178] Due to rapidMOSFET scaling, MOS IC chips rapidly increased in complexity at a rate predicted byMoore's law, leading tolarge-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips tocomputing was the basis for the first microprocessors, as engineers began recognizing that a completecomputer processor could be contained on a single MOS LSI chip.[178]

The subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor". The earliest multi-chip microprocessors were the Four-Phase Systems AL-1 in 1969 and the Garrett AiResearch MP944 in 1970, developed with multiple MOS LSI chips.[178] The first single-chip microprocessor was the Intel 4004,[179] developed on a single PMOS LSI chip.[178] It was designed and realized by Ted Hoff, Federico Faggin, Masatoshi Shima and Stanley Mazor at Intel, and released in 1971.[n]

The die from an Intel 8742, an 8-bit microcontroller that includes a CPU running at 12 MHz, RAM, EPROM, and I/O

While the earliest microprocessor ICs literally contained only the processor, i.e. the central processing unit, of a computer, their progressive development naturally led to chips containing most or all of the internal electronic parts of a computer. The integrated circuit in the image on the right, for example, an Intel 8742, is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and I/O in the same chip.

During the 1960s, there was considerable overlap between second- and third-generation technologies.[o] IBM implemented its IBM Solid Logic Technology modules in hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued the manufacture of second-generation machines such as the UNIVAC 494. The Burroughs large systems such as the B5000 were stack machines, which allowed for simpler programming. These pushdown automata were also implemented in minicomputers and microprocessors later, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business and universities.[181] It became possible to simulate analog circuits with the simulation program with integrated circuit emphasis, or SPICE (1971), on minicomputers, one of the programs for electronic design automation (EDA). The microprocessor led to the development of microcomputers, small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond.

Altair 8800

While which specific product is considered the first microcomputer system is a matter of debate, one of the earliest is R2E's Micral N (François Gernelle, André Truong), launched in "early 1973" using the Intel 8008.[182] The first commercially available microcomputer kit was the Intel 8080-based Altair 8800, which was announced in the January 1975 cover article of Popular Electronics. However, the Altair 8800 was an extremely limited system in its initial stages, having only 256 bytes of DRAM in its initial package and no input-output except its toggle switches and LED register display. Despite this, it was initially surprisingly popular, with several hundred sales in the first year, and demand rapidly outstripped supply. Several early third-party vendors such as Cromemco and Processor Technology soon began supplying additional S-100 bus hardware for the Altair 8800.

In April 1975, at the Hannover Fair, Olivetti presented the P6060, the world's first complete, pre-assembled personal computer system. The central processing unit consisted of two cards, code-named PUCE1 and PUCE2, and unlike most other personal computers it was built with TTL components rather than a microprocessor. It had one or two 8-inch floppy disk drives, a 32-character plasma display, an 80-column graphical thermal printer, 48 Kbytes of RAM, and the BASIC language. It weighed 40 kg (88 lb). As a complete system, this was a significant step beyond the Altair, though it never achieved the same success. It competed with a similar product by IBM that had an external floppy disk drive.

From 1975 to 1977, most microcomputers, such as the MOS Technology KIM-1, the Altair 8800, and some versions of the Apple I, were sold as kits for do-it-yourselfers. Pre-assembled systems did not gain much ground until 1977, with the introduction of the Apple II, the Tandy TRS-80, the first SWTPC computers, and the Commodore PET. Computing has since evolved around microcomputer architectures, which absorbed features from their larger brethren and are now dominant in most market segments.

A NeXT Computer and its object-oriented development tools and libraries were used by Tim Berners-Lee and Robert Cailliau at CERN to develop the world's first web server software, CERN httpd, and also to write the first web browser, WorldWideWeb.

Systems as complicated as computers require very high reliability. ENIAC ran continuously from 1947 to 1955, for eight years, before being shut down. Although a vacuum tube might fail, it would be replaced without bringing down the system; by the simple strategy of never shutting ENIAC down, failures were dramatically reduced. The vacuum-tube SAGE air-defense computers became remarkably reliable: installed in pairs with one off-line, tubes likely to fail did so when the computer was intentionally run at reduced power to find them. Hot-pluggable hard disks, like the hot-pluggable vacuum tubes of yesteryear, continue the tradition of repair during continuous operation. Semiconductor memories routinely have no errors when they operate, although operating systems like Unix have employed memory tests on start-up to detect failing hardware. Today, the requirement of reliable performance is made even more stringent when server farms are the delivery platform.[183] Google has managed this by using fault-tolerant software to recover from hardware failures, and is even working on the concept of replacing entire server farms on-the-fly, during a service event.[184][185]

In the 21st century, multi-core CPUs became commercially available.[186] Content-addressable memory (CAM)[187] has become inexpensive enough to be used in networking, and is frequently used for on-chip cache memory in modern microprocessors, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific. Semiconductor memory cell arrays are very regular structures, and manufacturers prove their processes on them; this allows price reductions on memory products. During the 1980s, CMOS logic gates developed into devices that could be made as fast as other circuit types; computer power consumption could therefore be decreased dramatically. Unlike the continuous current draw of a gate based on other logic types, a CMOS gate draws significant current, apart from leakage, only during the transition between logic states.[188]
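The software counterpart of a CAM mentioned above, the associative array, can be sketched in a few lines. The fragment below (a minimal illustration only; the tag values and table are invented for the example) uses a Python dict to perform the content-keyed lookup that a hardware CAM performs in a single cycle, e.g. when matching a cache tag:

```python
# A hardware CAM returns the location whose stored contents match a search
# key in one cycle. In software, an associative array (a Python dict here)
# provides the same content-keyed lookup, in amortized O(1) time rather
# than a single clock cycle.
cache_tags = {
    0x1A2B: 3,  # hypothetical tag -> cache line index
    0x3C4D: 7,
    0x5E6F: 1,
}

def lookup(tag):
    """Return the matching cache line index, or None on a miss."""
    return cache_tags.get(tag)

print(lookup(0x3C4D))  # a hit
print(lookup(0x9999))  # a miss
```

As the article notes, such structures are provided per programming language (dict in Python, std::unordered_map in C++, HashMap in Java) rather than by a shared hardware mechanism.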

CMOS circuits have allowed computing to become a commercial product which is now ubiquitous, embedded in many forms, from greeting cards and telephones to satellites. The thermal design power dissipated during operation has become as important a constraint as computing speed. In 2006 servers consumed 1.5% of the total U.S. electricity consumption.[189] The energy consumption of computer data centers was expected to double to 3% of world consumption by 2011. The SoC (system on a chip) has compressed even more of the integrated circuitry into a single chip; SoCs are enabling phones and PCs to converge into single hand-held wireless mobile devices.[190]

Quantum computing is an emerging technology in the field of computing. MIT Technology Review reported on 10 November 2017 that IBM had created a 50-qubit computer; at the time its quantum state lasted 50 microseconds.[191] Google researchers have been able to extend the 50-microsecond time limit, as reported on 14 July 2021 in Nature;[192] stability was extended 100-fold by spreading a single logical qubit over chains of data qubits for quantum error correction.[192] Physical Review X reported a technique for 'single-gate sensing as a viable readout method for spin qubits' (a singlet-triplet spin state in silicon) on 26 November 2018.[193] A Google team has succeeded in operating their RF pulse modulator chip at 3 kelvins, simplifying the cryogenics of their 72-qubit computer, which is set up to operate at 0.3 K; however, the readout circuitry and another driver remain to be brought into the cryogenics.[194][p] See also: Quantum supremacy.[196][197] Silicon qubit systems have demonstrated entanglement at non-local distances.[198]

Computing hardware and its software have even become a metaphor for the operation of the universe.[199]

Epilogue


An indication of the rapidity of development of this field can be inferred from the history of the seminal 1947 article by Burks, Goldstine and von Neumann.[200] By the time anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's First Draft of a Report on the EDVAC and immediately started implementing their own systems. To this day, the rapid pace of development has continued, worldwide.[q][r]


Notes

  1. ^The Ishango bone is a bone tool, dated to the Upper Paleolithic era, about 18,000 to 20,000 BC. It is a dark brown length of bone, the fibula of a baboon. It has a series of tally marks carved in three columns running the length of the tool. It was found in 1960 in the Belgian Congo.[1]
  2. ^According to Schmandt-Besserat 1981, these clay containers held tokens, the total of which were the count of objects being transferred. The containers thus served as something of a bill of lading or an accounts book. In order to avoid breaking open the containers to verify the count, clay impressions of the tokens were first placed on the outside of the containers; the shapes of the impressions were then abstracted into stylized marks; finally, the abstract marks were systematically used as numerals, and these numerals were formalized as numbers. Eventually (Schmandt-Besserat estimates it took 5,000 years[5]) the marks on the outside of the containers were all that were needed to convey the count, and the clay containers evolved into clay tablets with marks for the count.
  3. ^Robson has recommended at least one supplement to Schmandt-Besserat (1981), e.g., a review: Englund, R. (1993). "The origins of script". Science. 260 (5114): 1670–1671. doi:10.1126/science.260.5114.1670. PMID 17810210.[7]
  4. ^A Spanish implementation of Napier's bones (1617) is documented in Montaner & Simon 1887, pp. 19–20.
  5. ^All nine machines are described in Vidal & Vogt 2011.
  6. ^Binary-coded decimal (BCD) is a numeric representation, or character encoding, which is still widely used.
  7. ^The existence of Colossus was kept secret by the UK Government for 30 years, so it was not known to American computer scientists such as Gordon Bell and Allen Newell, and does not appear in Bell & Newell (1971) Computer Structures, a standard reference work in the 1970s.
  8. ^The Manchester Baby predated EDSAC as a stored-program computer, but was built as a test bed for the Williams tube and not as a machine for practical use.[123] However, the Manchester Mark 1 of 1949 (not to be confused with the 1948 prototype, the Baby) was available for university research in April 1949, despite being still under development.[124]
  9. ^Martin 2008, p. 24 notes that David Caminer (1915–2008) served as the first corporate electronic systems analyst, for this first business computer system. LEO would calculate an employee's pay, handle billing, and other office automation tasks.
  10. ^For example, Kara Platoni's article on Donald Knuth stated that "there was something special about the IBM 650".[133]
  11. ^The microcode was implemented as extracode on Atlas.[139]
  12. ^Allen Newell used remote terminals to communicate cross-country with the RAND computers.[154]
  13. ^Bob Taylor conceived of a generalized protocol to link together multiple networks to be viewed as a single session regardless of the specific network: "Wait a minute. Why not just have one terminal, and it connects to anything you want it to be connected to? And, hence, the Arpanet was born."[155]
  14. ^The Intel 4004 (1971) die was 12 mm2, composed of 2300 transistors; by comparison, the Pentium Pro was 306 mm2, composed of 5.5 million transistors.[180]
  15. ^In the defense field, considerable work was done in the computerized implementation of equations such as Kalman 1960, pp. 35–45.
  16. ^IBM's 127-qubit computer cannot be simulated on traditional computers.[195]
  17. ^DBLP summarizes the Annals of the History of Computing, year by year, back to 1979.[201]
  18. ^The fastest supercomputer of the top 500 is now Frontier (of Oak Ridge National Laboratory) at 1.102 exaflops,[202] which is 2.66 times faster than Fugaku, now number two of the top 500.[203]
References

  1. ^Schultz, Phill (7 September 1999)."A very brief history of pure mathematics: The Ishango Bone". University of Western Australia School of Mathematics. Archived fromthe original on 2008-07-21.
  2. ^Selin, Helaine (12 March 2008).Encyclopaedia of the History of Science, Technology, and Medicine in Non-Western Cultures. Springer Science & Business Media. p. 1356.Bibcode:2008ehst.book.....S.ISBN 978-1-4020-4559-2. Retrieved2020-05-27.
  3. ^Pegg, Ed Jr."Lebombo Bone".MathWorld.
  4. ^Darling, David (2004).The Universal Book of Mathematics From Abracadabra to Zeno's Paradoxes. John Wiley & Sons.ISBN 978-0-471-27047-8.
  5. ^Schmandt-Besserat, Denise."The Evolution of Writing"(PDF).Archived from the original on 2012-01-30.
  6. ^Robson, Eleanor (2008).Mathematics in Ancient Iraq. Princeton University Press.ISBN 978-0-691-09182-2. p. 5:calculi were in use in Iraq for primitive accounting systems as early as 3200–3000 BCE, with commodity-specific counting representation systems. Balanced accounting was in use by 3000–2350 BCE, and asexagesimal number system was in use 2350–2000 BCE.
  7. ^Robson, Eleanor."Bibliography of Mesopotamian Mathematics". Archived fromthe original on 2016-06-16. Retrieved2016-07-06.
  8. ^Lazos 1994.
  9. ^Hoepfner, Wolfram (1970)."Ein Kombinationsschloss aus dem Kerameikos"(PDF).Archäologischer Anzeiger (in German).85 (2):210–213.doi:10.11588/propylaeumdok.00005321.
  10. ^Sharkey, Noel (4 July 2007),A programmable robot from 60 AD, vol. 2611, New Scientist, archived fromthe original on 2017-12-13
  11. ^Koetsier, Teun (2001), "On the prehistory of programmable machines: musical automata, looms, calculators",Mechanism and Machine Theory,36 (5), Elsevier:589–603,doi:10.1016/S0094-114X(01)00005-2.
  12. ^"Episode 11: Ancient Robots",Ancient Discoveries,History Channel, archived fromthe original on 2014-03-01, retrieved2008-09-06
  13. ^Turner, Howard R. (1997).Science in Medieval Islam: An Illustrated Introduction. Austin: University of Texas press. p. 184.ISBN 978-0-292-78149-8.
  14. ^Hill, Donald Routledge (May 1991). "Mechanical Engineering in the Medieval Near East".Scientific American. pp. 64–69. (cf.Hill, Donald Routledge."IX. Mechanical Engineering".History of Sciences in the Islamic World. Archived fromthe original on 2007-12-25.)
  15. ^Kells, Kern & Bland 1943, p. 92.
  16. ^Kells, Kern & Bland 1943, p. 82.
  17. ^Meli, Domenico Bertolini (1992). "Guidobaldo Dal Monte and the Archimedean Revival".Nuncius.7 (1):3–34.doi:10.1163/182539192x00019.
  18. ^Williams 1997, p. 128 "...the single-tooth gear, like that used by Schickard, would not do for a general carry mechanism. The single-tooth gear works fine if the carry is only going to be propagated a few places but, if the carry has to be propagated several places along the accumulator, the force needed to operate the machine would be of such magnitude that it would do damage to the delicate gear works."
  19. ^Pascal, Blaise (1645).La Machine d'arithmétique (in French).
  20. ^Marguin 1994, p. 48.
  21. ^Ocagne 1893, p. 245.
  22. ^Mourlevat 1988, p. 12.
  23. ^Falk, Jim."Schickard versus Pascal - an empty debate?". Archived fromthe original on 2014-04-08. Retrieved2014-05-15.
  24. ^Smith 1929, pp. 180–181.
  25. ^Leibniz 1703.
  26. ^"Discovering the Arithmometer".Cornell University. 2005. Archived fromthe original on 2006-09-13. Retrieved2023-08-26.
  27. ^"Herman Hollerith".Columbia University Computing History. Columbia University ACIS.Archived from the original on 2011-05-13. Retrieved2010-01-30.
  28. ^Truesdell, Leon E. (1965).The Development of Punch Card Tabulation in the Bureau of the Census 1890–1940. US GPO. pp. 47–55.
  29. ^Report of the Commissioner of Labor In Charge of The Eleventh Census to the Secretary of the Interior for the Fiscal Year Ending June 30, 1895. Washington, DC:United States Government Publishing Office. 29 July 1895. p. 9.hdl:2027/osu.32435067619882.OCLC 867910652. "You may confidently look for the rapid reduction of the force of this office after the 1st of October, and the entire cessation of clerical work during the present calendar year. ... The condition of the work of the Census Division and the condition of the final reports show clearly that the work of the Eleventh Census will be completed at least two years earlier than was the work of the Tenth Census." — Carroll D. Wright, Commissioner of Labor in Charge
  30. ^"1920".IBM Archives. 23 January 2003.Archived from the original on 2020-10-29. Retrieved2020-12-01.
  31. ^"Chronological History of IBM: 1930s".IBM Archives. 23 January 2003.Archived from the original on 2020-12-03. Retrieved2020-12-01.
  32. ^Leslie Comrie(1928) On the Construction of Tables by Interpolation
  33. ^Eckert 1935.
  34. ^Erskine, Ralph;Smith, Michael, eds. (2011),The Bletchley Park Codebreakers, Biteback Publishing Ltd, p. 134,ISBN 978-184954078-0 Updated and extended version ofAction This Day: From Breaking of the Enigma Code to the Birth of the Modern Computer Bantam Press 2001
  35. ^Frank da Cruz."A Chronology of Computing at Columbia University".Columbia University Computing History. Columbia University.Archived from the original on 2023-08-22. Retrieved2023-08-31.
  36. ^Eckert 1940, pp. 101–114.
  37. ^Light, Jennifer S. (July 1999). "When Computers Were Women".Technology and Culture.40 (3):455–483.doi:10.1353/tech.1999.0128.S2CID 108407884.
  38. ^Hunt 1998.
  39. ^"Friden Model STW-10 Electro-Mechanical Calculator".Archived from the original on 2011-05-14. Retrieved2015-08-11.
  40. ^"Simple and Silent".Office Magazine. December 1961. p. 1244.
  41. ^"'Anita' der erste tragbare elektonische Rechenautomat" ['Anita' the first portable electronic computer].Buromaschinen Mechaniker. November 1961. p. 207.
  42. ^Babbage, Charles (12 October 2011).Passages from the Life of a Philosopher. Cambridge University Press.doi:10.1017/cbo9781139103671.ISBN 978-1-108-03788-4.
  43. ^Babbage, Charles (4 March 2010).On the Economy of Machinery and Manufactures. Cambridge University Press.doi:10.1017/cbo9780511696374.ISBN 978-1-108-00910-2.
  44. ^Hutton, D.M. (1 August 2002)."The Difference Engine: Charles Babbage and the Quest to Build the First Computer".Kybernetes.31 (6).doi:10.1108/k.2002.06731fae.009.ISSN 0368-492X.
  45. ^Tropp, Henry S. (December 1975)."The Origins of Digital Computers: Selected Papers. Brian Randell".Isis.66 (4):572–573.doi:10.1086/351520.ISSN 0021-1753.
  46. ^W., J. W.; Hyman, Anthony (April 1986)."Charles Babbage, Pioneer of the Computer".Mathematics of Computation.46 (174): 759.doi:10.2307/2008013.ISSN 0025-5718.JSTOR 2008013.
  47. ^Campbell-Kelly, Martin; Aspray, William; Ensmenger, Nathan; Yost, Jeffrey R. (20 April 2018).Computer.doi:10.4324/9780429495373.ISBN 978-0-429-49537-3.
  48. ^Turing, Alan (9 September 2004),"Computing Machinery and Intelligence (1950)",The Essential Turing, Oxford University PressOxford, pp. 433–464,doi:10.1093/oso/9780198250791.003.0017,ISBN 978-0-19-825079-1, retrieved2024-10-30{{citation}}: CS1 maint: work parameter with ISBN (link)
  49. ^Davis, Martin (28 February 2018).the Universal Computer.doi:10.1201/9781315144726.ISBN 978-1-315-14472-6.
  50. ^d'Ucel, Jeanne; Dib, Mohammed (1958)."Le métier à tisser".Books Abroad.32 (3): 278.doi:10.2307/40098349.ISSN 0006-7431.JSTOR 40098349.
  51. ^Heide, Lars (2009).Punched-Card Systems and the Early Information Explosion, 1880–1945. Johns Hopkins University Press.doi:10.1353/book.3454.ISBN 978-0-8018-9143-4.
  52. ^Bromley, A.G. (1998)."Charles Babbage's Analytical Engine, 1838".IEEE Annals of the History of Computing.20 (4):29–45.Bibcode:1998IAHC...20d..29B.doi:10.1109/85.728228.ISSN 1058-6180.
  53. ^Toole, Betty Alexandra (March 1991)."Ada, an analyst and a metaphysician".ACM SIGAda Ada Letters.XI (2):60–71.doi:10.1145/122028.122031.ISSN 1094-3641.
  54. ^Howard, Emily; De Roure, David (2015)."Turning numbers into notes".Ada Lovelace Symposium 2015- Celebrating 200 Years of a Computer Visionary on - Ada Lovelace Symposium '15. New York, New York, USA: ACM Press. p. 13.doi:10.1145/2867731.2867746.ISBN 978-1-4503-4150-9.
  55. ^Haugtvedt, Erica; Abata, Duane (2021)."Ada Lovelace: First Computer Programmer and Hacker?".2021 ASEE Virtual Annual Conference Content Access Proceedings. ASEE Conferences.doi:10.18260/1-2--36646.
  56. ^Blodgett, John H. (1968).Herman Hollerith, data processing pioneer (Thesis). Drexel University Libraries.doi:10.17918/00004750.
  57. ^Torres y Quevedo, Leonardo (1982),"Essays on Automatics",The Origins of Digital Computers, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 89–107,doi:10.1007/978-3-642-61812-3_6,ISBN 978-3-642-61814-7, retrieved2024-10-30{{citation}}: CS1 maint: work parameter with ISBN (link)
  58. ^"6 Vannevar Bush, from "As We May Think" (1945)",Information, Columbia University Press, 2021,doi:10.7312/hayo18620-032,ISBN 978-0-231-54654-6{{citation}}: CS1 maint: work parameter with ISBN (link)
  59. ^Haigh, Thomas; Ceruzzi, Paul E. (14 September 2021).A New History of Modern Computing. The MIT Press.doi:10.7551/mitpress/11436.001.0001.ISBN 978-0-262-36648-9.
  60. ^Chua 1971, pp. 507–519.
  61. ^abcd"The Modern History of Computing".Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. 2017.Archived from the original on 2010-07-12. Retrieved2014-01-07.
  62. ^Girvan, Ray (May–June 2003)."The revealed grace of the mechanism: computing after Babbage".Scientific Computing World. Archived fromthe original on 2012-11-03.
  63. ^Torres, Leonardo (10 October 1895)."Memória sobre las Máquinas Algébricas"(PDF).Revista de Obras Públicas (in Spanish) (28):217–222.
  64. ^Leonardo Torres.Memoria sobre las máquinas algébricas: con un informe de la Real academia de ciencias exactas, fisicas y naturales, Misericordia, 1895.
  65. ^Thomas, Federico (1 August 2008)."A short account on Leonardo Torres' endless spindle".Mechanism and Machine Theory.43 (8).IFToMM:1055–1063.doi:10.1016/j.mechmachtheory.2007.07.003.hdl:10261/30460.ISSN 0094-114X.
  66. ^"Norden M9 Bombsight". National Museum of the USAF. Archived fromthe original on 2007-08-29. Retrieved2008-05-17.
  67. ^Coriolis 1836, pp. 5–9.
  68. ^Tomayko, James E. (1985). "Helmut Hoelzer's Fully Electronic Analog Computer".IEEE Annals of the History of Computing.7 (3):227–240.Bibcode:1985IAHC....7c.227T.doi:10.1109/MAHC.1985.10025.S2CID 15986944.
  69. ^Neufeld, Michael J. (10 September 2013).The Rocket and the Reich: Peenemunde and the Coming of the Ballistic Missile Era. Smithsonian Institution. p. 138.ISBN 9781588344663.Archived from the original on 2023-02-02. Retrieved2020-10-18.
  70. ^Ulmann, Bernd (22 July 2013).Analog Computing. Walter de Gruyter. p. 38.ISBN 9783486755183.Archived from the original on 2023-02-02. Retrieved2021-12-27.
  71. ^abTuring 1937,1938
  72. ^Copeland 2004, p. 22: "von Neumann ... firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing—insofar as not anticipated by Babbage, Lovelace and others. Letter byStanley Frankel toBrian Randell, 1972."
  73. ^abZuse, Horst."Part 4: Konrad Zuse's Z1 and Z3 Computers".The Life and Work of Konrad Zuse. EPE Online. Archived fromthe original on 2008-06-01. Retrieved2008-06-17.
  74. ^Smith 2007, p. 60.
  75. ^Welchman 1984, p. 77.
  76. ^"A Computer Pioneer Rediscovered, 50 Years On".The New York Times. 20 April 1994.Archived from the original on 2016-11-04. Retrieved2017-02-16.
  77. ^Zuse 1993, p. 55.
  78. ^"Zuse".Crash! The Story of IT. Archived fromthe original on 2008-03-18.
  79. ^Rojas, Raúl (1998). "How to Make Zuse's Z3 a Universal Computer".IEEE Annals of the History of Computing.20 (3): 51.Bibcode:1998IAHC...20c..51R.CiteSeerX 10.1.1.37.665.doi:10.1109/85.707574.
  80. ^Williams, F. C.; Kilburn, T. (25 September 1948)."Electronic Digital Computers".Nature.162 (4117): 487.Bibcode:1948Natur.162..487W.doi:10.1038/162487a0.S2CID 4110351.
  81. ^Da Cruz 2008.
  82. ^"Computer Pioneers – George Stibitz".history.computer.org.Archived from the original on 2018-10-05. Retrieved2018-11-08.
  83. ^Ritchie, David (1986).The Computer Pioneers. New York: Simon and Schuster. p. 35.ISBN 067152397X.
  84. ^Dunn, J. Michael; Hardegree, Gary M. (2001).Algebraic methods in philosophical logic. Oxford University Press US. p. 2.ISBN 978-0-19-853192-0.Archived from the original on 2023-02-02. Retrieved2016-06-04.
  85. ^Arthur Gottlob Frege.Begriffsschrift: eine der arithmetischen nachgebildete Formelsprache des reinen Denkens.
  86. ^Shannon 1938.
  87. ^Shannon 1940.
  88. ^Shannon 1938, pp. 494–495.[verification needed]
  89. ^Guarnieri, M. (2012). "The Age of Vacuum Tubes: Merging with Digital Computing [Historical]".IEEE Industrial Electronics Magazine.6 (3):52–55.doi:10.1109/MIE.2012.2207830.S2CID 41800914.
  90. ^Pugh, Emerson W. (1996).Building IBM: Shaping an Industry and its Technology.The MIT Press.
  91. ^"Patents and Innovation".IBM 100. 7 March 2012.Archived from the original on 2020-12-02. Retrieved2020-12-01.
  92. ^15 January 1941 notice in theDes Moines Register
  93. ^Burks, Alice R.; Burks, Arthur W. (1988).The First Electronic Computer: the Atanasoff story. Ann Arbor: University of Michigan Press.ISBN 0-472-10090-4.
  94. ^Copeland 2006, p. 107.
  95. ^Welchman 1984, pp. 138–145, 295–309.
  96. ^Copeland 2006, p. 182.
  97. ^Randell 1980, p. 9.
  98. ^Budiansky 2000, p. 314.
  99. ^"Bletchley's code-cracking Colossus".BBC News. 2 February 2010.Archived from the original on 2020-03-08. Retrieved2012-10-19.
  100. ^Fensom, Jim (8 November 2010),"Harry Fensom obituary",The Guardian,archived from the original on 2013-09-17, retrieved2012-10-17
  101. ^Sale, Tony."Colossus - The Rebuild Story". The National Museum of Computing. Archived fromthe original on 2015-04-18.
  102. ^Copeland 2006, p. 75.
  103. ^Copeland 2006, p. 2.
  104. ^Small, Albert W. (December 1944),The Special Fish Report, College Campus Washington: The American National Archive (NARA),archived from the original on 2011-05-15, retrieved2019-01-11
  105. ^Randell, Brian; Fensom, Harry; Milne, Frank A. (15 March 1995),"Obituary: Allen Coombs",The Independent, archived fromthe original on 2012-02-03, retrieved2012-10-18
  106. ^Flowers, T. H. (1983),"The Design of Colossus",Annals of the History of Computing,5 (3):239–252,doi:10.1109/MAHC.1983.10079,S2CID 39816473,archived from the original on 2006-03-26, retrieved2019-03-03
  107. ^Loerner, Brendan I. (25 November 2014)."How the World's First Computer Was Rescued From the Scrap Heap".Wired.Archived from the original on 2017-05-02. Retrieved2017-03-07.
  108. ^Evans 2018, p. 39.
  109. ^"Generations of Computer". Archived fromthe original on 2015-07-02. Retrieved2015-08-11.
  110. ^Copeland 2004, pp. 21–22.
  111. ^Copeland 2006, p. 104.
  112. ^abWilkes, M. V. (1956).Automatic Digital Computers. New York: John Wiley & Sons. pp. 305 pages. QA76.W5 1956.
  113. ^Alan Turing (1945).Proposed Electronic Calculator(PDF) (Report). Retrieved2023-08-24.
  114. ^Enticknap, Nicholas (Summer 1998),"Computing's Golden Jubilee",Resurrection (20), The Computer Conservation Society,ISSN 0958-7403, archived fromthe original on 2012-01-09, retrieved2008-04-19
  115. ^"Early computers at Manchester University",Resurrection,1 (4), The Computer Conservation Society, Summer 1992,ISSN 0958-7403, archived fromthe original on 2017-08-28, retrieved2010-07-07
  116. ^"Why Williams-Kilburn Tube is a Better Name for the Williams Tube".Computer 50. Archived fromthe original on 2013-06-06.
  117. ^Kilburn, Tom (1990),"From Cathode Ray Tube to Ferranti Mark I",Resurrection,1 (2), The Computer Conservation Society,ISSN 0958-7403,archived from the original on 2020-06-27, retrieved2012-03-15
  118. ^"Early Electronic Computers (1946–51)",Computer 50, University of Manchester, archived fromthe original on 2009-01-05, retrieved2008-11-16
  119. ^Napper, R. B. E.,"Introduction to the Mark 1",Computer 50, The University of Manchester, archived fromthe original on 2008-10-26, retrieved2008-11-04
  120. ^Lavington 1998, p. 20.
  121. ^Ward, Mark (13 January 2011)."Pioneering Edsac computer to be built at Bletchley Park".BBC News.Archived from the original on 2018-06-20. Retrieved2018-06-21.
  122. ^Wilkes, W. V.; Renwick, W. (1950)."The EDSAC (Electronic delay storage automatic calculator)".Math. Comp.4 (30):61–65.doi:10.1090/s0025-5718-1950-0037589-7.
  123. ^"A brief informal history of the Computer Laboratory".EDSAC 99. University of Cambridge Computer Laboratory.Archived from the original on 2013-05-06. Retrieved2020-12-01.
  124. ^"The Manchester Mark 1".Computer 50. Archived fromthe original on 2014-02-09. Retrieved2014-01-05.
  125. ^"Pioneer computer to be rebuilt".Cam.62: 5. 2011. To be precise, EDSAC's first program printed a list of thesquares of theintegers from 0 to 99 inclusive.
  126. ^EDSAC 99: 15–16 April 1999(PDF), University of Cambridge Computer Laboratory, 6 May 1999, pp. 68–69,archived(PDF) from the original on 2020-09-26, retrieved2013-06-29
  127. ^Campbell-Kelly, Martin (July 2001)."Tutorial Guide to the EDSAC Simulator"(PDF). Department of Computer Science, University of Warwick. Archived fromthe original(PDF) on 2015-12-22. Retrieved2016-11-18.
     • "Tutorial Guide to the EDSAC Simulator"(PDF). The EDSAC Replica Project, The National Museum of Computing. March 2018.Archived(PDF) from the original on 2015-12-22. Retrieved2020-12-02.
  128. ^Lavington 1998, p. 25.
  129. ^Our Computer Heritage Pilot Study: Deliveries of Ferranti Mark I and Mark I Star computers., Computer Conservation Society, archived fromthe original on 2016-12-11, retrieved2010-01-09
  130. ^Lavington, Simon."A brief history of British computers: the first 25 years (1948–1973)".British Computer Society. Archived fromthe original on 2010-07-05. Retrieved2010-01-10.
  131. ^ab1634–1699:McCusker, J. J. (1997).How Much Is That in Real Money? A Historical Price Index for Use as a Deflator of Money Values in the Economy of the United States: Addenda et Corrigenda(PDF).American Antiquarian Society. 1700–1799:McCusker, J. J. (1992).How Much Is That in Real Money? A Historical Price Index for Use as a Deflator of Money Values in the Economy of the United States(PDF).American Antiquarian Society. 1800–present:Federal Reserve Bank of Minneapolis."Consumer Price Index (estimate) 1800–". Retrieved2024-02-29.
  132. ^abcLeclerc, Bruno (January 1990). "From Gamma 2 to Gamma E.T.: The Birth of Electronic Computing at Bull".Annals of the History of Computing.12 (1):5–22.Bibcode:1990IAHC...12a...5L.doi:10.1109/MAHC.1990.10010.ISSN 0164-1239.S2CID 15227017.
  133. ^Platoni, Kara (May–June 2006)."Love at First Byte".Stanford Magazine. Archived fromthe original on 2006-09-25.
  134. ^V. M. Wolontis (18 August 1955) "A Complete Floating-Decimal Interpretive System for the I.B.M. 650 Magnetic Drum Calculator—Case 20878" Bell Telephone Laboratories Technical Memorandum MM-114-37, Reported in IBM Technical Newsletter No. 11, March 1956, as referenced in"Wolontis-Bell Interpreter".Annals of the History of Computing.8 (1). IEEE:74–76. January–March 1986.Bibcode:1986IAHC....8a..74..doi:10.1109/MAHC.1986.10008.S2CID 36692260.
  135. ^Dudley, Leonard (2008).Information Revolution in the History of the West. Edward Elgar Publishing. p. 266.ISBN 978-1-84720-790-6. Retrieved2020-08-30.
  136. ^IBM (1957),SOAP II for the IBM 650(PDF), C24-4000-0,archived(PDF) from the original on 2009-09-20, retrieved2009-05-25
  137. ^Horowitz & Hill 1989, p. 743.
  138. ^Wilkes, M. V. (1992). "Edsac 2".IEEE Annals of the History of Computing.14 (4):49–56.Bibcode:1992IAHC...14d..49W.doi:10.1109/85.194055.S2CID 11377060.
  139. ^T. Kilburn; R. B. Payne; D. J. Howarth (1962)."The Atlas Supervisor".Atlas Computer.Archived from the original on 2009-12-31. Retrieved2010-02-09.
  140. ^US 2708722, Wang, An, "Pulse transfer controlling device", issued 17 May 1955 
  141. ^"1953: Whirlwind computer debuts core memory".Computer History Museum.Archived from the original on 2018-05-08. Retrieved2023-08-26.
  142. ^N. Valery (21 August 1975)."Takeover in the memory market".New Scientist. pp. 419–421.Archived from the original on 2023-02-02. Retrieved2019-01-22.
  143. ^Wells, Benjamin (18 November 2010). "Unwinding performance and power on Colossus, an unconventional computer".Natural Computing.10 (4). Springer Science and Business Media LLC:1383–1405.doi:10.1007/s11047-010-9225-x.ISSN 1567-7818.S2CID 7492074.