Information Age

From Wikipedia, the free encyclopedia
(Redirected from Digital Revolution)
Industrial shift to information technology
This article is about the historical period. For the album by dead prez, see Information Age (album). For the publisher, see Information Age Publishing.

Third Industrial Revolution
1947–present
A laptop connects to the Internet to display information from Wikipedia; long-distance communication between computer systems is a hallmark of the Information Age.
Location: Worldwide
Key events: invention of the transistor; computer miniaturization; invention of the Internet
Chronology: preceded by the Second Industrial Revolution; followed by the Fourth Industrial Revolution

The Information Age[a] is a historical period that began in the mid-20th century. It is characterized by a rapid shift from traditional industries, as established during the Industrial Revolution, to an economy centered on information technology.[2] The onset of the Information Age has been linked to the development of the transistor in 1947[2] and the optical amplifier in 1957.[3] These technological advances have had a significant impact on the way information is processed and transmitted.

According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on computer miniaturization advances,[4] which led to modernized information systems and internet communications as the driving force of social evolution.[5]

There is ongoing debate concerning whether the Third Industrial Revolution has already ended and whether the Fourth Industrial Revolution has already begun, due to recent breakthroughs in areas such as artificial intelligence and biotechnology.[6] This next transition has been theorized to herald the advent of the Imagination Age, the Internet of things (IoT), and rapid advancements in machine learning.

History

Further information: History of computing hardware

The digital revolution converted technology from analog format to digital format. By doing this, it became possible to make copies that were identical to the original. In digital communications, for example, repeating hardware was able to amplify the digital signal and pass it on with no loss of information in the signal. Of equal importance to the revolution was the ability to easily move the digital information between media, and to access or distribute it remotely. One turning point of the revolution was the change from analog to digitally recorded music.[7] During the 1980s the digital format of optical compact discs gradually replaced analog formats, such as vinyl records and cassette tapes, as the popular medium of choice.[8]

Previous inventions


Humans have manufactured tools for counting and calculating since ancient times, such as the abacus, astrolabe, equatorium, and mechanical timekeeping devices. More complicated devices started appearing in the 1600s, including the slide rule and mechanical calculators. By the early 1800s, the Industrial Revolution had produced mass-market calculators like the arithmometer and the enabling technology of the punch card. Charles Babbage proposed a mechanical general-purpose computer called the Analytical Engine, but it was never successfully built, and was largely forgotten by the 20th century and unknown to most of the inventors of modern computers.

The Second Industrial Revolution in the last quarter of the 19th century developed useful electrical circuits and the telegraph. In the 1880s, Herman Hollerith developed electromechanical tabulating and calculating devices using punch cards and unit record equipment, which became widespread in business and government.

Meanwhile, various analog computer systems used electrical, mechanical, or hydraulic systems to model problems and calculate answers. These included an 1872 tide-predicting machine, differential analysers, perpetual calendar machines, the Deltar for water management in the Netherlands, network analyzers for electrical systems, and various machines for aiming military guns and bombs. The construction of problem-specific analog computers continued in the late 1940s and beyond, with FERMIAC for neutron transport, Project Cyclone for various military applications, and the Phillips Machine for economic modeling.

Building on the complexity of the Z1 and Z2, German inventor Konrad Zuse used electromechanical systems to complete in 1941 the Z3, the world's first working programmable, fully automatic digital computer. Also during World War II, Allied engineers constructed electromechanical bombes to break German Enigma machine encoding. The base-10 electromechanical Harvard Mark I was completed in 1944, and was to some degree improved with inspiration from Charles Babbage's designs.

1947–1969: Origins

See also: Early history of video games and Early mainframe games
A Pennsylvania state historical marker in Philadelphia cites the creation of ENIAC, the "first all-purpose digital computer", in 1946 as the beginning of the Information Age.

In 1947, the first working transistor, the germanium-based point-contact transistor, was invented by John Bardeen and Walter Houser Brattain while working under William Shockley at Bell Labs.[9] This led the way to more advanced digital computers. From the late 1940s, universities, military, and businesses developed computer systems to digitally replicate and automate previously manually performed mathematical calculations, with the LEO being the first commercially available general-purpose computer.

Digital communication became economical for widespread adoption after the invention of the personal computer in the 1970s. Claude Shannon, a Bell Labs mathematician, is credited for having laid out the foundations of digitalization in his pioneering 1948 article, A Mathematical Theory of Communication.[10]

In 1948, Bardeen and Brattain patented an insulated-gate field-effect transistor (IGFET) with an inversion layer; their concept forms the basis of CMOS and DRAM technology today.[11] In 1957 at Bell Labs, Frosch and Derick were able to manufacture planar silicon dioxide transistors,[12] and later a team at Bell Labs demonstrated a working MOSFET.[13] The first integrated circuit milestone was achieved by Jack Kilby in 1958.[14]

Other important technological developments included the invention of the monolithic integrated circuit chip by Robert Noyce at Fairchild Semiconductor in 1959,[15] made possible by the planar process developed by Jean Hoerni.[16] In 1963, complementary MOS (CMOS) was developed by Chih-Tang Sah and Frank Wanlass at Fairchild Semiconductor.[17] The self-aligned gate transistor, which further facilitated mass production, was invented in 1966 by Robert Bower at Hughes Aircraft[18][19] and independently by Robert Kerwin, Donald Klein, and John Sarace at Bell Labs.[20]

In 1962, AT&T deployed the T-carrier for long-haul pulse-code modulation (PCM) digital voice transmission. The T1 format carried 24 pulse-code modulated, time-division multiplexed speech signals, each encoded in a 64 kbit/s stream, plus 8 kbit/s of framing information, which facilitated synchronization and demultiplexing at the receiver. Over the subsequent decades the digitization of voice became the norm for all but the last mile (where analog continued to be the norm right into the late 1990s).
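These figures can be checked with a few lines of arithmetic: standard PCM telephony samples each voice channel 8,000 times per second at 8 bits per sample, giving the 64 kbit/s channel rate, and 24 such channels plus the framing overhead give the familiar 1.544 Mbit/s T1 line rate. A minimal sketch:

```python
# PCM telephony parameters underlying the T1 format.
sample_rate_hz = 8_000
bits_per_sample = 8
voice_rate_kbps = sample_rate_hz * bits_per_sample // 1_000  # 64 kbit/s per channel

channels = 24        # time-division multiplexed speech signals
framing_kbps = 8     # framing/synchronization overhead

line_rate_kbps = channels * voice_rate_kbps + framing_kbps
print(line_rate_kbps)  # → 1544, i.e. the 1.544 Mbit/s T1 line rate
```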

Following the development of MOS integrated circuit chips in the early 1960s, MOS chips reached higher transistor density and lower manufacturing costs than bipolar integrated circuits by 1964. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip.[21] In 1968, Fairchild engineer Federico Faggin improved MOS technology with his development of the silicon-gate MOS chip, which he later used to develop the Intel 4004, the first single-chip microprocessor.[22] It was released by Intel in 1971, and laid the foundations for the microcomputer revolution that began in the 1970s.

MOS technology also led to the development of semiconductor image sensors suitable for digital cameras.[23] The first such image sensor was the charge-coupled device, developed by Willard S. Boyle and George E. Smith at Bell Labs in 1969,[24] based on MOS capacitor technology.[23]

1969–1989: Invention of the internet, rise of home computers

See also: History of arcade video games, First generation of video game consoles, Second generation of video game consoles, Third generation of video game consoles, and Fourth generation of video game consoles
A visualization of the various routes through a portion of the Internet (created via the Opte Project)

The public was first introduced to the concepts that led to the Internet when a message was sent over the ARPANET in 1969. Packet-switched networks such as ARPANET, Mark I, CYCLADES, Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using a variety of protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.

The Whole Earth movement of the 1960s advocated the use of new technology.[25]

The 1970s saw the introduction of the home computer,[26] time-sharing computers,[27] the video game console, and the first coin-op video games,[28][29] and the golden age of arcade video games began with Space Invaders. As digital technology proliferated, and the switch from analog to digital record keeping became the new standard in business, a relatively new job description was popularized: the data entry clerk. Culled from the ranks of secretaries and typists of earlier decades, the data entry clerk's job was to convert analog data (customer records, invoices, etc.) into digital data.

In developed nations, computers achieved semi-ubiquity during the 1980s as they made their way into schools, homes, business, and industry. Automated teller machines, industrial robots, CGI in film and television, electronic music, bulletin board systems, and video games all fueled what became the zeitgeist of the 1980s. Millions of people purchased home computers, making household names of early personal computer manufacturers such as Apple, Commodore, and Tandy. To this day the Commodore 64 is often cited as the best-selling computer of all time, having sold 17 million units (by some accounts)[30] between 1982 and 1994.

In 1984, the U.S. Census Bureau began collecting data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and that households with children under the age of 18 were nearly twice as likely to own one at 15.3% (middle and upper middle class households were the most likely to own one, at 22.9%).[31] By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under the age of 18 owned one.[32] By the late 1980s, many businesses were dependent on computers and digital technology.

Motorola created the first mobile phone, the Motorola DynaTAC, in 1983. However, this device used analog communication; digital cell phones were not sold commercially until 1991, when the 2G network began to open in Finland to accommodate the unexpected demand for cell phones that had become apparent in the late 1980s.

Compute! magazine predicted that CD-ROM would be the centerpiece of the revolution, with multiple household devices reading the discs.[33]

The first true digital camera was created in 1988, and the first models were marketed in December 1989 in Japan and in 1990 in the United States.[34] By the early 2000s, digital cameras had eclipsed traditional film in popularity.

Digital ink and paint was also invented in the late 1980s. Disney's CAPS system (created in 1988) was used for a scene in 1989's The Little Mermaid and for all of their animated films between 1990's The Rescuers Down Under and 2004's Home on the Range.

1989–2005: Invention of the World Wide Web, mainstreaming of the Internet, Web 1.0

See also: Fifth generation of video game consoles and Sixth generation of video game consoles

Tim Berners-Lee invented the World Wide Web in 1989.[35] The "Web 1.0 era" ended in 2005, coinciding with the development of further advanced technologies during the start of the 21st century.[36]

The first public digital HDTV broadcast was of the 1990 World Cup that June; it was shown in 10 theaters in Spain and Italy. However, HDTV did not become a standard outside Japan until the mid-2000s.

The World Wide Web, which had previously been available only to governments and universities, became publicly accessible in 1991.[37] In 1993, Marc Andreessen and Eric Bina introduced Mosaic, the first web browser capable of displaying inline images[38] and the basis for later browsers such as Netscape Navigator and Internet Explorer. Stanford Federal Credit Union was the first financial institution to offer online internet banking services to all of its members, in October 1994.[39] In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe.[40] The Internet expanded quickly, and by 1996 it was part of mass culture, and many businesses listed websites in their ads.[citation needed] By 1999, almost every country had a connection, and nearly half of Americans and people in several other countries used the Internet on a regular basis.[citation needed] However, throughout the 1990s, "getting online" entailed complicated configuration, and dial-up was the only connection type affordable by individual users; the present-day mass Internet culture was not yet possible.

In 1989, about 15% of all households in the United States owned a personal computer.[41] For households with children, nearly 30% owned a computer in 1989, and in 2000, 65% owned one.

Cell phones became as ubiquitous as computers by the early 2000s, with movie theaters beginning to show ads telling people to silence their phones. They also became much more advanced than phones of the 1990s, most of which only took calls or at most allowed for the playing of simple games.

Text messaging became widely used worldwide in the late 1990s, except in the United States, where it did not become commonplace until the early 2000s.[citation needed]

The digital revolution became truly global in this time as well; after revolutionizing society in the developed world in the 1990s, the digital revolution spread to the masses in the developing world in the 2000s.

By 2000, a majority of U.S. households had at least one personal computer, and internet access the following year.[42] In 2002, a majority of U.S. survey respondents reported having a mobile phone.[43]

2005–2020: Web 2.0, social media, smartphones, digital TV

Main articles: Web 2.0, Social media, Smartphone, Digital terrestrial television, Digital television transition, Video game industry, Seventh generation of video game consoles, Eighth generation of video game consoles, and Ninth generation of video game consoles

In late 2005 the population of the Internet reached 1 billion,[44] and 3 billion people worldwide used cell phones by the end of the decade. HDTV became the standard television broadcasting format in many countries by the end of the decade. In September and December 2006 respectively, Luxembourg and the Netherlands became the first countries to completely transition from analog to digital television. In September 2007, a majority of U.S. survey respondents reported having broadband internet at home.[45] According to estimates from Nielsen Media Research, approximately 45.7 million U.S. households in 2006 (or approximately 40 percent of approximately 114.4 million) owned a dedicated home video game console,[46][47] and by 2015, 51 percent of U.S. households owned a dedicated home video game console according to an Entertainment Software Association annual industry report.[48][49] By 2012, over 2 billion people used the Internet, twice the number using it in 2007. Cloud computing had entered the mainstream by the early 2010s. In January 2013, a majority of U.S. survey respondents reported owning a smartphone.[50] By 2016, half of the world's population was connected,[51] and as of 2020, that number has risen to 67%.[52]

Rise in digital technology use

[edit]
Further information: History of the Internet

In the late 1980s, less than 1% of the world's technologically stored information was in digital format; by 2007 the figure was 94%, and by 2014 more than 99%.[53]

It is estimated that the world's capacity to store information has increased from 2.6 (optimally compressed) exabytes in 1986 to some 5,000 exabytes in 2014 (5 zettabytes).[53][54]

Number of cell phone subscribers and internet users
Year | Cell phone subscribers (% of world pop.) | Internet users (% of world pop.)
1990 | 12.5 million (0.25%)[55] | 2.8 million (0.05%)[56]
2002 | 1.5 billion (19%)[56] | 631 million (11%)[56]
2010 | 4 billion (68%)[57] | 1.8 billion (26.6%)[51]
2020 | 4.78 billion (62%)[58] | 4.54 billion (59%)[59]
2023 | 6.31 billion (78%)[60] | 5.4 billion (67%)[61]
A university computer lab containing many desktop PCs

Overview of early developments

A timeline of major milestones of the Information Age, from the first message sent by the Internet protocol suite to global Internet access

Library expansion and Moore's law


In 1945, Fremont Rider calculated that library capacity would double every 16 years, were sufficient space made available.[62] He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on demand for library patrons and other institutions.

Rider did not foresee, however, the digital technology that would follow decades later to replace analog microform with digital imaging, storage, and transmission media, whereby vast increases in the rapidity of information growth would be made possible through automated, potentially lossless digital technologies. Accordingly, Moore's law, formulated around 1965, would predict that the number of transistors in a dense integrated circuit doubles approximately every two years.[63][64]
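Both Rider's library estimate and Moore's law are simple exponential-doubling rules, so one compact function can project either. The starting values below are illustrative assumptions, not figures from the article:

```python
def doublings(initial, years, doubling_period_years):
    """Project a quantity that doubles every `doubling_period_years`."""
    return initial * 2 ** (years / doubling_period_years)

# Moore's law: ten doublings over twenty years give a 1024-fold increase.
print(doublings(1_000, 20, 2))   # → 1024000.0

# Rider's estimate: library capacity doubling every 16 years
# grows 4-fold over 32 years.
print(doublings(1.0, 32, 16))    # → 4.0
```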

By the early 1980s, along with improvements in computing power, the proliferation of smaller and less expensive personal computers allowed for immediate access to information and the ability to share and store it. Connectivity between computers within organizations enabled access to greater amounts of information.[citation needed]

Information storage and Kryder's law

Main articles: Data storage and Computer data storage
Hilbert & López (2011). The World's Technological Capacity to Store, Communicate, and Compute Information. Science, 332(6025), 60–65.[65]

The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes (EB) in 1986 to 15.8 EB in 1993, over 54.5 EB in 2000, and 295 (optimally compressed) EB in 2007.[53][66] This is the informational equivalent of less than one 730-megabyte (MB) CD-ROM per person in 1986 (539 MB per person); roughly four CD-ROMs per person in 1993; twelve CD-ROMs per person in the year 2000; and almost sixty-one CD-ROMs per person in 2007.[53] It is estimated that the world's capacity to store information reached 5 zettabytes in 2014,[54] the informational equivalent of 4,500 stacks of printed books from the earth to the sun.[citation needed]
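The per-person CD-ROM figures follow from dividing each year's capacity by the world population and by the size of a 730 MB disc. A sketch of the arithmetic; the population values are rough assumptions added for illustration, not figures from the article:

```python
EB, MB = 10**18, 10**6
cd_bytes = 730 * MB  # one CD-ROM, as in the text

capacity_eb = {1986: 2.6, 1993: 15.8, 2000: 54.5, 2007: 295}   # from the text
population = {1986: 4.9e9, 1993: 5.5e9, 2000: 6.1e9, 2007: 6.6e9}  # approximate

for year, eb in capacity_eb.items():
    cds = eb * EB / population[year] / cd_bytes
    print(year, round(cds, 1), "CD-ROMs per person")
```

This reproduces the quoted progression: under one disc per person in 1986, about four in 1993, twelve in 2000, and roughly sixty-one in 2007.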

The amount of digital data stored appears to be growing approximately exponentially, reminiscent of Moore's law. Kryder's law likewise posits that the amount of available storage space is growing approximately exponentially.[67][68][69][64]

Information transmission


The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986; 715 (optimally compressed) exabytes in 1993; 1.2 (optimally compressed) zettabytes in 2000; and 1.9 zettabytes in 2007, the information equivalent of 174 newspapers per person per day.[53]

The world's effective capacity to exchange information through two-way telecommunications networks was 281 petabytes of (optimally compressed) information in 1986; 471 petabytes in 1993; 2.2 (optimally compressed) exabytes in 2000; and 65 (optimally compressed) exabytes in 2007, the information equivalent of six newspapers per person per day.[53] In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. A computer that cost $3,000 in 1997 would cost $2,000 two years later and $1,000 the following year, due to the rapid advancement of technology.[citation needed]

Computation


The world's technological capacity to compute information with human-guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993; to 2.9 × 10^11 MIPS in 2000; to 6.4 × 10^12 MIPS in 2007.[53] An article featured in the journal Trends in Ecology and Evolution in 2016 reported that:[54]

Digital technology has vastly exceeded the cognitive capacity of any single human being and has done so a decade earlier than predicted. In terms of capacity, there are two measures of importance: the number of operations a system can perform and the amount of information that can be stored. The number of synaptic operations per second in a human brain has been estimated to lie between 10^15 and 10^17. While this number is impressive, even in 2007 humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5 × 10^21 bytes per 7.2 × 10^9 people).
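The quoted comparison is easy to restate numerically; the sketch below simply plugs in the figures from the passage above:

```python
# Figures quoted in the passage above.
brain_ops_high = 1e17    # upper estimate of synaptic operations/s in one brain
computers_2007 = 1e18    # instructions/s of general-purpose computers, 2007
print(computers_2007 / brain_ops_high)   # → 10.0, at least a tenfold margin

digital_bytes = 5e21     # worldwide digital storage
people = 7.2e9           # world population
brain_bytes = 1e12       # storage estimate for one brain
print(round(digital_bytes / people / brain_bytes, 2))  # → 0.69
```

So per-capita digital storage is of the same order of magnitude as the estimated capacity of a single human brain, which is what the passage means by "matched".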

Genetic information


Genetic code may also be considered part of the information revolution. Now that sequencing has been computerized, genomes can be rendered and manipulated as data. This started with DNA sequencing, invented by Walter Gilbert and Allan Maxam[70] in 1976–1977 and Frederick Sanger in 1977, grew steadily with the Human Genome Project, initially conceived by Gilbert, and finally reached the practical applications of sequencing, such as gene testing, after the discovery by Myriad Genetics of the BRCA1 breast cancer gene mutation. Sequence data in GenBank has grown from the 606 genome sequences registered in December 1982 to the 231 million genomes in August 2021. An additional 13 trillion incomplete sequences are registered in the Whole Genome Shotgun submission database as of August 2021. The information contained in these registered sequences has doubled every 18 months.[71]
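A doubling time can be backed out from the two GenBank sequence counts quoted above; note that the long-run average obtained this way (about 25 months) is somewhat slower than the 18-month figure the source cites for the information content of the sequences. A rough sketch, with the elapsed time of about 38.7 years between December 1982 and August 2021 as an assumption:

```python
import math

seqs_1982 = 606        # sequences registered, December 1982
seqs_2021 = 231e6      # sequences registered, August 2021
years_elapsed = 38.7   # approximate span between the two counts

doublings = math.log2(seqs_2021 / seqs_1982)
months_per_doubling = years_elapsed / doublings * 12
print(round(doublings, 1))         # → 18.5
print(round(months_per_doubling))  # → 25
```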

Different stage conceptualizations


During rare times in human history, there have been periods of innovation that have transformed human life. The Neolithic Age, the Scientific Age, and the Industrial Age all, ultimately, induced discontinuous and irreversible changes in the economic, social, and cultural elements of the daily life of most people. Traditionally, these epochs have taken place over hundreds, or in the case of the Neolithic Revolution, thousands of years, whereas the Information Age swept to all parts of the globe in just a few years, as a result of the rapidly advancing speed of information exchange.

Between 7,000 and 10,000 years ago, during the Neolithic period, humans began to domesticate animals, to farm grains, and to replace stone tools with ones made of metal. These innovations allowed nomadic hunter-gatherers to settle down. Villages formed along the Yangtze River in China in 6,500 B.C., in the Nile River region of Africa, and in Mesopotamia (Iraq) in 6,000 B.C. Cities emerged between 6,000 B.C. and 3,500 B.C. The development of written communication (cuneiform in Sumeria and hieroglyphs in Egypt in 3,500 B.C., writing in Egypt in 2,560 B.C., and in Minoa and China around 1,450 B.C.) enabled ideas to be preserved for extended periods and to spread extensively. In all, Neolithic developments, augmented by writing as an information tool, laid the groundwork for the advent of civilization.

The Scientific Age began in the period between Copernicus's 1543 argument that the planets orbit the Sun and Newton's publication of the laws of motion and gravity in Principia in 1687. This age of discovery continued through the 18th century, accelerated by widespread use of the movable type printing press of Johannes Gutenberg.

The Industrial Age began in Great Britain in 1760 and continued into the mid-19th century. The invention of machines such as the power loom of Edmund Cartwright, the rotating-shaft steam engine of James Watt, and the cotton gin of Eli Whitney, along with processes for mass manufacturing, came to serve the needs of a growing global population. The Industrial Age harnessed steam and waterpower to reduce the dependence on animal and human physical labor as the primary means of production. Thus, the core of the Industrial Revolution was the generation and distribution of energy from coal and water to produce steam and, later in the 20th century, electricity.

The Information Age also requires electricity to power the global networks of computers that process and store data. However, what dramatically accelerated the pace of the Information Age's adoption, as compared to previous ages, was the speed by which knowledge could be transferred and could pervade the entire human family in a few short decades. This acceleration came about with the adoption of a new form of power. Beginning in 1972, engineers devised ways to harness light to convey data through fiber optic cable. Today, light-based optical networking systems at the heart of telecom networks and the Internet span the globe and carry most of the information traffic to and from users and data storage systems.

Three stages of the Information Age

There are different conceptualizations of the Information Age. Some focus on the evolution of information over the ages, distinguishing between the Primary Information Age and the Secondary Information Age. Information in the Primary Information Age was handled by newspapers, radio, and television. The Secondary Information Age was developed by the Internet, satellite televisions, and mobile phones. The Tertiary Information Age emerged from the media of the Primary Information Age interconnected with the media of the Secondary Information Age, as presently experienced.[72][73][74][75][76][77]

Stages of development expressed as Kondratiev waves

Others classify it in terms of the well-established Schumpeterian long waves or Kondratiev waves. Here authors distinguish three different long-term metaparadigms, each with different long waves. The first focused on the transformation of material, including stone, bronze, and iron. The second, often referred to as the Industrial Revolution, was dedicated to the transformation of energy, including water, steam, electric, and combustion power. Finally, the most recent metaparadigm aims at transforming information. It started out with the proliferation of communication and stored data and has now entered the age of algorithms, which aims at creating automated processes to convert the existing information into actionable knowledge.[78]

Information in social and economic activities


The main feature of the information revolution is the growing economic, social, and technological role of information.[79] Information-related activities did not begin with the Information Revolution. They existed, in one form or another, in all human societies, and eventually developed into institutions such as the Platonic Academy, Aristotle's Peripatetic school in the Lyceum, the Musaeum and the Library of Alexandria, or the schools of Babylonian astronomy. The Agricultural Revolution and the Industrial Revolution arose when new informational inputs were produced by individual innovators, or by scientific and technical institutions. During the Information Revolution all these activities are experiencing continuous growth, while other information-oriented activities are emerging.

Information is the central theme of several new sciences which emerged in the 1940s, including Shannon's (1949) Information Theory[80] and Wiener's (1948) Cybernetics. Wiener stated: "information is information, not matter or energy". This aphorism suggests that information should be considered, along with matter and energy, as the third constituent part of the Universe; information is carried by matter or by energy.[81] By the 1990s some writers believed that the changes implied by the Information Revolution would lead not only to a fiscal crisis for governments but also to the disintegration of all "large structures".[82]

The theory of information revolution


The term information revolution may relate to, or contrast with, such widely used terms as Industrial Revolution and Agricultural Revolution. Note, however, that some prefer a mentalist to a materialist paradigm. The following fundamental aspects of the theory of information revolution can be given:[83][84]

  1. The object of economic activities can be conceptualized according to the fundamental distinction between matter, energy, and information. These apply both to the object of each economic activity, as well as within each economic activity or enterprise. For instance, an industry may process matter (e.g. iron) using energy and information (production and process technologies, management, etc.).
  2. Information is afactor of production (along withcapital,labor,land (economics)), as well as aproduct sold in themarket, that is, acommodity. As such, it acquiresuse value andexchange value, and therefore aprice.
  3. All products have use value, exchange value, and informational value. The latter can be measured by the information content of the product, in terms of innovation, design, etc.
  4. Industries develop information-generating activities, the so-calledResearch and Development (R&D) functions.
  5. Enterprises, and society at large, develop the information control and processing functions, in the form of management structures; these are also called "white-collar workers", "bureaucracy", "managerial functions", etc.
  6. Labor can be classified according to the object of labor, into information labor and non-information labor.
  7. Information activities constitute a large, new economic sector, the information sector along with the traditionalprimary sector,secondary sector, andtertiary sector, according to thethree-sector hypothesis. These should be restated because they are based on the ambiguous definitions made byColin Clark (1940), who included in the tertiary sector all activities that have not been included in the primary (agriculture, forestry, etc.) and secondary (manufacturing) sectors.[85] Thequaternary sector and thequinary sector of the economy attempt to classify these new activities, but their definitions are not based on a clear conceptual scheme, although the latter is considered by some as equivalent with the information sector.
  8. From a strategic point of view, sectors can be defined as information sector,means of production,means of consumption, thus extending the classicalRicardo-Marx model of theCapitalist mode of production (seeInfluences on Karl Marx).Marx stressed in many occasions the role of the "intellectual element" in production, but failed to find a place for it into his model.[86][87]
  9. Innovations are the result of the production of new information, as new products, new methods of production, patents, etc. Diffusion of innovations manifests saturation effects (related term: market saturation), following certain cyclical patterns and creating "economic waves", also referred to as "business cycles". There are various types of waves, such as the Kondratiev wave (54 years), Kuznets swing (18 years), Juglar cycle (9 years), and Kitchin cycle (about 4 years; see also Joseph Schumpeter), distinguished by their nature, duration, and, thus, economic impact.
  10. Diffusion of innovations causes structural-sectoral shifts in the economy, which can be smooth or can create crisis and renewal, a process which Joseph Schumpeter vividly called "creative destruction".

From a different perspective, Irving E. Fang (1997) identified six 'Information Revolutions': writing, printing, mass media, entertainment, the 'tool shed' (which we call 'home' now), and the information highway. In this work the term 'information revolution' is used in a narrow sense, to describe trends in communication media.[88]

Measuring and modeling the information revolution


Porat (1976) measured the information sector in the US using input-output analysis; the OECD has included statistics on the information sector in the economic reports of its member countries.[89] Veneris (1984, 1990) explored the theoretical, economic, and regional aspects of the informational revolution and developed a systems dynamics simulation computer model.[83][84]

These works can be seen as following the path originated by the work of Fritz Machlup, who in his 1962 book "The Production and Distribution of Knowledge in the United States" claimed that the "knowledge industry represented 29% of the US gross national product", which he saw as evidence that the Information Age had begun. He defined knowledge as a commodity and attempted to measure the magnitude of the production and distribution of this commodity within a modern economy. Machlup divided information use into three classes: instrumental, intellectual, and pastime knowledge. He also identified five types of knowledge: practical knowledge; intellectual knowledge, that is, general culture and the satisfying of intellectual curiosity; pastime knowledge, that is, knowledge satisfying non-intellectual curiosity or the desire for light entertainment and emotional stimulation; spiritual or religious knowledge; and unwanted knowledge, accidentally acquired and aimlessly retained.[90]

More recent estimates have reached the following results:[53]

  • the world's technological capacity to receive information through one-way broadcast networks grew at a sustained compound annual growth rate of 7% between 1986 and 2007;
  • the world's technological capacity to store information grew at a sustained compound annual growth rate of 25% between 1986 and 2007;
  • the world's effective capacity to exchange information through two-way telecommunications networks grew at a sustained compound annual growth rate of 30% during the same two decades;
  • the world's technological capacity to compute information with the help of humanly guided general-purpose computers grew at a sustained compound annual growth rate of 61% during the same period.[91]
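The compounding in these figures is easy to understate. The following sketch (illustrative only; the function names are our own) shows what the listed annual rates imply over the 21 years from 1986 to 2007:

```python
def growth_factor(rate, years):
    """Total capacity multiple after `years` of growth at a constant annual `rate`."""
    return (1 + rate) ** years

def cagr(start, end, years):
    """Compound annual growth rate: the constant rate turning `start` into `end`."""
    return (end / start) ** (1 / years) - 1

# 1986 to 2007 spans 21 years; rates are taken from the estimates above.
for name, rate in [("broadcast", 0.07), ("storage", 0.25),
                   ("telecom", 0.30), ("computation", 0.61)]:
    print(f"{name}: about {growth_factor(rate, 21):,.0f}x over 1986-2007")
```

At 25% per year, storage capacity multiplies roughly 108-fold over the period, while the 61% rate for computation implies growth on the order of 20,000-fold.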

Economics


Eventually, information and communication technology (ICT)—i.e. computers, computerized machinery, fiber optics, communication satellites, the Internet, and other ICT tools—became a significant part of the world economy, as the development of optical networking and microcomputers greatly changed many businesses and industries.[92][93] Nicholas Negroponte captured the essence of these changes in his 1995 book, Being Digital, in which he discusses the similarities and differences between products made of atoms and products made of bits.[94]

Jobs and income distribution


The Information Age has affected the workforce in several ways, such as compelling workers to compete in a global job market. One of the most evident concerns is the replacement of human labor by computers that can do the job faster and more effectively, creating a situation in which individuals who perform easily automated tasks are forced to find employment where their labor is not as disposable.[95] This creates particular problems for those in industrial cities, where solutions typically involve lowering working time, which is often highly resisted. Individuals who lose their jobs may thus be pressed to move up into more indispensable professions (e.g. engineers, doctors, lawyers, teachers, professors, scientists, executives, journalists, consultants), whose members are able to compete successfully in the world market and receive (relatively) high wages.[citation needed]

Along with automation, jobs traditionally associated with the middle class (e.g. assembly line, data processing, management, and supervision) have also begun to disappear as a result of outsourcing.[96] Unable to compete with those in developing countries, production and service workers in post-industrial (i.e. developed) societies either lose their jobs through outsourcing, accept wage cuts, or settle for low-skill, low-wage service jobs.[96] In the past, the economic fate of individuals was tied to that of their nation. For example, workers in the United States were once well paid in comparison to those in other countries. With the advent of the Information Age and improvements in communication, this is no longer the case, as workers must now compete in a global job market, in which wages are less dependent on the success or failure of individual economies.[96]

In effecting a globalized workforce, the internet has likewise allowed for increased opportunity in developing countries, making it possible for workers in such places to provide in-person services, therefore competing directly with their counterparts in other nations. This competitive advantage translates into increased opportunities and higher wages.[97]

Automation, productivity, and job gain


The Information Age has affected the workforce in that automation and computerization have resulted in higher productivity coupled with net job loss in manufacturing. In the United States, for example, from January 1972 to August 2010, the number of people employed in manufacturing jobs fell from 17,500,000 to 11,500,000 while manufacturing value rose 270%.[98] Although it initially appeared that job loss in the industrial sector might be partially offset by the rapid growth of jobs in information technology, the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the sector. This pattern of decrease in jobs continued until 2003,[99] and data has shown that, overall, technology creates more jobs than it destroys, even in the short run.[100]
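The jobs-versus-output arithmetic behind these figures can be made explicit. A minimal sketch (our own reading of the numbers, assuming "rose 270%" means a 3.7x multiple of 1972 output):

```python
jobs_1972, jobs_2010 = 17_500_000, 11_500_000
value_multiple = 1 + 2.70   # "value rose 270%", read here as a 3.7x multiple (an assumption)

jobs_ratio = jobs_2010 / jobs_1972            # share of the 1972 workforce remaining
output_per_worker = value_multiple / jobs_ratio

print(f"jobs fell to {jobs_ratio:.0%} of the 1972 level")
print(f"implied output per worker: about {output_per_worker:.1f}x the 1972 level")
```

Under that reading, roughly two-thirds of the workforce produces 3.7 times the output, i.e. output per worker rises more than fivefold.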

Information-intensive industry

Main article:Information industry

Industry has become more information-intensive and less labor- and capital-intensive. This has important implications for the workforce, as workers become increasingly productive even as the value of their labor decreases. For the system of capitalism itself, as the value of labor decreases, the value of capital increases.

In the classical model, investments in human and financial capital are important predictors of the performance of a new venture.[101] However, as demonstrated by Mark Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced people with limited capital to succeed on a large scale.[102]

Innovations

A visualization of the various routes through a portion of the Internet

The Information Age was enabled by technology developed in the Digital Revolution, which was itself enabled by building on the developments of the Technological Revolution.

Transistors

Main articles:Transistor,History of the transistor, andMOSFET
Further information:Semiconductor device

The onset of the Information Age can be associated with the development of transistor technology.[2] The concept of a field-effect transistor was first theorized by Julius Edgar Lilienfeld in 1925.[103] The first practical transistor was the point-contact transistor, invented by the engineers Walter Houser Brattain and John Bardeen while working for William Shockley at Bell Labs in 1947. This was a breakthrough that laid the foundations for modern technology.[2] Shockley's research team also invented the bipolar junction transistor in 1952.[104][103] The most widely used type of transistor is the metal–oxide–semiconductor field-effect transistor (MOSFET), invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1960.[105] The complementary MOS (CMOS) fabrication process was developed by Frank Wanlass and Chih-Tang Sah in 1963.[106]

Computers

Main articles:Computer andHistory of computing hardware
Further information:Integrated circuit,Invention of the integrated circuit,Microprocessor, andMoore's law

Before the advent of electronics, mechanical computers, like the Analytical Engine in 1837, were designed to provide routine mathematical calculation and simple decision-making capabilities. Military needs during World War II drove development of the first electronic computers, based on vacuum tubes, including the Z3, the Atanasoff–Berry Computer, the Colossus computer, and ENIAC.

The invention of the transistor enabled the era of mainframe computers (1950s–1970s), typified by the IBM 360. These large, room-sized computers provided data calculation and manipulation much faster than humanly possible, but were expensive to buy and maintain, so they were initially limited to a few scientific institutions, large corporations, and government agencies.

The germanium integrated circuit (IC) was invented by Jack Kilby at Texas Instruments in 1958.[107] The silicon integrated circuit was then invented in 1959 by Robert Noyce at Fairchild Semiconductor, using the planar process developed by Jean Hoerni, who was in turn building on Mohamed Atalla's silicon surface passivation method developed at Bell Labs in 1957.[108][109] Following the invention of the MOS transistor by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959,[105] the MOS integrated circuit was developed by Fred Heiman and Steven Hofstein at RCA in 1962.[110] The silicon-gate MOS IC was later developed by Federico Faggin at Fairchild Semiconductor in 1968.[111] With the advent of the MOS transistor and the MOS IC, transistor technology rapidly improved, and the ratio of computing power to size increased dramatically, giving ever smaller groups of people direct access to computers.

The first commercial single-chip microprocessor, the Intel 4004, launched in 1971; it was developed by Federico Faggin, using his silicon-gate MOS IC technology, along with Marcian Hoff, Masatoshi Shima, and Stan Mazor.[112][113]

Along with electronic arcade machines and home video game consoles pioneered by Nolan Bushnell in the 1970s, the development of personal computers like the Commodore PET and Apple II (both in 1977) gave individuals access to the computer. However, data sharing between individual computers was either non-existent or largely manual, at first using punched cards and magnetic tape, and later floppy disks.

Data

Further information:History of telecommunication,Computer memory,Computer data storage,Data compression,Internet access, andSocial media

The first developments for storing data were based on photographs, starting with microphotography in 1851 and then microform in the 1920s, with the ability to store documents on film, making them much more compact. Early information theory and Hamming codes were developed around 1950, but awaited technical innovations in data transmission and storage to be put to full use.

Magnetic-core memory was developed from the research of Frederick W. Viehe in 1947 and An Wang at Harvard University in 1949.[114][115] With the advent of the MOS transistor, MOS semiconductor memory was developed by John Schmidt at Fairchild Semiconductor in 1964.[116][117] In 1967, Dawon Kahng and Simon Sze at Bell Labs described how the floating gate of an MOS semiconductor device could be used for the cell of a reprogrammable ROM.[118] Following the invention of flash memory by Fujio Masuoka at Toshiba in 1980,[119][120] Toshiba commercialized NAND flash memory in 1987.[121][118]

Copper wire cables transmitting digital data connected computer terminals and peripherals to mainframes, and special message-sharing systems leading to email were first developed in the 1960s. Independent computer-to-computer networking began with ARPANET in 1969. This expanded to become the Internet (a term coined in 1974). Access to the Internet improved with the invention of the World Wide Web in 1991. The capacity expansion from dense wave division multiplexing, optical amplification, and optical networking in the mid-1990s led to record data transfer rates. By 2018, optical networks routinely delivered 30.4 terabits/s over a fiber optic pair, the data equivalent of 1.2 million simultaneous 4K HD video streams.[122]
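The 2018 figure can be sanity-checked with back-of-the-envelope arithmetic (a sketch; the per-stream bitrate below is derived by us, not taken from the source):

```python
link_bits_per_s = 30.4e12   # 30.4 terabits/s over one fiber-optic pair (figure from the text)
streams = 1.2e6             # claimed number of simultaneous 4K HD streams

per_stream_mbps = link_bits_per_s / streams / 1e6
print(f"about {per_stream_mbps:.1f} Mbit/s per stream")
```

That works out to roughly 25 Mbit/s per stream, consistent with typical 4K video bitrates.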

MOSFET scaling, the rapid miniaturization of MOSFETs at a rate predicted by Moore's law,[123] led to computers becoming smaller and more powerful, to the point where they could be carried. During the 1980s–1990s, laptops were developed as a form of portable computer, and personal digital assistants (PDAs) could be used while standing or walking. Pagers, widely used by the 1980s, were largely replaced by mobile phones beginning in the late 1990s, providing mobile networking features to some computers. Now commonplace, this technology extends to digital cameras and other wearable devices. Starting in the late 1990s, tablets and then smartphones combined and extended these abilities of computing, mobility, and information sharing. Metal–oxide–semiconductor (MOS) image sensors, which first began appearing in the late 1960s, led to the transition from analog to digital imaging, and from analog to digital cameras, during the 1980s–1990s. The most common image sensors are the charge-coupled device (CCD) sensor and the CMOS (complementary MOS) active-pixel sensor.
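The Moore's-law scaling mentioned above can be expressed as a simple doubling model (an idealized illustration; the function and the two-year doubling period are our assumptions, and real chips deviate from the trend):

```python
def moore_projection(base_count, base_year, year, doubling_years=2.0):
    """Project a transistor count forward assuming a doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Starting from the Intel 4004's roughly 2,300 transistors in 1971:
print(f"{moore_projection(2300, 1971, 2001):,.0f} transistors projected for 2001")
```

Thirty years of two-year doublings multiply the count by 2^15, about 33,000x, the same order of magnitude as the tens of millions of transistors in turn-of-the-century CPUs.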

Electronic paper, which has origins in the 1970s, allows digital information to appear as paper documents.

Personal computers

Main article:History of personal computers

By 1976, several firms were racing to introduce the first truly successful commercial personal computer. Three machines, the Apple II, Commodore PET 2001, and TRS-80, were all released in 1977,[124] becoming the most popular by late 1978.[125] Byte magazine later referred to Commodore, Apple, and Tandy as the "1977 Trinity".[126] Also in 1977, Sord Computer Corporation released the Sord M200 Smart Home Computer in Japan.[127]

Apple II

Main article:Apple II
April 1977: Apple II.

Steve Wozniak (known as "Woz"), a regular visitor to Homebrew Computer Club meetings, designed the single-board Apple I computer and first demonstrated it there. With specifications in hand and an order for 100 machines at US$500 each from the Byte Shop, Woz and his friend Steve Jobs founded Apple Computer.

About 200 of the machines sold before the company announced the Apple II as a complete computer. It had color graphics, a full QWERTY keyboard, and internal slots for expansion, all mounted in a streamlined, high-quality plastic case. The monitor and I/O devices were sold separately. The original Apple II operating system was only the built-in BASIC interpreter contained in ROM. Apple DOS was added to support the diskette drive; the last version was "Apple DOS 3.3".

Its higher price and lack of floating-point BASIC, along with a lack of retail distribution sites, caused it to lag in sales behind the other Trinity machines until 1979, when it surpassed the PET. It was pushed into fourth place again when Atari, Inc. introduced its Atari 8-bit computers.[128]

Despite slow initial sales, the Apple II's lifetime was about eight years longer than that of the other machines, so it accumulated the highest total sales. By 1985, 2.1 million had been sold, and more than 4 million Apple IIs had been shipped by the end of its production in 1993.[129]

Optical networking

Further information:Fiber-optic communication,Image sensor, andOptical fiber

Optical communication plays a crucial role in communication networks, providing the transmission backbone for the telecommunications and computer networks that underlie the Internet, the foundation for the Digital Revolution and Information Age.

The two core technologies are the optical fiber and light amplification (the optical amplifier). In 1953, Bram van Heel demonstrated image transmission through bundles of optical fibers with a transparent cladding. The same year, Harold Hopkins and Narinder Singh Kapany at Imperial College succeeded in making image-transmitting bundles with over 10,000 optical fibers, and subsequently achieved image transmission through a 75 cm long bundle which combined several thousand fibers.

Gordon Gould invented the optical amplifier and the laser, and also established the first optical telecommunications company, Optelecom, to design communication systems. The firm was a co-founder of Ciena Corp., the venture that popularized the optical amplifier with the introduction of the first dense wave division multiplexing system.[130] This massive-scale communication technology has emerged as the common basis of all telecommunications networks[3] and, thus, a foundation of the Information Age.[131][132]

Economy, society, and culture


Manuel Castells captures the significance of the Information Age in The Information Age: Economy, Society and Culture when he writes of our global interdependence and the new relationships between economy, state, and society, what he calls "a new society-in-the-making". He cautions that humanity's domination of the material world does not mean that the Information Age is the end of history:

"It is in fact, quite the opposite: history is just beginning, if by history we understand the moment when, after millennia of a prehistoric battle with Nature, first to survive, then to conquer it, our species has reached the level of knowledge and social organization that will allow us to live in a predominantly social world. It is the beginning of a new existence, and indeed the beginning of a new age, The Information Age, marked by the autonomy of culture vis-à-vis the material basis of our existence."[133]

Thomas Chatterton Williams wrote about the dangers of anti-intellectualism in the Information Age in a piece for The Atlantic. Although access to information has never been greater, most information is irrelevant or insubstantial. The Information Age's emphasis on speed over expertise contributes to "superficial culture in which even the elite will openly disparage as pointless our main repositories for the very best that has been thought."[134]


Footnotes

  1. ^ Also known as the Third Industrial Revolution, Computer Age, Digital Age, Silicon Age, New Media Age, Internet Age, or the Digital Revolution[1]

References

  1. ^Hoover, Stewart M. (26 April 2006).Religion in the Media Age. Media, Religion and Culture (1st ed.). New York:Routledge.ISBN 978-0-415-31423-7.
  2. ^abcdManuel, Castells (1996).The information age : economy, society and culture. Oxford: Blackwell.ISBN 978-0631215943.OCLC 43092627.
  3. ^abGrobe, Klaus; Eiselt, Michael (2013).Wavelength Division Multiplexing: A Practical Engineering Guide. John Wiley & Sons. p. 2.
  4. ^Kluver, Randy."Globalization, Informatization, and Intercultural Communication".un.org. Archived fromthe original on 19 July 2013. Retrieved18 April 2013.
  5. ^"The History of Computers".thought.co.Archived from the original on 1 August 2020. Retrieved17 October 2019.
  6. ^"Regulation for the Fourth Industrial Revolution".gov.uk. Retrieved16 September 2024.
  7. ^"Museum Of Applied Arts And Sciences – About".Museum of Applied Arts and Sciences. Retrieved22 August 2017.
  8. ^"The Digital Revolution Ahead for the Audio Industry," Business Week. New York, 16 March 1981, p. 40D.
  9. ^Phil Ament (17 April 2015)."Transistor History – Invention of the Transistor". Archived fromthe original on 13 August 2011. Retrieved17 April 2015.
  10. ^Shannon, Claude E.; Weaver, Warren (1963).The mathematical theory of communication (4. print. ed.). Urbana: University of Illinois Press. p. 144.ISBN 0252725484.
  11. ^Howard R. Duff (2001). "John Bardeen and transistor physics".AIP Conference Proceedings. Vol. 550. pp. 3–32.doi:10.1063/1.1354371.
  12. ^Frosch, C. J.; Derick, L (1957)."Surface Protection and Selective Masking during Diffusion in Silicon".Journal of the Electrochemical Society.104 (9): 547.doi:10.1149/1.2428650.
  13. ^Lojek, Bo (2007).History of Semiconductor Engineering. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg. p. 321.ISBN 978-3-540-34258-8.
  14. ^"Milestones:First Semiconductor Integrated Circuit (IC), 1958".IEEE Global History Network. IEEE. Retrieved3 August 2011.
  15. ^Saxena, Arjun (2009).Invention of Integrated Circuits: Untold Important Facts. pp. x–xi.
  16. ^Saxena, Arjun (2009).Invention of Integrated Circuits: Untold Important Facts. pp. 102–103.
  17. ^"1963: Complementary MOS Circuit Configuration is Invented".Computer History Museum. Retrieved6 July 2019.
  18. ^US3472712A, Bower, Robert W., "Field-effect device with insulated gate", issued 14 October 1969 
  19. ^US3615934A, Bower, Robert W., "Insulated-gate field-effect device having source and drain regions formed in part by ion implantation and method of making same", issued 26 October 1971 
  20. ^US3475234A, Kerwin, Robert E.; Klein, Donald L. & Sarace, John C., "Method for making mis structures", issued 28 October 1969 
  21. ^Shirriff, Ken (30 August 2016)."The Surprising Story of the First Microprocessors".IEEE Spectrum.53 (9).Institute of Electrical and Electronics Engineers:48–54.doi:10.1109/MSPEC.2016.7551353.S2CID 32003640. Retrieved13 October 2019.
  22. ^"1971: Microprocessor Integrates CPU Function onto a Single Chip".Computer History Museum.
  23. ^abWilliams, J. B. (2017).The Electronics Revolution: Inventing the Future. Springer. pp. 245–8.ISBN 9783319490885.
  24. ^James R. Janesick (2001).Scientific charge-coupled devices. SPIE Press. pp. 3–4.ISBN 978-0-8194-3698-6.
  25. ^"History of Whole Earth Catalog". Archived fromthe original on 13 February 2021. Retrieved17 April 2015.
  26. ^"Personal Computer Milestones". Retrieved17 April 2015.
  27. ^Criss, Fillur (14 August 2014)."2,076 IT jobs from 492 companies".ICTerGezocht.nl (in Dutch). Retrieved19 August 2017.
  28. ^"Atari – Arcade/Coin-op". Archived fromthe original on 2 November 2014. Retrieved17 April 2015.
  29. ^Vincze Miklós (15 June 2013)."Forgotten arcade games let you shoot space men and catch live lobsters".io9. Archived fromthe original on 14 February 2015. Retrieved17 April 2015.
  30. ^"How many Commodore 64 computers were really sold?".pagetable.com. Archived fromthe original on 6 March 2016. Retrieved17 April 2015.
  31. ^"Archived copy"(PDF). Archived fromthe original(PDF) on 2 April 2013. Retrieved20 December 2017.{{cite web}}: CS1 maint: archived copy as title (link)
  32. ^Kominski, Robert (February 1991)."Computer Use in the United States: 1989. Current Population Reports, Special Studies".Bureau of the Census (DOC), Suitland, Md. Population Div. – via ERIC (Education Resources Information Center).
  33. ^"COMPUTE! magazine issue 93 Feb 1988". February 1988.If the wheels behind the CD-ROM industry have their way, this product will help open the door to a brave, new multimedia world for microcomputers, where the computer is intimately linked with the other household electronics, and every gadget in the house reads tons of video, audio, and text data from CD-ROM disks.
  34. ^"1988". Retrieved17 April 2015.
  35. ^"A short history of the Web".CERN. 25 January 2024. Retrieved16 February 2024.
  36. ^"World Wide Web and Its Journey from Web 1.0 to Web 4.0"(PDF).ijcsit. Retrieved20 January 2025.
  37. ^Martin Bryant (6 August 2011)."20 years ago today, the World Wide Web was born – TNW Insider".The Next Web. Retrieved17 April 2015.
  38. ^"The World Wide Web".PBS. Retrieved17 April 2015.
  39. ^"Stanford Federal Credit Union Pioneers Online Financial Services" (Press release). 21 June 1995. Archived fromthe original on 21 December 2018. Retrieved21 December 2018.
  40. ^"History – About us – OP Group".
  41. ^Cheeseman Day, Jennifer; Janus, Alex; Davis, Jessica (October 2005)."Computer and Internet Use in the United States: 2003"(PDF).Census Bureau. Archived fromthe original(PDF) on 6 March 2009. Retrieved10 March 2009.
  42. ^File, Thom (May 2013).Computer and Internet Use in the United States(PDF) (Report). Current Population Survey Reports. Washington, D.C.:U.S. Census Bureau. Retrieved11 February 2020.
  43. ^Tuckel, Peter; O'Neill, Harry (2005).Ownership and Usage Patterns of Cell Phones: 2000–2005(PDF) (Report). JSM Proceedings, Survey Research Methods Section.Alexandria, VA:American Statistical Association. p. 4002. Retrieved25 September 2020.
  44. ^"One Billion People Online!". Archived fromthe original on 22 October 2008. Retrieved17 April 2015.
  45. ^"Demographics of Internet and Home Broadband Usage in the United States".Pew Research Center. 7 April 2021. Retrieved19 May 2021.
  46. ^Arendt, Susan (5 March 2007)."Game Consoles in 41% of Homes".WIRED.Condé Nast. Retrieved29 June 2021.
  47. ^Statistical Abstract of the United States: 2008(PDF) (Report).Statistical Abstract of the United States (127 ed.).U.S. Census Bureau. 30 December 2007. p. 52. Retrieved29 June 2021.
  48. ^North, Dale (14 April 2015)."155M Americans play video games, and 80% of households own a gaming device".VentureBeat. Retrieved29 June 2021.
  49. ^2015 Essential Facts About the Computer and Video Game Industry (Report). Essential Facts About the Computer and Video Game Industry. Vol. 2015.Entertainment Software Association. Retrieved29 June 2021.
  50. ^"Demographics of Mobile Device Ownership and Adoption in the United States".Pew Research Center. 7 April 2021. Retrieved19 May 2021.
  51. ^ab"World Internet Users Statistics and 2014 World Population Stats". Archived fromthe original on 23 June 2011. Retrieved17 April 2015.
  52. ^Clement."Worldwide digital population as of April 2020".Statista. Retrieved21 May 2020.
  53. ^abcdefghHilbert, Martin; López, Priscila (2011)."The World's Technological Capacity to Store, Communicate, and Compute Information".Science.332 (6025):60–65.Bibcode:2011Sci...332...60H.doi:10.1126/science.1200970.ISSN 0036-8075.PMID 21310967.S2CID 206531385.
  54. ^abcGillings, Michael R.; Hilbert, Martin; Kemp, Darrell J. (2016)."Information in the Biosphere: Biological and Digital Worlds".Trends in Ecology & Evolution.31 (3):180–189.Bibcode:2016TEcoE..31..180G.doi:10.1016/j.tree.2015.12.013.PMID 26777788.S2CID 3561873.Archived from the original on 4 June 2016. Retrieved22 August 2016.
  55. ^"Worldmapper: The world as you've never seen it before – Cellular Subscribers 1990". Retrieved17 April 2015.
  56. ^abc"Worldmapper: The world as you've never seen it before – Communication Maps". Retrieved17 April 2015.
  57. ^Arms, Michael (2013)."Cell Phone Dangers – Protecting Our Homes From Cell Phone Radiation".Computer User. Archived fromthe original on 29 March 2014.
  58. ^"Number of mobile phone users worldwide 2015–2020".Statista. Retrieved19 February 2020.
  59. ^"Global digital population 2020".Statista. Retrieved19 February 2020.
  60. ^"Fact and Figure 2023 – Mobile phone ownership".International Telecommunication Union. Retrieved10 September 2024.
  61. ^"Facts and Figures 2023 – Internet Use".Statista. Retrieved10 September 2024.
  62. ^Rider, Fredmont (1944).The Scholar and the Future of the Research Library. New York City: Hadham Press.
  63. ^"Moore's Law to roll on for another decade".Archived from the original on 9 July 2015. Retrieved27 November 2011.Moore also affirmed he never said transistor count would double every 18 months, as is commonly said. Initially, he said transistors on a chip would double every year. He then recalibrated it to every two years in 1975. David House, an Intel executive at the time, noted that the changes would cause computer performance to double every 18 months.
  64. ^abRoser, Max, andHannah Ritchie. 2013."Technological Progress".Archived 2021-09-10 at theWayback MachineOur World in Data. Retrieved 9 June 2020.
  65. ^Hilbert, Martin; López, Priscila (April 2011)."The World's Technological Capacity to Store, Communicate, and Compute Information".Science.332 (6025):60–65.Bibcode:2011Sci...332...60H.doi:10.1126/science.1200970.ISSN 0036-8075.PMID 21310967.
  66. ^Hilbert, Martin R. (2011).Supporting online material for the world's technological capacity to store, communicate, and compute infrormation. Science/AAAS.OCLC 755633889.
  67. ^Gantz, John; David Reinsel (2012)."The Digital Universe in 2020: Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East".Archived 2020-06-10 at theWayback MachineIDC iView.S2CID 112313325.View multimedia contentArchived 2020-05-24 at theWayback Machine.
  68. ^Rizzatti, Lauro. 14 September 2016."Digital Data Storage is Undergoing Mind-Boggling Growth".EE Times. Archived from theoriginal on 16 September 2016.
  69. ^"The historical growth of data: Why we need a faster transfer solution for large data sets".Archived 2019-06-02 at theWayback MachineSigniant, 2020. Retrieved 9 June 2020.
  70. ^Gilbert, Walter; Allan Maxam. "Biochemistry."Proceedings of the National Academy of Sciences, USA. Vol. 74. No 2, pp. 560–64.
  71. ^Lathe III, Warren C.; Williams, Jennifer M.; Mangan, Mary E.; Karolchik, Donna (2008)."Genomic Data Resources: Challenges and Promises".Nature Education.Archived from the original on 6 December 2021. Retrieved5 December 2021.
  72. ^Iranga, Suroshana (2016).Social Media Culture. Colombo: S. Godage and Brothers.ISBN 978-9553067432.
  73. ^Di Giambattista, C. (2021). Presentare il futuro nella Digital Age. La convergenza semiotica tra arte e IxD design nella pratica del Future Casting [Zenodo].https://doi.org/10.5281/zenodo.6627276
  74. ^Jillianne Code, Rachel Ralph, Kieran Forde et al.A Disorienting Dilemma: Teaching and Learning in Technology Education During a Time of Crisis, 14 September 2021, preprint (Version 1).https://doi.org/10.21203/rs.3.rs-899835/v1
  75. ^Goodarzi, M., Fahimifar, A., Shakeri Daryani, E. (2021). "New Media and Ideology: A Critical Perspective".Journal of Cyberspace Studies, 5(2), 137–162.https://www.ssoar.info/ssoar/bitstream/handle/document/77017/ssoar-jcss-2021-2-goodarzi_et_al-New_Media_and_Ideology_a.pdf
  76. ^Wang, Xuan & Yang, Zhihui. (2022). Research on the Youth Group's Expectations for the Future Development of self-Media while in the Digital Economy. Frontiers in Business, Economics and Management. 3. 43–48.https://doi.org/10.54097/fbem.v3i3.315.
  77. ^Dr. R. Sunitha (2020). Impact of Digital Humanities And Literary Study In Electronic Era International Journal of Computer Trends and Technology, 68(2),22–24.https://ijcttjournal.org/helium/ijctt/ijctt-v68i2p104
  78. ^Hilbert, M. (2020). "Digital technology and social change: The digital transformation of society from a historical perspective".Dialogues in Clinical Neuroscience, 22(2), 189–194.https://doi.org/10.31887/DCNS.2020.22.2/mhilbert
  79. ^Krishnapuram, Raghu (September 2013). "Global trends in information technology and their implication".2013 1st International Conference on Emerging Trends and Applications in Computer Science. IEEE. pp. v.doi:10.1109/icetacs.2013.6691382.ISBN 978-1-4673-5250-5.
  80. ^Shannon, C. E. andW. Weaver (1949)The Mathematical Theory of Communication, Urbana, Ill., University of Illinois Press.
  81. ^Wiener, Norbert (1948)Cybernetics, MIT Press, p. 155
  82. ^William Rees-Mogg;James Dale Davidson (1997).The Sovereign Individual.Simon & Schuster. p. 7.ISBN 978-0684832722.
  83. ^abVeneris, Y. (1984),The Informational Revolution, Cybernetics and Urban Modeling, PhD Thesis, submitted to the University of Newcastle upon Tyne, UK (British Library microfilm no.: D55307/85).
  84. ^abVeneris, Y. (1990)."Modeling the transition from the Industrial to the Informational Revolution".Environment and Planning A.22 (3):399–416.Bibcode:1990EnPlA..22..399V.doi:10.1068/a220399.S2CID 144963523.
  85. ^Clark, C. (1940),Conditions of Economic Progress, Macmillan and Co, London.
  86. ^Ricardo, D. (1978)The Principles of Political Economy and Taxation, Dent, London. (first published in 1817)ISBN 0486434613.
  87. ^Marx, K. (1977)Capital, Progress Publishers, Moscow.
  88. ^Fang, Irving E. (1997)A History of Mass Communication: Six Information RevolutionsArchived 2012-04-17 at theWayback Machine, Focal PressISBN 0240802543
  89. ^Porat, M.-U. (1976)The Information Economy, PhD Thesis, Stanford University. This thesis measured the role of the Information Sector in the US Economy.
  90. ^Machlup, F. (1962)The Production and Distribution of Knowledge in the United States, Princeton UP.
  91. ^Video animation on "The World's Technological Capacity to Store, Communicate, and Compute Information from 1986 to 2010"Archived 2012-01-18 at theWayback Machine
  92. ^"Information Age Education Newsletter".Information Age Education. August 2008.Archived from the original on 14 September 2015. Retrieved4 December 2019.
  93. ^Moursund, David."Information Age".IAE-Pedia.Archived from the original on 1 August 2020. Retrieved4 December 2019.
  94. ^"Negroponte's articles". Archives.obs-us.com. 30 December 1996.Archived from the original on 4 September 2011. Retrieved11 June 2012.
  95. ^Porter, Michael."How Information Gives You Competitive Advantage".Harvard Business Review.Archived from the original on 23 June 2015. Retrieved9 September 2015.
  96. ^abcMcGowan, Robert. 1991. "The Work of Nations by Robert Reich" (book review).Human Resource Management 30(4):535–38.doi:10.1002/hrm.3930300407.ISSN 1099-050X.
  97. ^Bhagwati, Jagdish N. (2005).In Defense of Globalization. New York:Oxford University Press.
  98. ^Smith, Fran (5 October 2010)."Job Losses and Productivity Gains".Competitive Enterprise Institute.Archived from the original on 13 October 2010.
  99. ^Cooke, Sandra D. 2003. "Information Technology Workers in the Digital EconomyArchived 2017-06-21 at theWayback Machine." InDigital Economy.Economics and Statistics Administration,Department of Commerce.
  100. ^Chang, Yongsung; Hong, Jay H. (2013)."Does Technology Create Jobs?".SERI Quarterly.6 (3):44–53. Archived fromthe original on 29 April 2014. Retrieved29 April 2014.
  101. ^Cooper, Arnold C.; Gimeno-Gascon, F. Javier; Woo, Carolyn Y. (1994). "Initial human and financial capital as predictors of new venture performance".Journal of Business Venturing.9 (5):371–395.doi:10.1016/0883-9026(94)90013-2.
  102. ^Carr, David (3 October 2010)."Film Version of Zuckerberg Divides the Generations".The New York Times.ISSN 0362-4331.Archived from the original on 14 November 2020. Retrieved20 December 2016.
  103. ^abLee, Thomas H. (2003)."A Review of MOS Device Physics"(PDF).The Design of CMOS Radio-Frequency Integrated Circuits.Cambridge University Press.ISBN 9781139643771.Archived(PDF) from the original on 9 December 2019. Retrieved21 July 2019.
  104. ^"Who Invented the Transistor?".Computer History Museum. 4 December 2013.Archived from the original on 13 December 2013. Retrieved20 July 2019.
  105. ^ab"1960 – Metal Oxide Semiconductor (MOS) Transistor Demonstrated".The Silicon Engine.Computer History Museum.Archived from the original on 27 October 2019. Retrieved21 July 2019.
  106. ^"1963: Complementary MOS Circuit Configuration is Invented".Archived from the original on 23 July 2019.
  107. ^Kilby, Jack (2000),Nobel lecture(PDF), Stockholm: Nobel Foundation,archived(PDF) from the original on 29 May 2008, retrieved15 May 2008
  108. ^Lojek, Bo (2007).History of Semiconductor Engineering.Springer Science & Business Media. p. 120.ISBN 9783540342588.
  109. ^Bassett, Ross Knox (2007).To the Digital Age: Research Labs, Start-up Companies, and the Rise of MOS Technology. Johns Hopkins University Press. p. 46.ISBN 9780801886393.Archived from the original on 27 July 2020. Retrieved31 July 2019.
  110. ^"Tortoise of Transistors Wins the Race – CHM Revolution".Computer History Museum.Archived from the original on 10 March 2020. Retrieved22 July 2019.
  111. ^"1968: Silicon Gate Technology Developed for ICs".Computer History Museum.Archived from the original on 29 July 2020. Retrieved22 July 2019.
  112. ^"1971: Microprocessor Integrates CPU Function onto a Single Chip".Computer History Museum.Archived from the original on 12 August 2021. Retrieved22 July 2019.
  113. ^Colinge, Jean-Pierre; Greer, James C.; Greer, Jim (2016).Nanowire Transistors: Physics of Devices and Materials in One Dimension.Cambridge University Press. p. 2.ISBN 9781107052406.Archived from the original on 17 March 2020. Retrieved22 July 2019.
  114. ^"1953: Whirlwind computer debuts core memory".Computer History Museum.Archived from the original on 3 October 2019. Retrieved31 July 2019.
  115. ^"1956: First commercial hard disk drive shipped".Computer History Museum.Archived from the original on 31 July 2019. Retrieved31 July 2019.
  116. ^"1970: MOS Dynamic RAM Competes with Magnetic Core Memory on Price".Computer History Museum.Archived from the original on 26 October 2021. Retrieved29 July 2019.
  117. ^Solid State Design – Vol. 6. Horizon House. 1965.Archived from the original on 9 June 2021. Retrieved12 November 2020.
  118. ^ab"1971: Reusable semiconductor ROM introduced".Computer History Museum.Archived from the original on 3 October 2019. Retrieved19 June 2019.
  119. ^Fulford, Benjamin (24 June 2002)."Unsung hero".Forbes.Archived from the original on 3 March 2008. Retrieved18 March 2008.
  120. ^US Patent 4531203, Fujio Masuoka
  121. ^"1987: Toshiba Launches NAND Flash".eWeek. 11 April 2012. Retrieved20 June 2019.
  122. ^Saarinen, Juha (24 January 2018)."Telstra trial claims world's fastest transmission speed".ITNews Australia.Archived from the original on 17 October 2019. Retrieved5 December 2021.
  123. ^Sahay, Shubham; Kumar, Mamidala Jagadesh (2019).Junctionless Field-Effect Transistors: Design, Modeling, and Simulation.John Wiley & Sons.ISBN 9781119523536.Archived from the original on 21 December 2019. Retrieved31 October 2019.
  124. ^Chandler, Alfred Dupont; Hikino, Takashi; Nordenflycht, Andrew Von; Chandler, Alfred D. (30 June 2009).Inventing the Electronic Century. Harvard University Press.ISBN 9780674029392.Archived from the original on 18 January 2022. Retrieved11 August 2015.
  125. ^Schuyten, Peter J. (6 December 1978)."Technology; The Computer Entering Home". Business & Finance.The New York Times. p. D4.ISSN 0362-4331.Archived from the original on 22 July 2018. Retrieved9 September 2019.
  126. ^"Most Important Companies".Byte. September 1995. Archived fromthe original on 18 June 2008. Retrieved10 June 2008.
  127. ^"M200 Smart Home Computer Series-Computer Museum".Archived from the original on 3 January 2020. Retrieved18 January 2022.
  128. ^Reimer, Jeremy (14 December 2005)."Total share: 30 years of personal computer market share figures; The new era (2001– )".Ars Technica. p. 9.Archived from the original on 21 February 2008. Retrieved13 February 2008.
  129. ^Reimer, Jeremy (December 2005)."Personal Computer Market Share: 1975–2004".Ars Technica. Archived fromthe original on 6 June 2012. Retrieved13 February 2008.
  130. ^Markoff, John (3 March 1997)."Fiber-Optic Technology Draws Record Stock Value".The New York Times.Archived from the original on 9 November 2021. Retrieved5 December 2021.
  131. ^Sudo, Shoichi (1997).Optical Fiber Amplifiers: Materials Devices, and Applications. Artech House, Inc. pp. xi.
  132. ^Gilder, George (4 April 1997). "Fiber Keeps its Promise".Forbes ASAP.
  133. ^Castells, Manuel.The Power of Identity, The Information Age: Economy, Society and Culture Vol. II. Cambridge, MA; Oxford, UK: Blackwell.
  134. ^Chatterton Williams, Thomas."Kanye West, Sam ....".The Atlantic. 25 January 2023.

Further reading


External links

Wikibooks has a book on the topic of:The Information Age
Wikiquote has quotations related toInformation Age.
Wikimedia Commons has media related toInformation Age.
