
Fifth Generation Computer Systems


The Fifth Generation Computer Systems (FGCS; Japanese: 第五世代コンピュータ, romanized: daigosedai konpyūta) project was a 10-year initiative launched in 1982 by Japan's Ministry of International Trade and Industry (MITI) to develop computers based on massively parallel computing and logic programming. The project aimed to create an "epoch-making computer" with supercomputer-like performance and to establish a platform for future advancements in artificial intelligence. Although FGCS was ahead of its time, its ambitious goals ultimately led to commercial failure. However, on a theoretical level, the project contributed significantly to the development of concurrent logic programming.

The term "fifth generation" was chosen to emphasize the system's advanced nature. In the history of computing hardware, there had been four prior "generations" of computers: the first generation utilized vacuum tubes; the second, transistors and diodes; the third, integrated circuits; and the fourth, microprocessors. While earlier generations focused on increasing the number of logic elements within a single CPU, it was widely believed at the time that the fifth generation would achieve enhanced performance through the use of massive numbers of CPUs.[citation needed]

Background


From the late 1960s until the early 1970s, there was much talk about "generations" of computer hardware, at that time usually organized into three generations:

  1. First generation: Thermionic vacuum tubes. Mid-1940s. IBM pioneered the arrangement of vacuum tubes in pluggable modules. The IBM 650 was a first-generation computer.
  2. Second generation: Transistors. 1956. The era of miniaturization begins. Transistors are much smaller than vacuum tubes, draw less power, and generate less heat. Discrete transistors are soldered to circuit boards, with interconnections accomplished by stencil-screened conductive patterns on the reverse side. The IBM 7090 was a second-generation computer.
  3. Third generation: Integrated circuits (silicon chips containing multiple transistors). 1964. A pioneering example is the ACPX module used in the IBM 360/91, which, by stacking layers of silicon over a ceramic substrate, accommodated over 20 transistors per chip; the chips could be packed together onto a circuit board to achieve unprecedented logic densities. The IBM 360/91 was a hybrid second- and third-generation computer.

Omitted from this taxonomy are the "zeroth-generation" computers based on metal gears (such as the IBM 407) or mechanical relays (such as the Mark I), and the post-third-generation computers based on very-large-scale integration (VLSI) circuits.

There was also a parallel set of generations for software:

  1. First generation: Machine language.
  2. Second generation: Low-level programming languages such as assembly language.
  3. Third generation: Structured high-level programming languages such as C, COBOL and FORTRAN.
  4. Fourth generation: "Non-procedural" high-level programming languages (such as object-oriented languages).[1]

Throughout these multiple generations up to the 1970s, Japan built computers by following U.S. and British leads. In the mid-1970s, the Ministry of International Trade and Industry stopped following Western leads and started looking into the future of computing on a small scale. It asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered a three-year contract to carry out more in-depth studies along with industry and academia. It was during this period that the term "fifth-generation computer" started to be used.

Prior to the 1970s, MITI guidance had produced successes such as an improved steel industry, the creation of the oil supertanker, the automotive industry, consumer electronics, and computer memory. MITI decided that the future was going to be information technology. However, the Japanese language, particularly in its written form, presented and still presents obstacles for computers.[2] As a result of these hurdles, MITI held a conference to seek assistance from experts.

The primary fields for investigation from this initial project were:

  • Inference computer technologies for knowledge processing
  • Computer technologies to process large-scale databases and knowledge bases
  • High-performance workstations
  • Distributed functional computer technologies
  • Supercomputers for scientific calculation

Project launch


The aim was to build parallel computers for artificial intelligence applications using concurrent logic programming. The project imagined an "epoch-making" computer with supercomputer-like performance running on top of large databases (as opposed to a traditional filesystem), using a logic programming language to define and access the data by means of massively parallel computing. The goal was a prototype machine with performance between 100M and 1G LIPS, where a LIPS is a Logical Inference Per Second; typical workstations of the time were capable of about 100k LIPS, so the target amounted to a speedup of three to four orders of magnitude. The machine was to be built over a ten-year period: 3 years for initial R&D, 4 years for building various subsystems, and a final 3 years to complete a working prototype system. In 1982 the government decided to go ahead with the project, and established the Institute for New Generation Computer Technology (ICOT) through joint investment with various Japanese computer companies. After the project ended, MITI would consider investing in a new "sixth generation" project.

Ehud Shapiro captured the rationale and motivations driving this project:[3]

"As part of Japan's effort to become a leader in the computer industry, the Institute for New Generation Computer Technology has launched a revolutionary ten-year plan for the development of large computer systems which will be applicable to knowledge information processing systems. These Fifth Generation computers will be built around the concepts of logic programming. In order to refute the accusation that Japan exploits knowledge from abroad without contributing any of its own, this project will stimulate original research and will make its results available to the international research community."

Logic programming


The target defined by the FGCS project was to develop "Knowledge Information Processing systems" (roughly meaning, applied artificial intelligence). The chosen tool to implement this goal was logic programming, an approach characterized by Maarten Van Emden, one of its founders, as:[4]

  • The use of logic to express information in a computer.
  • The use of logic to present problems to a computer.
  • The use of logical inference to solve these problems.

More technically, it can be summed up in two equations:

  • Program = Set of axioms.
  • Computation = Proof of a statement from axioms.

The axioms typically used are universal axioms of a restricted form, called Horn clauses or definite clauses. The statement proved in a computation is an existential statement.[citation needed] The proof is constructive and provides values for the existentially quantified variables; these values constitute the output of the computation.
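To make this concrete, the sketch below is a minimal, purely illustrative Horn-clause prover in Python; the predicate names, family facts, and helper functions are inventions of this example, not code from any FGCS system. The program is a set of axioms, the query is an existential statement, and each substitution found by the search is a constructive proof whose variable bindings are the output:

```python
# Minimal sketch of "computation = proof" over Horn clauses (illustrative only).
# Terms are tuples such as ('ancestor', 'X', 'mary'); strings starting with an
# uppercase letter are logic variables.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    # Follow variable bindings in substitution s.
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    # Return an extended substitution making a and b equal, or None.
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return {**s, a: b}
    if is_var(b):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

def rename(term, n):
    # Give each clause use fresh variable names so separate uses don't clash.
    if is_var(term):
        return f"{term}_{n}"
    if isinstance(term, tuple):
        return tuple(rename(x, n) for x in term)
    return term

def solve(goals, rules, s, depth=0):
    # SLD resolution: prove each goal in turn; yield every substitution
    # (constructive proof) that makes all goals true.
    if not goals:
        yield s
        return
    first, rest = goals[0], goals[1:]
    for i, (head, body) in enumerate(rules):
        head2 = rename(head, f"{depth}_{i}")
        body2 = [rename(g, f"{depth}_{i}") for g in body]
        s2 = unify(first, head2, s)
        if s2 is not None:
            yield from solve(body2 + rest, rules, s2, depth + 1)

# Program = set of axioms (facts and Horn clauses).
rules = [
    (('parent', 'tom', 'bob'), []),
    (('parent', 'bob', 'mary'), []),
    # ancestor(X, Y) :- parent(X, Y).
    (('ancestor', 'X', 'Y'), [('parent', 'X', 'Y')]),
    # ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
    (('ancestor', 'X', 'Z'), [('parent', 'X', 'Y'), ('ancestor', 'Y', 'Z')]),
]

# Computation = proof of the existential statement "ancestor(A, mary)?".
for s in solve([('ancestor', 'A', 'mary')], rules, {}):
    print(walk('A', s))   # prints bob, then tom: the witnesses of the proofs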

Logic programming was thought of as something that could unify various strands of computer science (software engineering, databases, computer architecture and artificial intelligence). It seemed that logic programming was the key missing link between knowledge engineering and parallel computer architectures.

Results


After having influenced the consumer electronics field during the 1970s and the automotive world during the 1980s, Japan had developed a strong reputation. The launch of the FGCS project spread the belief that parallel computing was the future of all performance gains, producing a wave of apprehension in the computer field. Parallel projects were soon set up: in the US, the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC); in the UK, Alvey; and in Europe, the European Strategic Program on Research in Information Technology (ESPRIT), as well as the European Computer-Industry Research Centre (ECRC) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.

The project ran from 1982 to 1994, spending a little less than ¥57 billion (about US$320 million) in total.[5] After the FGCS project, MITI stopped funding large-scale computer research projects, and the research momentum the FGCS project had developed dissipated. However, MITI/ICOT embarked on a neural-net project[which?] in the 1990s, which some called the Sixth Generation Project, with a similar level of funding.[6] Per-year FGCS spending was less than 1% of the entire R&D expenditure of the electronics and communications equipment industry: the project's highest expenditure year was ¥7.2 billion in 1991, whereas IBM alone spent US$1.5 billion (¥370 billion) in 1982, and the industry as a whole spent ¥2,150 billion in 1990.[5]

Concurrent logic programming


In 1982, during a visit to ICOT, Ehud Shapiro invented Concurrent Prolog, a novel programming language that integrated logic programming and concurrent programming. Concurrent Prolog is a process-oriented language, which embodies dataflow synchronization and guarded-command indeterminacy as its basic control mechanisms. Shapiro described the language in ICOT Technical Report 003,[7] which presented a Concurrent Prolog interpreter written in Prolog. Shapiro's work on Concurrent Prolog inspired a change in the direction of the FGCS project, from a parallel implementation of Prolog to a focus on concurrent logic programming as the software foundation for the project.[3] It also inspired the concurrent logic programming language Guarded Horn Clauses (GHC) by Kazunori Ueda, which was the basis of KL1, the programming language that the FGCS project finally designed and implemented as its core language.
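As an illustration of these two control mechanisms, the following Python sketch emulates a GHC/KL1-style indeterminate stream merge. It is an analogy under stated assumptions, not FGCS code: queues stand in for logic variables bound incrementally to streams, and all names in it are invented for the example.

```python
# Sketch of two ideas from concurrent logic programming: dataflow
# synchronization (a process suspends until its input is bound) and
# committed-choice indeterminacy (whichever guard succeeds is committed
# to; the alternative is discarded, never backtracked into).
import queue
import threading
import time

def producer(name, out, items):
    # Bind the cells of an output "stream" one at a time.
    for x in items:
        time.sleep(0.01)              # simulate uneven work
        out.put((name, x))
    out.put(None)                     # close the stream, like binding its tail to []

def merge(xs, ys, zs):
    # Emulates the GHC clauses
    #   merge([X|Xs], Ys, [X|Zs]) :- true | merge(Xs, Ys, Zs).
    #   merge(Xs, [Y|Ys], [Y|Zs]) :- true | merge(Xs, Ys, Zs).
    inputs = [xs, ys]
    while inputs:
        progressed = False
        for s in list(inputs):
            try:
                item = s.get_nowait()  # guard: is this stream's head bound yet?
            except queue.Empty:
                continue               # guard fails; try the other clause
            progressed = True
            if item is None:
                inputs.remove(s)       # that stream ended
            else:
                zs.put(item)           # commit: emit, never retract
        if not progressed:
            time.sleep(0.001)          # suspend until some input is bound
    zs.put(None)

if __name__ == "__main__":
    xs, ys, zs = queue.Queue(), queue.Queue(), queue.Queue()
    threading.Thread(target=producer, args=("a", xs, range(3))).start()
    threading.Thread(target=producer, args=("b", ys, range(3))).start()
    threading.Thread(target=merge, args=(xs, ys, zs)).start()
    while (item := zs.get()) is not None:
        print(item)   # some interleaving of ("a", 0..2) and ("b", 0..2)
```

The interleaving printed depends on the timing of the producers, which is exactly the indeterminacy the language feature expresses.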

The FGCS project and its findings contributed greatly to the development of the concurrent logic programming field. The project produced a new generation of promising Japanese researchers.

Commercial failure


Five working Parallel Inference Machines (PIM) were eventually produced: PIM/m, PIM/p, PIM/i, PIM/k, and PIM/c.[citation needed] The project also produced applications to run on these systems, such as the parallel database management system Kappa, the legal reasoning system HELIC-II, and the automated theorem prover MGTP, as well as bioinformatics applications.

The FGCS project did not meet with commercial success, for reasons similar to those that doomed the Lisp machine companies and Thinking Machines: the highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun Microsystems workstations and Intel x86 machines).

A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem-solving language for AI applications. This never happened cleanly: a number of languages were developed, all with their own limitations. In particular, the committed-choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages.[8] The project found that the benefits of logic programming were largely negated by the use of committed choice,[citation needed] as the sketch below illustrates.
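Continuing the hypothetical Horn-clause prover from the Logic programming section above (and reusing its unify, rename, walk, solve, and rules definitions), this committed-choice variant shows the trade-off in miniature: once a clause's head unifies (head unification playing the role of a guard here), the remaining alternatives are discarded, so the search is no longer complete and answers are lost.

```python
def solve_committed(goals, rules, s, depth=0):
    # Like solve(), but commits to the first clause whose head unifies and
    # discards the remaining alternatives instead of backtracking into them.
    if not goals:
        yield s
        return
    first, rest = goals[0], goals[1:]
    for i, (head, body) in enumerate(rules):
        head2 = rename(head, f"{depth}_{i}")
        body2 = [rename(g, f"{depth}_{i}") for g in body]
        s2 = unify(first, head2, s)
        if s2 is not None:
            yield from solve_committed(body2 + rest, rules, s2, depth + 1)
            return   # commit: the other clauses are never tried

query = [('ancestor', 'A', 'mary')]
print([walk('A', s) for s in solve(query, rules, {})])            # ['bob', 'tom']
print([walk('A', s) for s in solve_committed(query, rules, {})])  # ['bob'] only
```

The complete search reads the program as logic (every ancestor of mary is an answer); the committed-choice variant reads it as a process and silently drops the second proof.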

Another problem was that existing CPU performance quickly overcame the barriers that experts anticipated in the 1980s, and the value of parallel computing dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially.

The project also failed to incorporate outside innovations. During its lifespan, GUIs became mainstream in computers; the internet enabled locally stored databases to become distributed; and even simple research projects provided better real-world results in data mining.[citation needed]

The FGCS workstations had no appeal in a market where general-purpose systems could replace and outperform them. This parallels the fate of the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary.[9]

Ahead of its time


In summary, the Fifth Generation project was revolutionary and accomplished some basic research that anticipated future research directions. Many papers and patents were published. MITI established a committee which assessed the performance of the FGCS project as having made major contributions to computing, in particular eliminating bottlenecks in parallel processing software and realizing intelligent interactive processing based on large knowledge bases. However, the committee was strongly biased toward justifying the project, so these claims overstate the actual results.[5]

Many of the themes seen in the Fifth Generation project are now being re-interpreted in current technologies, as the hardware limitations foreseen in the 1980s were finally reached in the 2000s. When clock speeds of CPUs began to move into the 3–5 GHz range, CPU power dissipation and other problems became more important. The ability of industry to produce ever-faster single-CPU systems (linked to Moore's Law about the periodic doubling of transistor counts) began to be threatened.

In the early 21st century, many flavors of parallel computing began to proliferate, including multi-core architectures at the low end and massively parallel processing at the high end. Ordinary consumer machines and game consoles began to have parallel processors such as the Intel Core, AMD K10, and Cell. Graphics card companies like Nvidia and AMD introduced large parallel programming frameworks such as CUDA and OpenCL.


References

  1. ^"Roger Clarke's Software Generations".
  2. ^J. Marshall Unger,The Fifth Generation Fallacy (New York: Oxford University Press, 1987)
  3. ^abShapiro, Ehud Y. (1983)."The fifth generation project — a trip report".Communications of the ACM.26 (9):637–641.doi:10.1145/358172.358179.S2CID 5955109.
  4. ^Van Emden, Maarten H., and Robert A. Kowalski."The semantics of predicate logic as a programming language." Journal of the ACM 23.4 (1976): 733-742.
  5. ^abcOdagiri, Hiroyuki; Nakamura, Yoshiaki; Shibuya, Minorul (1997)."Research consortia as a vehicle for basic research: The case of a fifth generation computer project in Japan".Research Policy.26 (2):191–207.doi:10.1016/S0048-7333(97)00008-5.
  6. ^MIZOGUCHI, FUMIO (14 December 2013).Prolog and its Applications: A Japanese perspective. Springer. p. ix.ISBN 978-1-4899-7144-9.
  7. ^Shapiro E. A subset of Concurrent Prolog and its interpreter, ICOT Technical Report TR-003, Institute for New Generation Computer Technology, Tokyo, 1983. Also in Concurrent Prolog: Collected Papers, E. Shapiro (ed.), MIT Press, 1987, Chapter 2.
  8. ^Carl Hewitt.Inconsistency Robustness in Logic Programming ArXiv 2009.
  9. ^Hendler, James (1 March 2008)."Avoiding Another AI Winter"(PDF).IEEE Intelligent Systems.23 (2):2–4.doi:10.1109/MIS.2008.20.S2CID 35914860. Archived fromthe original(PDF) on 12 February 2012.

External links

  • FGCS Museum – contains a large archive of nearly all of the output of the FGCS project, including technical reports, technical memoranda, hardware specifications, and software.
  • Details about 5th Generation Computer – how the computer system evolved.