The increasing capabilities of mainstream PCs since the late 1990s have blurred the distinction between PCs and workstations.[7] Typical 1980s workstations had expensive proprietary hardware and operating systems that categorically distinguished them from standardized PCs. Through the 1990s and 2000s, IBM's RS/6000 and IntelliStation lines had RISC-based POWER CPUs running AIX, versus its corporate IBM PC Series and consumer Aptiva PCs, which had Intel x86 CPUs and usually ran Microsoft Windows. By the early 2000s, however, this difference had largely disappeared, since workstations came to use highly commoditized hardware dominated by large PC vendors such as Dell, HP Inc., and Fujitsu, selling x86-64 systems running Windows or Linux.
Workstations are older than the first personal computer (PC).[8] The first computer that might qualify as a workstation is the IBM 1620, a small scientific computer designed to be used interactively by a single person sitting at the console.[9] It was introduced in 1959.[10] One peculiar feature of the machine is that it lacks any arithmetic circuitry.[11] To perform addition, it requires a memory-resident table of decimal addition rules.[12] This reduced the cost of logic circuitry, enabling IBM to make the machine inexpensive. It was codenamed CADET and was initially rented for $1,000 per month.
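The 1620's table-driven arithmetic can be sketched as follows. This is an illustrative reconstruction in Python, not actual 1620 code: on the real machine the addition table sat at fixed memory addresses and was loaded at startup, whereas here it is precomputed for the demonstration.

```python
# Illustrative sketch of addition via a memory-resident lookup table,
# the technique behind the 1620's "CADET" nickname. On the real machine
# the table was loaded into fixed memory locations; here we precompute it.

ADD_TABLE = {(a, b): divmod(a + b, 10) for a in range(10) for b in range(10)}
# ADD_TABLE[(a, b)] = (carry, sum_digit)

def table_add(x, y):
    """Add two non-negative integers using only digit lookups and carries."""
    xd = [int(d) for d in str(x)][::-1]   # least-significant digit first
    yd = [int(d) for d in str(y)][::-1]
    result, carry = [], 0
    for i in range(max(len(xd), len(yd))):
        a = xd[i] if i < len(xd) else 0
        b = yd[i] if i < len(yd) else 0
        c1, s = ADD_TABLE[(a, b)]         # digit sum via table lookup
        c2, s = ADD_TABLE[(s, carry)]     # fold in the incoming carry
        carry = c1 or c2                  # at most one carry can occur
        result.append(s)
    if carry:
        result.append(carry)
    return int("".join(map(str, result[::-1])))

print(table_add(1959, 1620))  # → 3579
```

Each digit sum is resolved purely by table lookup, mirroring how the 1620 traded arithmetic circuitry for memory.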
In 1965, the IBM 1130 scientific computer became the successor to the 1620. Both of these systems run Fortran and other languages.[13] They are built into roughly desk-sized cabinets, with console typewriters. They have optional add-on disk drives, printers, and both paper-tape and punched-card I/O.
Workstations have historically been more advanced than contemporary PCs, with more powerful CPU architectures, earlier networking, more advanced graphics, more memory, and multitasking under sophisticated operating systems like Unix. Because of their minicomputer heritage, workstations from the start ran professional and expensive software such as CAD and graphic design packages, as opposed to the games and text editors of PCs.[8] The Lisp machines developed at MIT in the early 1970s pioneered some workstation principles, as high-performance, networked, single-user systems intended for heavily interactive use. Lisp machines were commercialized beginning in 1980 by companies like Symbolics, Lisp Machines, Texas Instruments (the TI Explorer), and Xerox (the Interlisp-D workstations). The first computer designed for a single user with high-resolution graphics (and so a workstation in the modern sense) was the Alto, developed at Xerox PARC in 1973.[15] Other early workstations include the Terak 8510/a (1977),[16] the Three Rivers PERQ (1979), and the later Xerox Star (1981).
In the early 1980s, with the advent of 32-bit microprocessors such as the Motorola 68000, several new competitors appeared, including Apollo Computer and Sun Microsystems,[17] with workstations based on the 68000 and Unix.[18][19] Meanwhile, DARPA's VLSI Project created several spinoff graphics products, such as the Silicon Graphics 3130. Target markets were differentiated, with Sun and Apollo considered network workstations and SGI a graphics workstation vendor. RISC CPUs proliferated in the mid-1980s and became typical of workstation vendors.[20] Competition between RISC vendors lowered CPU prices to as little as $10 per MIPS, much less expensive than the Intel 80386;[21] after large price cuts in 1987 and 1988, a personal workstation suitable for 2D CAD costing $5,000 (equivalent to $13,000 in 2024) to $25,000 (equivalent to $63,000 in 2024) was available from multiple vendors. Mid-range models capable of 3D graphics cost from $35,000 (equivalent to $89,000 in 2024) to $60,000 (equivalent to $152,000 in 2024), while high-end models overlapping with minicomputers cost from $80,000 (equivalent to $203,000 in 2024) to $100,000 (equivalent to $254,000 in 2024) or more.[22]
By then a $12,000 (equivalent to $30,000 in 2024) "personal workstation" might be a high-end PC like the Macintosh II or IBM PS/2 Model 80, a low-end workstation, or a hybrid device like the NeXT Computer, all with similar, overlapping specifications.[8] One differentiator between PC and workstation was that the latter was much more likely to have a graphics accelerator with support for a graphics standard like PHIGS or X Window, while the former usually depended on software rendering or proprietary accelerators. The computer animation industry's needs typically drove improvements in graphical technology, with CAD adopting the same improvements later.[22] BYTE predicted in 1989 that "Soon, the only way we'll be able to tell the difference between traditional workstations and PCs will be by the operating system they run", with the former running Unix and the latter running OS/2, classic Mac OS, and/or Unix. Many workstations by then had some method to run increasingly popular and powerful PC software such as Lotus 1-2-3 or Microsoft Word.[8] The magazine demonstrated that year that an individual could build a workstation from commodity components with specifications comparable to commercially available low-end workstations.[23]
Workstations often featured SCSI or Fibre Channel disk storage systems, high-end 3D accelerators, single or multiple 64-bit processors,[24] large amounts of RAM, and well-designed cooling. Additionally, the companies that make the products tend to have comprehensive repair/replacement plans. As the distinction between workstation and PC fades, however, workstation manufacturers have increasingly employed "off-the-shelf" PC components and graphics solutions rather than proprietary hardware or software. Some "low-cost" workstations are still expensive by PC standards but offer binary compatibility with higher-end workstations and servers made by the same vendor. This allows software development to take place on low-cost (relative to the server) desktop machines.
Workstations also diversified toward the lowest possible price point rather than maximum performance, producing the thin client or network computer. Dependent upon a network and server, such a machine is reduced to a CPU, keyboard, mouse, and screen, with no hard drive. Some diskless nodes still run a traditional operating system and perform computations locally, with storage on a remote server.[25] These are intended to reduce the initial system purchase cost and the total cost of ownership by reducing the amount of administration required per user.[26]
This approach was first attempted as a replacement for PCs in office productivity applications, with the 3Station by 3Com. In the 1990s, X terminals filled a similar role for technical computing. Sun's thin clients include the Sun Ray product line.[27] However, traditional workstations and PCs continued to drop in price and complexity as remote management tools for IT staff became available, undercutting this market.
Image captions: a NeXTstation graphics workstation from 1990; a Sony NEWS workstation (two 68030 CPUs at 25 MHz, 1280×1024-pixel 256-color display); an SGI Indy graphics workstation; an SGI O2 graphics workstation; an HP C8000 workstation running HP-UX 11i with CDE; six workstations (four HP Z620, one HP Z820, one HP Z420).
A high-end workstation of the early 1980s had the three Ms of a "3M computer" (a term coined by Raj Reddy and his colleagues at CMU): one megabyte of RAM, a megapixel display (roughly 1000×1000 pixels), and one "MegaFLOPS" of compute performance (at least one million floating-point operations per second).[28] RFC 782 defines the workstation environment more generally as "hardware and software dedicated to serve a single user", provisioning additional shared resources as needed. This is at least one order of magnitude beyond the capacity of the personal computer of the time. The original 1981 IBM Personal Computer has 16 KB of memory, a text-only display, and floating-point performance of around 1 kFLOPS (30 kFLOPS with the optional 8087 math coprocessor). Other features beyond the typical personal computer include networking, graphics acceleration, and high-speed internal and peripheral data buses.
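The "order of magnitude" claim can be checked with a quick back-of-the-envelope calculation using the figures quoted above (taken from the text, not measured):

```python
# Rough comparison of the "3M" workstation target with the base 1981 IBM PC.
# All figures are the illustrative ones quoted in the text.

M3_RAM_BYTES = 1_000_000   # one megabyte
M3_FLOPS     = 1_000_000   # one MegaFLOPS
PC_RAM_BYTES = 16 * 1024   # 16 KB base configuration
PC_FLOPS     = 1_000       # ~1 kFLOPS without the 8087

ram_ratio   = M3_RAM_BYTES / PC_RAM_BYTES   # ≈ 61x
flops_ratio = M3_FLOPS / PC_FLOPS           # 1000x

print(f"RAM: {ram_ratio:.0f}x, FLOPS: {flops_ratio:.0f}x")
```

Both ratios exceed one order of magnitude, consistent with the comparison above.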
Another goal was to bring the price below one "megapenny", that is, less than $10,000 (equivalent to $29,000 in 2024), which was achieved in the late 1980s. Throughout the early to mid-1990s, many workstations cost from $15,000 to $100,000 (equivalent to $206,000 in 2024) or more.
The more widespread adoption of these technologies into mainstream PCs was a direct factor in the decline of the workstation as a separate market segment:[29]
High-performance CPUs: the first RISC processors of the early 1980s offered roughly one order of magnitude in performance improvement over CISC processors of comparable cost. Intel's x86 CISC family always had the edge in market share and the economies of scale that this implied. By the mid-1990s, some CISC processors like the Motorola 68040 and Intel's 80486 and Pentium reached performance parity with RISC in some areas, such as integer performance (at the cost of greater chip complexity) and hardware floating-point calculations, relegating RISC to even more high-end markets.[30]
Hardware support for floating-point operations: optional on the original IBM PC, and on a separate chip for Intel systems until the 80486DX processor. Even then, x86 floating-point performance lagged other processors due to limitations in its architecture. Today even low-price PCs have performance in the gigaFLOPS range.
High-performance/high-capacity data storage: early workstations tended to use proprietary disk interfaces until the SCSI standard of the mid-1980s. Although SCSI interfaces soon became available for IBM PCs, they were comparatively expensive and tended to be limited by the speed of the PC's ISA peripheral bus. SCSI is an advanced controller interface well suited to multitasking and daisy chaining. This makes it a good fit for servers, and its benefits to desktop PCs, which mostly run single-user operating systems, are less clear, though it was standard on Macintoshes of the 1980s and 1990s. Serial ATA is more modern, with throughput comparable to SCSI but at a lower cost.
High-speednetworking (10 Mbit/s or better): 10 Mbit/s network interfaces were commonly available for PCs by the early 1990s, although by that time workstations were pursuing even higher networking speeds, moving to 100 Mbit/s, 1 Gbit/s, and 10 Gbit/s. However, economies of scale and the demand for high-speed networking in even non-technical areas have dramatically decreased the time it takes for newer networking technologies to reach commodity price points.
Large displays (17- to 21-inch) with high resolutions and high refresh rates for graphics and CAD work, which were rare among PCs in the late 1980s and early 1990s but became common among PCs by the late 1990s.
Large memory configurations: PCs (such as IBM clones) were originally limited to 640 KB of RAM until the 1982 introduction of the 80286 processor, while early workstations had megabytes of memory. IBM clones required special programming techniques to address more than 640 KB until the 80386, as opposed to other 32-bit processors such as SPARC, which provided straightforward access to nearly their entire 4 GB memory address range. 64-bit workstations and servers supporting an address range far beyond 4 GB have been available since the early 1990s, a technology just beginning to appear in the PC desktop and server market in the mid-2000s.
Operating system: early workstations ran the Unix operating system (OS), a Unix-like variant, or an unrelated equivalent OS such as VMS. The PC CPUs of the time had limitations in memory capacity and memory access protection that made them unsuitable for OSes of this sophistication, but this, too, began to change in the late 1980s as PCs with the 32-bit 80386 and its integrated paged MMU became widely affordable, enabling OS/2, Windows NT 3.1, and Unix-like systems based on BSD and Linux to run on commodity PC hardware.
Tight integration between the OS and the hardware: workstation vendors both design the hardware and maintain the Unix operating system variant that runs on it. This allows for much more rigorous testing than is possible with an operating system such as Windows, which requires third-party hardware vendors to write compliant, stable, and reliable drivers. Minor variations in hardware quality, such as timing or build quality, can also affect the reliability of the overall machine. Workstation vendors can ensure both the quality of the hardware and the stability of the operating system drivers by validating them in-house, which leads to a generally much more reliable and stable machine.
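The 640 KB limit in the memory item above stems from the 8086's real-mode segmented addressing, a well-documented scheme sketched here for illustration (Python used purely as a calculator):

```python
# 8086 real-mode addressing sketch: a 16-bit segment and a 16-bit offset
# combine into a 20-bit physical address, so at most 1 MB is directly
# addressable. The IBM PC design reserved the region above 640 KB
# (0xA0000 upward) for video memory and ROM, leaving 640 KB for programs.

def real_mode_address(segment: int, offset: int) -> int:
    """Physical address = segment * 16 + offset, truncated to 20 bits."""
    return ((segment << 4) + offset) & 0xFFFFF

print(hex(real_mode_address(0xA000, 0x0000)))  # 0xa0000 = 640 KB boundary
print(hex(real_mode_address(0xFFFF, 0x0010)))  # wraps back to 0x0 at the 1 MB limit
```

Reaching memory beyond this window required bank-switching or protected-mode techniques, which is why megabyte-class configurations came so much more naturally to the flat-address-space workstation CPUs of the era.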
Since the late 1990s, the workstation and consumer markets have further merged. Many low-end workstation components are now the same as those in the consumer market, and the price differential has narrowed. For example, most Macintosh Quadra computers were originally intended for scientific or design work, all with the Motorola 68040 CPU, backward compatible with 68000 Macintoshes. The consumer Macintosh IIcx and Macintosh IIci models can be upgraded to the Quadra 700. "In an era when many professionals preferred Silicon Graphics workstations, the Quadra 700 was an intriguing option at a fraction of the cost" as resource-intensive software such as Infini-D brought "studio-quality 3D rendering and animations to the home desktop". The Quadra 700 can run A/UX 3.0, making it a Unix workstation.[31] Another example is the Nvidia GeForce 256 consumer graphics card, which spawned the Quadro workstation card, which has the same GPU but different driver support, certifications for CAD applications, and a much higher price.
Workstations have typically driven advancements in CPU technology. All computers benefit from multi-processor and multicore designs (essentially, multiple processors on a die). The multicore design was pioneered by IBM's POWER4; it and the Intel Xeon offer multiple CPUs, more on-die cache, and ECC memory.
Some workstations are designed or certified for use with only one specific application, such as AutoCAD, Avid Xpress Studio HD, or 3D Studio Max. The certification process increases workstation prices.
SGI ended general availability of its MIPS-based SGI Fuel and SGI Tezro workstations in December 2006.[35]
Sun Microsystems announced end-of-life for its last Sun Ultra SPARC workstations in October 2008.[36]
In early 2018, RISC workstations were reintroduced in a series of IBM POWER9-based systems by Raptor Computing Systems.[37][38] In October 2024, System76 introduced the Thelio Astra, an Arm-based workstation aimed at the autonomous vehicle industry.[39]
Most of the current workstation market uses x86-64 microprocessors. Operating systems include Windows, FreeBSD, Linux distributions, macOS, and Solaris.[40] Some vendors also market commodity mono-socket systems as workstations.
There are three types of workstations:
Workstation blade systems (e.g., the IBM HC10 or Hewlett-Packard xw460c; the Sun Visualization System is akin to these solutions)[41]
Deskside systems containing server-class CPUs and chipsets on large server-class motherboards with high-end RAM (HP Z-series workstations and Fujitsu CELSIUS workstations)
A high-end desktop market segment includes workstations, with PC operating systems and components. Component product lines may be segmented, with premium components that are functionally similar to the consumer models but with higher robustness or performance.[42]
A workstation-class PC may have some of the following features:
^ Hey, Anthony J. G. (2015). The Computing Universe: A Journey Through a Revolution. Gyuri Pápay. Cambridge University Press. ISBN 978-1-316-12976-0. OCLC 899007268.