VGA was the last IBM graphics standard to which the majority of IBM PC compatible computer manufacturers conformed, making it the lowest common denominator that virtually all post-1990 PC graphics hardware can be expected to implement.[6]
VGA was adapted into many extended forms by third parties, collectively known as Super VGA,[7] then gave way to custom graphics processing units which, in addition to their proprietary interfaces and capabilities, continue to implement common VGA graphics modes and interfaces to the present day.
The VGA analog interface standard has been extended to support resolutions of up to 2048 × 1536 for general usage, with specialized applications improving it further still.[specify][8]
This small part count allowed IBM to include VGA directly on the PS/2 motherboard, in contrast to prior IBM PC models – PC, PC/XT, and PC AT – which required a separate display adapter installed in a slot in order to connect a monitor. The term "array" rather than "adapter" in the name denoted that it was not a complete independent expansion device, but a single component that could be integrated into a system.[11]
Unlike the graphics adapters that preceded it (MDA, CGA, EGA and many third-party options) there was initially no discrete VGA card released by IBM. The first commercial implementation of VGA was a built-in component of the IBM PS/2, in which it was accompanied by 256 KiB of video RAM, and a new DE-15 connector replacing the DE-9 used by previous graphics adapters. IBM later released the standalone IBM PS/2 Display Adapter, which utilized the VGA but could be added to machines that did not have it built in.[12][11]
On some machines and cables, pin 9 was missing. Pin 9 supplies power to an EEPROM in the monitor that tells the graphics card the capabilities of the monitor. Systems or cables missing this pin are likely using an older variant of the VGA connector.
[Image captions: simulated VGA 640 × 480 16-color image; simulated VGA 320 × 200 256-color image (corrected for aspect ratio); comparison of standard resolutions including VGA's 640 × 480.]
The VGA supports all graphics modes supported by the MDA, CGA and EGA cards, as well as multiple new modes.
The 640 × 480 16-color and 320 × 200 256-color modes had fully redefinable palettes, with each entry selected from an 18-bit (262,144-color) gamut.[15][16][17][18]
The other modes defaulted to standard EGA or CGA compatible palettes and instructions, but still permitted remapping of the palette with VGA-specific commands.
The 640 × 480 resolution (at 256 colors rather than 16) was originally used by IBM in PGC graphics (for which VGA offers no backward compatibility) but did not see wide adoption until VGA was introduced. As the VGA began to be cloned in great quantities by manufacturers who added ever-increasing capabilities, its 640 × 480, 16-color mode became the de facto lowest common denominator of graphics cards. By the mid-1990s, a 640 × 480, 16-color graphics mode using the VGA memory and register specifications was expected by operating systems such as Windows 95 and OS/2 Warp 3.0, which provided no support for lower resolutions or bit depths, nor for other memory or register layouts, without additional drivers. Well into the 2000s, even after the VESA standard for graphics cards became commonplace, the "VGA" graphics mode remained a compatibility option for PC operating systems.
Nonstandard display modes can be implemented, with horizontal resolutions of:
512 to 800 pixels wide, in 16 colors
256 to 400 pixels wide, in 256 colors
And heights of:
200, or 350 to 410 lines (including 400-line) at 70 Hz refresh rate, or
224 to 256, or 448 to 512 lines (including 240 or 480-line) at 60 Hz refresh rate
512 to 600 lines at reduced vertical refresh rates (down to 50 Hz, and including e.g. 528, 544, 552, 560, 576-line), depending on individual monitor compatibility.
For example, high resolution modes with square pixels are available at 768 × 576 or 704 × 528 in 16 colors, or medium-low resolution at 320 × 240 with 256 colors. Alternatively, extended resolution is available with "fat" pixels and 256 colors using, e.g. 400 × 600 (50 Hz) or 360 × 480 (60 Hz), and "thin" pixels, 16 colors and the 70 Hz refresh rate with e.g. the 736 × 410 mode.
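The reduced refresh rates follow directly from the fixed line rate: the frame rate is the roughly 31.469 kHz horizontal rate divided by the total number of scan lines, active plus vertical blanking. A minimal sketch of this arithmetic in C, assuming illustrative blanking figures (actual totals depend on the chosen CRTC values):

```c
#include <stdio.h>

/* Sketch: estimate vertical refresh for a tweaked VGA mode.
 * Assumes the standard ~31.469 kHz line rate (25.175 MHz / 800
 * clocks per line); the blanking line counts used below are
 * illustrative, not taken from any particular CRTC setup. */
static double refresh_hz(int active_lines, int blank_lines)
{
    const double line_rate_hz = 25175000.0 / 800.0; /* ~31469 Hz */
    return line_rate_hz / (active_lines + blank_lines);
}

int main(void)
{
    printf("480 lines: ~%.1f Hz\n", refresh_hz(480, 45)); /* ~59.9 Hz */
    printf("600 lines: ~%.1f Hz\n", refresh_hz(600, 28)); /* ~50.1 Hz */
    return 0;
}
```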
"Narrow" modes such as256 × 224 tend to preserve the same pixel ratio as in e.g.320 × 240 mode unless the monitor is adjusted to stretch the image out to fill the screen, as they are derived simply by masking down the wider mode instead of altering pixel or line timings, but can be useful for reducing memory requirements and pixel addressing calculations for arcade game conversions or console emulators.
The PC version of Pinball Fantasies has the option to use non-standard, "high res" modes, such as 640 × 350, allowing it to display a larger portion of the pinball table on screen.[19] The game Scorched Earth uses a default resolution of 360 × 480, with many other nonstandard resolutions available.[20]
80 × 25, rendered with a 9 × 16 pixel font, with an effective resolution of 720 × 400[21]
40 × 25, with a 9 × 16 font, with an effective resolution of 360 × 400
80 × 43 or 80 × 50, with an 8 × 8 font grid, with an effective resolution of 640 × 344 or 640 × 400 pixels.
As with the pixel-based graphics modes, additional text modes are possible by programming the VGA correctly, with an overall maximum of about 100 × 80 cells and an active area spanning about 88 × 64 cells.
One variant that is sometimes seen is 80 × 30 or 80 × 60, using an 8 × 16 or 8 × 8 font and an effective 640 × 480 pixel display, which accepts the more flickery 60 Hz mode in exchange for an additional 5 or 10 lines of text and square character blocks (or, at 80 × 30, square half-blocks).
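By way of illustration, the common 80 × 50 variant is usually reached from the standard 80 × 25 color text mode by asking the video BIOS to load its 8 × 8 ROM font. A minimal real-mode sketch, assuming a DOS-era compiler that provides Borland-style int86() and union REGS from <dos.h>:

```c
#include <dos.h>

/* Sketch: switch to 80x50 text on VGA by loading the 8x8 ROM font.
 * Assumes a 16-bit DOS compiler providing int86() and union REGS. */
void set_80x50_text(void)
{
    union REGS r;

    r.x.ax = 0x0003;      /* INT 10h: set standard 80x25 color text mode */
    int86(0x10, &r, &r);

    r.x.ax = 0x1112;      /* INT 10h AH=11h AL=12h: load ROM 8x8 font   */
    r.h.bl = 0;           /* font block 0                                */
    int86(0x10, &r, &r);  /* 400 scan lines / 8-line cells -> 50 rows    */
}
```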
Unlike the cards that preceded it, which used binary TTL signals to interface with a monitor (and also composite, in the case of the CGA), the VGA introduced a video interface using pure analog RGB signals, with a maximum range of 0.7 volts peak-to-peak. In conjunction with an 18-bit RAMDAC (6 bits per RGB channel), this produced a color gamut of 262,144 colors.[15][16][17][18]
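Six bits per channel is what yields 2^6 × 2^6 × 2^6 = 262,144 possible colors, so software working with 8-bit-per-channel values must discard the two least significant bits of each channel before loading the DAC. A small illustrative helper (the type and function names are only for this sketch):

```c
/* Sketch: reduce an 8-bit-per-channel RGB color to the 6-bit-per-channel
 * values the VGA DAC expects (0..63 per channel, 2^18 = 262,144 combinations). */
typedef struct { unsigned char r, g, b; } Rgb6;

static Rgb6 to_vga_dac(unsigned char r8, unsigned char g8, unsigned char b8)
{
    Rgb6 c;
    c.r = r8 >> 2;   /* 0..255 -> 0..63 */
    c.g = g8 >> 2;
    c.b = b8 >> 2;
    return c;
}
```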
The original VGA specifications follow:
Selectable 25.175 MHz[22] or 28.322 MHz master pixel clock
Maximum of 640 horizontal pixels[23] in graphics mode, and 720 pixels in text mode
The intended standard value for the horizontal frequency of VGA's 640 × 480 mode is exactly double the value used in the NTSC-M video system, as this made it much easier to offer optional TV-out solutions or external VGA-to-TV converter boxes at the time of VGA's development. It is also at least nominally twice that of CGA, which also supported composite monitors.
All "derived" VGA timings (i.e. those which use the master 25.175 and 28.322 MHz crystals and, to a lesser extent, the nominal 31.469 kHz line rate) can be varied by software that bypasses the VGA firmware interface and communicates directly with the VGA hardware, as many MS-DOS based games did. However, only the standard modes, or modes that at least use almost exactly the same H-sync and V-sync timings as one of the standard modes, can be expected to work with the original late-1980s and early-1990s VGA monitors. The use of other timings may in fact damage such monitors and thus was usually avoided by software publishers.
Third-party "multisync" CRT monitors were more flexible, and in combination with "super EGA", VGA, and later SVGA graphics cards using extended modes, could display a much wider range of resolutions and refresh rates at arbitrary sync frequencies and pixel clock rates.
For the most common VGA mode (640 × 480, 60 Hz, non-interlaced), the horizontal timings can be found in the HP Super VGA Display Installation Guide and in other places.[25][26]
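For orientation, the commonly published figures for that mode are a 25.175 MHz pixel clock with 800 clocks per scan line and 525 lines per frame, which yields the ~31.469 kHz horizontal and ~59.94 Hz vertical rates (the line rate being, as noted above, roughly twice that of NTSC-M). A sketch that encodes these widely cited values and derives the rates; the struct layout is purely illustrative:

```c
#include <stdio.h>

/* Commonly published timing for VGA 640x480 @ 60 Hz, non-interlaced.
 * Horizontal: 640 active + 16 front porch + 96 sync + 48 back porch = 800 clocks.
 * Vertical:   480 active + 10 front porch +  2 sync + 33 back porch = 525 lines. */
struct vga_timing {
    double pixel_clock_hz;
    int h_active, h_front, h_sync, h_back;
    int v_active, v_front, v_sync, v_back;
};

static const struct vga_timing vga_640x480_60 = {
    25175000.0, 640, 16, 96, 48, 480, 10, 2, 33
};

int main(void)
{
    const struct vga_timing *t = &vga_640x480_60;
    int h_total = t->h_active + t->h_front + t->h_sync + t->h_back; /* 800 */
    int v_total = t->v_active + t->v_front + t->v_sync + t->v_back; /* 525 */
    double h_khz = t->pixel_clock_hz / h_total / 1000.0;  /* ~31.469 kHz */
    double v_hz  = t->pixel_clock_hz / h_total / v_total; /* ~59.94 Hz  */
    printf("H: %.3f kHz, V: %.2f Hz\n", h_khz, v_hz);
    return 0;
}
```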
640 × 400 @ 70 Hz is traditionally the video mode used for booting VGA-compatible x86 personal computers[27] that show a graphical boot screen, while text-mode boot uses 720 × 400 @ 70 Hz.
This convention has been eroded in recent years, however, with POST and BIOS screens moving to higher resolutions, taking advantage of EDID data to match the resolution to a connected monitor.[citation needed]
640 × 480 @ 60 Hz is the default Windows graphics mode (usually with 16 colors),[27] up to Windows 2000. It remains an option in XP and later versions[citation needed] via the boot menu "low resolution video" option and per-application compatibility mode settings, despite newer versions of Windows now defaulting to 1024 × 768 and generally not allowing any resolution below 800 × 600 to be set.
The need for such a low-quality, universally compatible fallback has diminished since the turn of the millennium, as VGA-signalling standard screens or adaptors unable to show anything beyond the original resolutions have become increasingly rare.[clarify]
320 × 200 at 70 Hz was the most common mode for early 1990s PC games, with pixel-doubling and line-doubling performed in hardware to present a 640 × 400 at 70 Hz signal to the monitor.
The Windows 95/98/Me LOGO.SYS boot-up image was 320 × 400 resolution, displayed with pixel-doubling to present a 640 × 400 at 70 Hz signal to the monitor. The 400-line signal was the same as the standard 80 × 25 text mode, which meant that pressing Esc to return to text mode didn't change the frequency of the video signal, and thus the monitor did not have to resynchronize (which could otherwise have taken several seconds).[citation needed]
All VGA connectors carry analog RGBHV (red, green, blue, horizontal sync, vertical sync) video signals. Modern connectors also include VESA DDC pins, for identifying attached display devices.
Because VGA uses low-voltage analog signals, signal degradation becomes a factor with low-quality or overly long cables. Solutions include shielded cables, cables that include a separate internal coaxial cable for each color signal, and "broken out" cables utilizing a separate coaxial cable with a BNC connector for each color signal.
BNC breakout cables typically use five connectors, one each for Red, Green, Blue, Horizontal Sync, and Vertical Sync, and do not include the other signal lines of the VGA interface. With BNC, the coaxial wires are fully shielded end-to-end and through the interconnect so that virtually no crosstalk and very little external interference can occur. The use of BNC RGB video cables predates VGA in other markets and industries.
[Image captions: VGA 256-color default palette; VGA palette organised into 4 groups; examples of VGA images in 640 × 480 with 16 colors and 320 × 200 with 256 colors (bottom), with dithering used to mask color limitations.]
The VGA color system uses register-based palettes to map colors in various bit depths to its 18-bit output gamut. It is backward compatible with the EGA and CGA adapters, but supports extra bit depth for the palette when in these modes.
For instance, when in EGA 16-color modes, VGA offers 16 palette registers, and in 256-color modes, it offers 256 registers.[28] Each palette register contains a 3 × 6-bit RGB value, selecting a color from the 18-bit gamut of the DAC.
These color registers are initialized to default values IBM expected to be most useful for each mode. For instance, EGA 16-color modes initialize to the default CGA 16-color palette, and the 256-color mode initializes to a palette consisting of 16 CGA colors, 16 grey shades, and then 216 colors chosen by IBM to fit expected use cases.[29][30] After initialization they can be redefined at any time without altering the contents of video RAM, permitting palette cycling.
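In practice the palette is rewritten through the DAC's I/O ports: the entry index is written to port 0x3C8 and three successive 6-bit values (red, green, blue) to port 0x3C9, after which the index auto-increments. A minimal real-mode sketch, assuming a DOS-era compiler that provides Borland-style outportb() from <dos.h>:

```c
#include <dos.h>

/* Sketch: load one palette entry on the VGA DAC.
 * Assumes a DOS compiler providing outportb(); r, g, b are 6-bit (0..63). */
static void set_palette_entry(unsigned char index,
                              unsigned char r, unsigned char g, unsigned char b)
{
    outportb(0x3C8, index); /* DAC write index        */
    outportb(0x3C9, r);     /* red component, 0..63   */
    outportb(0x3C9, g);     /* green component, 0..63 */
    outportb(0x3C9, b);     /* blue component, 0..63  */
}

/* Palette cycling: rewriting entries each frame (e.g. rotating a range of
 * indices by one step) animates on-screen colors without touching video RAM. */
```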
In the 256-color modes, the DAC is set to combine four 2-bit color values, one from each plane, into an 8-bit value representing an index into the 256-color palette. The CPU interface combines the 4 planes in the same way, a feature called "chain-4", so that each pixel appears to the CPU as a packed 8-bit value representing the palette index.[31]
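Under chain-4, plotting a pixel is therefore just a byte store: the offset into the 64 KB graphics window at segment 0xA000 (described in the memory section below) is y × 320 + x, and the byte written is the palette index. A sketch for a 16-bit real-mode DOS compiler; MK_FP and the far keyword are Borland-era conventions:

```c
#include <dos.h>

/* Sketch: plot one pixel in 320x200 256-color mode 13h.
 * Assumes real-mode far pointers (MK_FP from <dos.h>) and that the
 * BIOS has already been asked to set mode 13h. */
static void put_pixel_13h(int x, int y, unsigned char color_index)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
    vram[(unsigned)y * 320u + (unsigned)x] = color_index; /* palette index */
}
```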
The video memory of the VGA is mapped to the PC's memory via a window in the range between segments 0xA0000 and 0xBFFFF in the PC's real mode address space (A000:0000 and B000:FFFF in segment:offset notation). Typically, these starting segments are:
0xA0000 for EGA/VGA graphics modes (64 KB)
0xB0000 for monochrome text mode (32 KB)
0xB8000 for color text mode and CGA-compatible graphics modes (32 KB); a direct-write example is sketched below
A typical VGA card also responds to the following port-mapped I/O address range:
0x3B0 to 0x3DF
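Because the color text buffer begins at segment 0xB800, a program can place characters on screen by writing character/attribute byte pairs directly into that window. A minimal real-mode sketch, again assuming Borland-style far pointers from <dos.h>:

```c
#include <dos.h>

/* Sketch: write one character cell in 80x25 color text mode.
 * Each cell is two bytes at segment 0xB800: character code, then attribute
 * (low nibble = foreground color, high nibble = background/blink). */
static void put_char_cell(int col, int row, char ch, unsigned char attr)
{
    unsigned char far *text = (unsigned char far *)MK_FP(0xB800, 0);
    unsigned offset = ((unsigned)row * 80u + (unsigned)col) * 2u;
    text[offset]     = (unsigned char)ch;
    text[offset + 1] = attr;   /* e.g. 0x1F = bright white on blue */
}
```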
Due to the use of different address mappings for different modes, it is possible to have a monochrome adapter (i.e. MDA or Hercules) and a color adapter such as the VGA, EGA, or CGA installed in the same machine.
At the beginning of the 1980s, this was typically used to display Lotus 1-2-3 spreadsheets in high-resolution text on a monochrome display and associated graphics on a low-resolution CGA display simultaneously. Many programmers also used such a setup with the monochrome card displaying debugging information while a program ran in graphics mode on the other card. Several debuggers, like Borland's Turbo Debugger, D86 and Microsoft's CodeView, could work in a dual monitor setup. Either Turbo Debugger or CodeView could be used to debug Windows.
There were also device drivers such as ox.sys, which implemented a serial interface simulation on the monochrome display and, for example, allowed the user to receive crash messages from debugging versions of Windows without using an actual serial terminal.
It is also possible to use the "MODE MONO" command at the command prompt to redirect the output to the monochrome display. When a monochrome adapter was not present, it was possible to use the 0xB000–0xB7FF address space as additional memory for other programs.
"Unchaining" the 256 KB VGA memory into four separate "planes" makes VGA's 256 KB of RAM available in 256-color modes. There is a trade-off for extra complexity and performance loss in some types of graphics operations, but this is mitigated by other operations becoming faster in certain situations:
Single-color polygon filling could be accelerated due to the ability to set four pixels with a single write to the hardware.[33]
The video adapter could assist in copying video RAM regions, which was sometimes faster than doing this with the relatively slow CPU-to-VGA interface.
The use of multiple video pages in hardware allowed double buffering, triple buffering or split screens, which, while available in VGA's 320 × 200 16-color mode, was not possible using stock Mode 13h.
Most particularly, several higher, arbitrary-resolution display modes were possible, all the way up to the programmable limit of 800 × 600 with 16 colors (or 400 × 600 with 256 colors), as well as other custom modes using unusual combinations of horizontal and vertical pixel counts in either color mode.
Software such as Fractint, Xlib and ColoRIX also supported tweaked 256-color modes on standard adaptors, using freely combinable widths of 256, 320, and 360 pixels and heights of 200, 240 and 256 (or 400, 480 and 512) lines, extending still further to 384- or 400-pixel columns and 576 or 600 (or 288 or 300) lines. However, 320 × 240 was the best known and most frequently used, as it offered a standard 40-column resolution and a 4:3 aspect ratio with square pixels. The "320 × 240 × 8" resolution was commonly called Mode X, the name used by Michael Abrash when he presented the resolution in Dr. Dobb's Journal.
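A typical unchaining sequence clears the chain-4 bit in the Sequencer's Memory Mode register and then selects the target plane for each write through the Map Mask register; a pixel at column x lives in plane x mod 4, at byte offset y × (width / 4) + x / 4. A condensed sketch, assuming Borland-style outportb()/MK_FP(), with mode 13h already set and the additional CRTC reprogramming for a 240-line "Mode X" frame omitted:

```c
#include <dos.h>

/* Sketch: "unchain" 320x200 mode 13h into four planes and plot a pixel.
 * Register values are the commonly published ones; CRTC timing changes
 * for a 240-line frame are not shown. Assumes a DOS compiler with
 * outportb() and MK_FP(). */
static void unchain_vga(void)
{
    outportb(0x3C4, 0x04); outportb(0x3C5, 0x06); /* Sequencer Memory Mode: chain-4 off  */
    outportb(0x3D4, 0x14); outportb(0x3D5, 0x00); /* CRTC Underline: doubleword mode off */
    outportb(0x3D4, 0x17); outportb(0x3D5, 0xE3); /* CRTC Mode Control: byte addressing  */
}

static void put_pixel_unchained(int x, int y, unsigned char color_index)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
    outportb(0x3C4, 0x02);                  /* Sequencer Map Mask register */
    outportb(0x3C5, 1 << (x & 3));          /* enable only plane x mod 4   */
    vram[(unsigned)y * 80u + ((unsigned)x >> 2)] = color_index;
}
```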
The highest resolution modes were only used in special, opt-in cases rather than as standard, especially where high line counts were involved. Standard VGA monitors had a fixed line scan (H-scan) rate – "multisync" monitors being, at the time, expensive rarities – and so the vertical/frame (V-scan) refresh rate had to be reduced in order to accommodate them, which increased visible flicker and thus eye strain. For example, the highest 800 × 600 mode, being otherwise based on the matching SVGA resolution (with 628 total lines), reduced the refresh rate from 60 Hz to about 50 Hz (and 832 × 624, the theoretical maximum resolution achievable with 256 KB at 16 colors, would have reduced it to about 48 Hz, barely higher than the rate at which XGA monitors employed a double-frequency interlacing technique to mitigate full-frame flicker).
These modes were also outright incompatible with some monitors, producing display problems such as picture detail disappearing into overscan (especially in the horizontal dimension), vertical roll, poor horizontal sync or even a complete lack of picture depending on the exact mode attempted. Due to these potential issues, most VGA tweaks used in commercial products were limited to more standards-compliant, "monitor-safe" combinations, such as 320 × 240 (square pixels, three video pages, 60 Hz), 320 × 400 (double resolution, two video pages, 70 Hz), and 360 × 480 (highest resolution compatible with both standard VGA monitors and cards, one video page, 60 Hz) in 256 colors, or double the horizontal resolution in 16-color mode.
Extended Graphics Array (XGA) is an IBM display standard introduced in 1990. Later it became the most common appellation of the 1024 × 768 pixel display resolution.
Dr. Jon Peddie (12 March 2019). "Famous Graphics Chips: IBM's VGA. The VGA was the most popular graphics chip ever". Retrieved 2020-04-13. "It is said about airplanes that the DC3 and 737 are the most popular planes ever built, and the 737, in particular, the best-selling airplane ever. The same could be said for the ubiquitous VGA, and its big brother the XGA. The VGA, which can still be found buried in today's modern GPUs and CPUs, set the foundation for a video standard, and an application programming standard."
"IBM VGA Technical Reference Manual"(PDF).This is the original IBM reference. The document provides good overview of VGA functionality and is fairly complete, including a detailed description of standard BIOS modes and some programming techniques.