NTSC (an acronym of National Television System Committee) was the first American standard for analog television, published and adopted in 1941.[1] It was one of three major color formats for analog television; the others were PAL and SECAM. NTSC color was usually associated with System M, and this combination was sometimes called NTSC II.[2][3] A second NTSC standard was adopted in 1953,[4] which allowed color television compatible with the existing stock of black-and-white sets.[5][6][7] The EIA defined NTSC performance standards in EIA-170 (also known as RS-170) in 1957.[8]
The term "NTSC" has referred to digital formats with 480–487 active lines and a 30 or 29.97 FPSframe rate since the introduction of digital sources such as DVDs, and is a digital shorthand for System M. TheNTSC-Film standard has a digital resolution of720 × 480 pixels forDVD-Videos,480 × 480 pixels forSuper Video CDs (SVCD,aspect ratio 4:3) and352 × 240 pixels forVideo CDs (VCD).[9] Thedigital video (DV)-camcorder format equivalent of NTSC is720 × 480 pixels.[10] Thedigital television (DTV) equivalent is704 × 480 pixels.[10]
The NTSC was established in 1940 by the United States Federal Communications Commission (FCC) to resolve conflicts between companies about the introduction of a nationwide analog television system. In March 1941, the committee issued a technical standard for black-and-white television based on a 1936 recommendation by the Radio Manufacturers Association (RMA). Technical advancements of the vestigial sideband technique provided an opportunity to increase image resolution. The NTSC selected 525 scan lines as a compromise between RCA's 441-scan-line standard (used by RCA's NBC TV network) and Philco and DuMont's desire to increase the number of scan lines to between 605 and 800. The standard recommended a frame rate of 30 fps, consisting of two interlaced fields per frame at 262.5 lines per field and 60 fields per second. Other standards in the final recommendation were an aspect ratio of 4:3 and frequency modulation (FM) of the sound signal.
In January 1950, the committee was reconstituted to standardize color television. The FCC had briefly approved a 405-line field-sequential color TV standard, developed by CBS, in October 1950.[11] The CBS system was incompatible with existing black-and-white sets. It used a rotating color wheel, reduced the number of scan lines from 525 to 405, and increased the field rate from 60 to 144 with an effective frame rate of 24 fps. Legal action by rival RCA kept commercial use of the system off the air until June 1951, and regular broadcasts only lasted a few months before the manufacture of all color sets was banned by the Office of Defense Mobilization in October (ostensibly due to the Korean War).[12][13][14][15] A variant of the CBS system was later used by NASA to broadcast pictures of astronauts in space.[16] CBS rescinded its system in March 1953,[17] and the FCC replaced it on December 17 of that year with an NTSC color standard developed by several companies (including RCA and Philco).[18]
In December 1953, the FCC unanimously approved what became the NTSC color-television standard (later defined as RS-170a). The standard retained backward compatibility with existing black-and-white sets. Color information was added to the black-and-white image by introducing a color subcarrier of 315⁄88 MHz (3.579545 MHz ± 10 Hz).[19] This frequency was chosen so horizontal line-rate modulation components of the chrominance signal fall between the horizontal line-rate modulation components of the luminance signal; the chrominance signal could be easily filtered out of the luminance signal on new sets, and would be minimally visible on existing sets. Due to limitations of frequency divider circuits when the color standard was promulgated, the color subcarrier frequency was constructed as a composite frequency assembled from small integers – in this case, 5 × 7 × 9 MHz divided by 8 × 11. The horizontal line rate was reduced to 15,734 lines per second (3.579545 MHz × 2 ÷ 455 = 9⁄572 MHz) from 15,750, and the frame rate was reduced to 30/1.001 ≈ 29.970 fps (the horizontal line rate divided by 525 lines per frame) from 30 fps. The changes amounted to 0.1 percent, and were tolerated by existing TV sets.[20][21]
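The arithmetic behind these figures is easy to verify. The following minimal sketch in Python (standard library only; the variable names are illustrative) reproduces the published subcarrier, line-rate, and frame-rate values exactly:

```python
from fractions import Fraction

# Color subcarrier assembled from small integers: 5 * 7 * 9 MHz divided by 8 * 11
subcarrier_hz = Fraction(5 * 7 * 9, 8 * 11) * 1_000_000   # = 315/88 MHz
print(float(subcarrier_hz))           # 3579545.45... Hz (~3.579545 MHz)

# Horizontal line rate: subcarrier * 2 / 455
line_rate_hz = subcarrier_hz * 2 / 455                     # = 9/572 MHz
print(float(line_rate_hz))            # 15734.26... lines per second

# Frame rate: line rate divided by 525 lines per frame
frame_rate_hz = line_rate_hz / 525
print(float(frame_rate_hz))           # 29.9700... frames per second
print(frame_rate_hz == Fraction(30, 1) / Fraction(1001, 1000))   # True: exactly 30/1.001
```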
The first publicly announced network television broadcast of a program using the NTSC "compatible color" system was an episode of NBC's Kukla, Fran and Ollie on August 30, 1953, viewable in color only at NBC headquarters.[22] The first nationwide viewing of NTSC color was on the following January 1 with the coast-to-coast broadcast of the Tournament of Roses Parade, viewable on prototype color receivers at special presentations nationwide. The first color NTSC television camera was the RCA TK-40, used for experimental broadcasts in 1953; an improved version, the TK-40A (introduced in March 1954), was the first commercially available color-television camera. Later that year, an improved TK-41 became the standard camera and was used through much of the 1960s. The NTSC standard was adopted by other countries, including Japan and several in the Americas.
With the advent of digital television, analog broadcasts were largely phased out. NTSC broadcasters in the U.S. were required by the FCC to shut down their analog transmitters by February 17, 2009; the shutdown was later moved to June 12 of that year. Low-power and Class A stations and translators were required to shut down by 2015, although an FCC extension allowed some stations operating on Channel 6 to operate until July 13, 2021.[23] Canadian analog TV transmitters in markets not subject to the mandatory 2011 transition were to be shut down by January 14, 2022, under a 2017 schedule from Innovation, Science and Economic Development Canada.[24]
Most countries using the NTSC standard, and those using other analog television standards, have switched (or are switching) to newer digital television standards; at least four different standards are in use worldwide. North America, parts of Central America, and South Korea are adopting (or have adopted) the ATSC standards; other countries, such as Japan, are adopting (or have adopted) standards other than ATSC. Most over-the-air NTSC transmissions in the United States ended on June 12, 2009,[25] and by August 31, 2011,[26] in Canada and most other NTSC markets.[27]
Colorimetry refers to the colorimetric characteristics of the system and its components, including the primary colors used, the camera, and the display. NTSC color had two distinctly defined colorimetries, shown on the chromaticity diagram as NTSC 1953 and SMPTE C. Manufacturers introduced a number of variations for technical, economic, marketing, and other reasons.[28]
The original 1953 color NTSC specification, still part of the United States Code of Federal Regulations, defined the colorimetric values of the system as shown in the table.[29] Early color-television receivers, such as the RCA CT-100, were faithful to this specification (based on prevailing motion-picture standards), which had a larger gamut than most present-day monitors. Their low-efficiency phosphors (notably in red) were weak and persistent, leaving trails after moving objects. Beginning in the late 1950s, picture-tube phosphors sacrificed saturation for increased brightness; this deviation from the standard at both receiver and broadcaster was the source of considerable color variation.
To ensure more uniform color reproduction, some manufacturers incorporated color-correction circuits into sets, converting the received signal (encoded for the 1953 colorimetric values) to suit the monitor's actual phosphor characteristics. Since color cannot be accurately corrected on the nonlinear, gamma-corrected signals as transmitted, the adjustment could only be approximated.
At the broadcaster stage, in 1968–69 the Conrac Corporation (working with RCA) defined a set of controlled phosphors for use in broadcast color video monitors.[30] This specification survives as the SMPTE C phosphor specification.[31] As with home receivers, it was recommended[32] that studio monitors incorporate similar color-correction circuits so broadcasters would transmit pictures encoded for the original 1953 colorimetric values, in accordance with FCC standards.
In 1987, the Society of Motion Picture and Television Engineers (SMPTE) Committee on Television Technology, Working Group on Studio Monitor Colorimetry, adopted the SMPTE C (Conrac) phosphors for general use in Recommended Practice 145;[33] this prompted many manufacturers to modify their camera designs to encode for SMPTE C colorimetry without color correction,[34] as approved in SMPTE standard 170M, "Composite Analog Video Signal – NTSC for Studio Applications" (1994). The ATSC digital television standard states that for 480i signals, SMPTE C colorimetry should be assumed unless colorimetric data is included in the transport stream.[35]
The Japanese NTSC never changed primaries and white point to SMPTE C, continuing to use the 1953 NTSC primaries and white point.[32] The PAL and SECAM systems used the original 1953 NTSC colorimetry until 1970;[32] unlike NTSC, the European Broadcasting Union (EBU) rejected color correction in receivers and studio monitors and called for all equipment to encode signals for EBU colorimetric values.[36]
Fully saturated color spectrum rendered using SMPTE C (top) and NTSC 1953 (bottom) colorimetry.
As the gamuts on the CIE chromaticity diagram show, variations among these colorimetries can result in visible differences. Proper viewing requires gamut mapping via LUTs or additional color grading. SMPTE Recommended Practice RP 167-1995 refers to such an automatic correction as an "NTSC corrective display matrix."[37] Material prepared for 1953 NTSC may look desaturated when displayed on SMPTE C or ATSC/BT.709 displays, and may show noticeable hue shifts. SMPTE C material may appear slightly more saturated on BT.709/sRGB displays, or significantly more saturated on P3 displays, if appropriate gamut mapping is not applied.
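Gamut mapping between two colorimetries is conventionally done through CIE XYZ. The sketch below is a simplified illustration in NumPy: it assumes the commonly published chromaticity coordinates for the 1953 NTSC and SMPTE C primaries and white points, ignores chromatic adaptation between Illuminant C and D65, and simply clips out-of-gamut values, so it is not a broadcast-grade conversion.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build a linear RGB -> CIE XYZ matrix from xy chromaticities of R, G, B and white."""
    def col(x, y):
        # xyY with Y = 1 converted to an XYZ column vector
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    m = np.column_stack([col(*p) for p in primaries])
    # Scale the columns so that RGB = (1, 1, 1) maps to the white point
    scale = np.linalg.solve(m, col(*white))
    return m * scale

# Commonly published chromaticities (assumed here; adaptation from Illuminant C to D65 is omitted)
NTSC_1953 = rgb_to_xyz_matrix([(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)], (0.310, 0.316))
SMPTE_C   = rgb_to_xyz_matrix([(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)], (0.3127, 0.3290))

# Map linear 1953-NTSC RGB into linear SMPTE C RGB; out-of-gamut values are simply clipped
ntsc_to_smptec = np.linalg.inv(SMPTE_C) @ NTSC_1953
saturated_red = np.array([1.0, 0.0, 0.0])
print(np.clip(ntsc_to_smptec @ saturated_red, 0.0, 1.0))
```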
NTSC uses a luminance-chrominance encoding system. Using a separate luminance signal maintained backward compatibility with contemporary black-and-white television sets; only color sets would recognize the chroma signal.
The red, green, and blue primary color signals are weighted and summed into a single luma signal, designated Y′ (Y prime),[38] which replaces the original monochrome signal. The color-difference information is encoded into the chrominance signal, which carries only the color information. This allows black-and-white receivers to display NTSC color signals by ignoring the chrominance signal. Some black-and-white TVs sold in the U.S. after the introduction of color broadcasting in 1953 were designed to filter the chroma out, but early sets did not do this and chrominance could be seen as a crawling dot pattern in areas of the picture with saturated colors.[39]
To derive separate signals with only color information, the difference is determined between each color primary and the summed luma; the red difference signal is R′ − Y′, and the blue difference signal is B′ − Y′. These difference signals are used to derive two new color signals, known as I (in-phase) and Q (quadrature), in a process known as QAM. The I/Q color space is rotated 33° relative to the difference-signal color space; orange-blue color information (to which the human eye is most sensitive) is transmitted on the I signal at 1.3 MHz bandwidth, and the Q signal encodes purple-green color information at 0.4 MHz bandwidth. This allows the chrominance signal to use less overall bandwidth without noticeable color degradation. The two signals each amplitude-modulate[40] 3.58 MHz carriers which are 90 degrees out of phase with each other,[41] and the result is their sum with the carriers suppressed.[42][40] The result can be viewed as a single sine wave, with varying phase relative to a reference carrier and with varying amplitude. The varying phase represents the instantaneous color hue captured by a TV camera, and the amplitude represents the color saturation. The 315⁄88 MHz subcarrier is added to the luminance to form the composite color signal,[40] which modulates the video-signal carrier.[43]
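As a rough illustration of the encoding described above, the sketch below applies the standard FCC luma weights and I/Q rotation and then quadrature-modulates the two chroma components onto suppressed carriers 90 degrees apart. The absolute carrier phase relative to the colorburst is simplified, and the I/Q band-limiting filters are omitted.

```python
import numpy as np

FSC = 315e6 / 88            # color subcarrier, Hz

# Standard luma weights and I/Q rotation as described above (FCC NTSC values)
RGB_TO_YIQ = np.array([
    [0.299,  0.587,  0.114],   # Y' (luma)
    [0.596, -0.274, -0.322],   # I  (orange-blue axis, ~1.3 MHz)
    [0.211, -0.523,  0.312],   # Q  (purple-green axis, ~0.4 MHz)
])

def encode_composite(rgb, t):
    """Return composite baseband samples for gamma-corrected RGB at times t (seconds)."""
    y, i, q = RGB_TO_YIQ @ rgb
    # Quadrature amplitude modulation: two suppressed carriers 90 degrees apart
    chroma = i * np.cos(2 * np.pi * FSC * t) + q * np.sin(2 * np.pi * FSC * t)
    return y + chroma

t = np.arange(0, 2e-6, 1 / (4 * FSC))          # 2 microseconds sampled at 4x the subcarrier
print(encode_composite(np.array([1.0, 0.5, 0.0]), t)[:8])
```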
For a color TV to recover hue information from the color subcarrier, it must have a zero-phase reference to replace the previously suppressed carrier. The NTSC signal includes a short sample of this reference signal, known as the colorburst, located on the back porch of each horizontal synchronization pulse. The colorburst consists of at least eight cycles of the unmodulated color subcarrier. The TV receiver has a local oscillator, which is synchronized with these color bursts to create a reference signal. Combining the reference phase signal with the chrominance signal allows recovery of the I and Q signals, which (together with the Y′ signal) are reconstructed into the individual RGB signals sent to the CRT to form the image.
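Recovery works the same way in reverse: multiplying the chrominance signal by the regenerated reference carrier and its quadrature, then low-pass filtering, removes the double-frequency products and leaves I and Q. In the sketch below a crude moving-average filter stands in for a proper low-pass, and the burst-locked oscillator is assumed to be already in phase.

```python
import numpy as np

FSC = 315e6 / 88
FS = 8 * FSC                                   # sampling rate used for this sketch
t = np.arange(0, 20e-6, 1 / FS)                # 20 microseconds of signal

i_true, q_true = 0.4, -0.2                     # constant chroma for a flat color patch
chroma = i_true * np.cos(2 * np.pi * FSC * t) + q_true * np.sin(2 * np.pi * FSC * t)

# Reference carrier regenerated from the colorburst (assumed already phase-locked)
ref_cos = np.cos(2 * np.pi * FSC * t)
ref_sin = np.sin(2 * np.pi * FSC * t)

def lowpass(x, n=64):
    """Crude moving-average low-pass, enough to reject the 2*FSC product terms."""
    return np.convolve(x, np.ones(n) / n, mode="same")

i_rec = 2 * lowpass(chroma * ref_cos)          # product demodulation; factor 2 restores amplitude
q_rec = 2 * lowpass(chroma * ref_sin)
print(i_rec[len(t) // 2], q_rec[len(t) // 2])  # ~0.4 and ~-0.2 away from the edges
```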
In CRT televisions, the NTSC signal is turned into three color signals: red, green, and blue, each controlling an electron beam designed to excite only the corresponding red, green, or blue phosphors. TV sets with digital circuitry use sampling techniques to process the signals, with identical results. For analog and digital sets processing an analog NTSC signal, the original three color signals are transmitted using three discrete signals (Y′, I and Q), recovered as three separate colors (R, G, and B), and presented as a color image.
When a transmitter broadcasts an NTSC signal, it amplitude-modulates a radio-frequency carrier with the NTSC signal and frequency-modulates a carrier 4.5 MHz higher with the audio signal. With non-linear distortion of the broadcast signal, the 315⁄88 MHz color carrier may beat with the sound carrier to produce a dot pattern on the screen.
Spectrum of a System M television channel with NTSC color
A transmitted NTSC television channel has a total bandwidth of 6 MHz. The actual video signal, which is amplitude-modulated, is transmitted between 500 kHz and 5.45 MHz above the lower end of the channel. The video carrier is 1.25 MHz above the lower end of the channel. Like most AM signals, the video carrier generates two sidebands: one above the carrier and one below. Each sideband is 4.2 MHz wide. The upper sideband is transmitted, but only 1.25 MHz of the lower sideband (known as a vestigial sideband) is transmitted. The color subcarrier, 3.579545 MHz above the video carrier, is quadrature-amplitude-modulated with a suppressed carrier. The audio signal is frequency-modulated with a 25 kHz maximum frequency deviation, less than the 75 kHz deviation on the FM band. The main audio carrier is 4.5 MHz above the video carrier, 250 kHz below the top of the channel. Sometimes a channel may contain an MTS signal, which offers more than one audio signal by adding one or two subcarriers to the audio signal; this is normally the case when stereo audio or second audio program signals are used. The same extensions are used in ATSC, whose digital carrier is 0.31 MHz above the low end of the channel.
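The carrier placement within a channel follows directly from these offsets. As an illustration, the sketch below computes the absolute carrier frequencies from the lower channel edge; the choice of US VHF channel 3, occupying 60–66 MHz, is only an example.

```python
# Carrier offsets within a 6 MHz System M channel, as described above (all in MHz)
CHANNEL_WIDTH = 6.0
VIDEO_OFFSET  = 1.25              # video carrier above the lower channel edge
COLOR_OFFSET  = 315.0 / 88.0      # color subcarrier above the video carrier (~3.579545)
AUDIO_OFFSET  = 4.5               # audio carrier above the video carrier

def carriers(lower_edge_mhz):
    video = lower_edge_mhz + VIDEO_OFFSET
    return {
        "video carrier":    video,
        "color subcarrier": video + COLOR_OFFSET,
        "audio carrier":    video + AUDIO_OFFSET,       # 0.25 MHz below the upper channel edge
        "upper edge":       lower_edge_mhz + CHANNEL_WIDTH,
    }

# US VHF channel 3 occupies 60-66 MHz (example channel, for illustration only)
for name, freq in carriers(60.0).items():
    print(f"{name}: {freq:.6f} MHz")
```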
Analog signal values for basic RGB colors, encoded in NTSC[44]
Film has a frame rate of 24 frames per second, and the NTSC standard is approximately 29.97 (10 MHz × 63⁄88 ÷ 455 ÷ 525) fps. In regions with 25 fps television and video standards, this difference can be overcome by speed-up. For 30 fps standards, 3:2 pulldown is used. One film frame is transmitted for three video fields (lasting 1+1⁄2 video frames), and the next frame is transmitted for two video fields (lasting 1 video frame). Two film frames are thus transmitted in five video fields, for an average of 2+1⁄2 video fields per film frame. The average frame rate is 60 ÷ 2.5 = 24 frames per second.
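The 3:2 cadence can be written out explicitly. The short sketch below (the frame labels are purely illustrative) expands a run of film frames into the repeating three-field/two-field pattern:

```python
def three_two_pulldown(film_frames):
    """Expand 24 fps film frames into 60 Hz interlaced fields using the 3:2 cadence."""
    fields = []
    for index, frame in enumerate(film_frames):
        repeats = 3 if index % 2 == 0 else 2   # alternate three fields, then two fields
        fields.extend([frame] * repeats)
    return fields

film = ["A", "B", "C", "D"]                    # four film frames
video_fields = three_two_pulldown(film)
print(video_fields)                            # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(video_fields))                       # 10 fields = 2.5 fields per film frame on average
```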
Film shot specifically for NTSC television usually has a speed of 30 frames per second to avoid 3:2 pulldown.[45] To show 25 fps material (such as European television series and some European films) on NTSC equipment, every fifth frame is duplicated and the resulting stream is interlaced.
Film shot for NTSC television at 24 frames per second has traditionally been accelerated by 1/24 (to about 104.17% of normal speed) for transmission in regions with 25 fps television standards. This increase in picture speed has traditionally been accompanied by a similar increase in audio pitch and tempo. Frame-blending is used to convert 24 fps video to 25 fps without altering its speed.
Film shot for television in regions with 25 fps television standards can be handled in one of two ways:
The film can be shot at 24 frames per second; when transmitted in its native region, it can be accelerated to 25 fps according to the analog technique or kept at 24 fps by the digital technique. When the film is transmitted in regions with a nominal 30 fps television standard, there is no noticeable change in speed, tempo, and pitch.
The film can be shot at 25 frames per second; when transmitted in its native region, it is shown at its normal speed with no alteration of the accompanying soundtrack. When the film is shown in regions with a 30 fps television standard, every fifth frame is duplicated with no noticeable change in speed, tempo, and pitch.
An NTSC frame has two fields: F1 and F2. Field dominance depends on a combination of factors, including decisions by equipment manufacturers and historical conventions. Most professional equipment can switch between a dominant upper or dominant lower field.[25][46] Field dominance is important when editing NTSC video; incorrect interpretation of field order can cause a shuddering effect as moving objects jump back and forth on successive fields. This matters when interlaced NTSC is transcoded to a format with a different field dominance. Field order is also important when transcoding progressive video to interlaced NTSC, to prevent flash fields in the interlaced video if the field dominance is incorrect. Three-two pulldown, converting 24 fps to 30, will also produce unacceptable results with an incorrect field order.
NTSC-N was originally proposed in the 1960s to the CCIR as a 50 Hz broadcast method for the System N countries Paraguay, Uruguay, and Argentina before they chose PAL. In 1978, with the introduction of the Apple II Europlus, it was reintroduced as NTSC 50: a system combining 625-line video with 3.58 MHz NTSC color. An Atari ST running PAL software on its NTSC color display used this system, since the monitor could not decode PAL color. Most analog NTSC television sets and monitors with a vertical-hold control can display this system after adjusting the vertical hold.[47]
NTSC 4.43 transmits an NTSC color subcarrier of 4.43 MHz instead of 3.58 MHz.[48] The output is only viewable by TVs which support the system, such as most PAL sets.[49]
In January 1960, seven years before the adoption of the modified SECAM version, the experimental TV studio in Moscow began broadcasting with the OSKM system. OSKM, an acronym for "simultaneous system with quadrature modulation" (Russian: Одновременная Система с Квадратурной Модуляцией), was the version of NTSC adapted to the European D/K 625/50 standard. It used the color-coding scheme later used in PAL (U and V, instead of I and Q).
The color subcarrier frequency was 4.4296875 MHz, and the bandwidth of the U and V signals was about 1.5 MHz.[50] About 4,000 TV sets in four models (Raduga,[51] Temp-22, Izumrud-201 and Izumrud-203)[52] were produced, but they were not commercially available.
In NTSC (and, to a lesser extent, PAL), reception problems can degrade the color accuracy of the picture; ghosting can change the phase of the colorburst, altering a signal's color balance. The vacuum-tube electronics used in televisions through the 1960s led to technical problems, which is why NTSC televisions were equipped with a tint control. Hue controls are still found on NTSC TVs, but color drifting generally ceased to be a problem by the 1970s. Compared to PAL in particular, NTSC color accuracy and consistency were sometimes considered inferior; video professionals and television engineers jokingly referred to NTSC as Never The Same Color, Never Twice the Same Color, or No True Skin Colors.[53]
A standard NTSC video image contains invisible lines (lines 1–21 of each field) known as the vertical blanking interval (VBI); lines 1–9 are used for the vertical-sync and equalizing pulses. The remaining lines were blanked in the original NTSC specification to provide time for the electron beam on CRT screens to return to the top of the display.
VIR (vertical interval reference), adopted during the 1980s, attempts to correct some NTSC color problems by adding studio-inserted reference data for luminance and chrominance levels on line 19.[54] Suitably-equipped television sets could then use the data to adjust the display for a closer match to the original studio image. The VIR signal has three sections; the first has 70 percent luminance and the same chrominance as the colorburst signal and the other two have 50 percent and 7.5 percent luminance, respectively.[55]
Some stations transmitted TV Guide On Screen data (an electronic program guide) on VBI lines 11–18, 20, and 22. The primary station in a market (often a local PBS station) broadcast four lines, and backup stations transmitted one. TVGOS was discontinued in 2013 and 2016, ending over-the-air program-guide services for compatible devices.[59][60]
Countries and territories using NTSC, past and present
^In a pre-independence Compact of Free Association with the U.S.
^NTSC was intended to be abandoned at the end of 2015, but in late 2014 it was postponed to 2019[63] and later extended to 2023.[64][65][66][67] Analog broadcasts are expected to be shut off by the end of 2025 in Mega Manila and by 2026 in the rest of the country.[68][69][70][71][72][73][74][75]
^Conversion to ATSC 3.0 (instead of 1.0) was expected to begin in 2023 and be completed by 2026.[76]
^Over-the-air NTSC broadcasting in major cities ended August 31, 2011, replaced with ATSC.[77]
^Over-the-air NTSC broadcasting scheduled to be abandoned by December 15, 2021, simulcast in ATSC.[79]
^Over-the-air NTSC broadcasting scheduled to end by December 31, 2024, simulcast inISDB-Tb.[80]
^Over-the-air NTSC broadcasting scheduled to end by December 31, 2019, simulcast in ISDB-Tb.[81]
^Plans for transition from NTSC announced on July 2, 2004,[82] started conversion in 2013.[83] Full transition was scheduled on December 31, 2015,[84] but due to technical and economic issues for some transmitters, the full transition was postponed to December 31, 2016.
^NTSC broadcast ended by December 31, 2024, simulcasting ISDB-Tb.[86]
^ Federal Communications Commission (September 29, 1954). 20th Annual Report to Congress (1954) (PDF) (Report). p. 90. Archived (PDF) from the original on May 8, 2024. Retrieved September 16, 2024.
^ National Television System Committee (1951–1953) (1953). Report and Reports of Panel No. 11, 11-A, 12-19, with Some supplementary references cited in the Reports, and the Petition for adoption of transmission standards for color television before the Federal Communications Commission (Report). LCCN 54021386.
^ A third line-sequential system from Color Television Inc. (CTI) was also considered. The CBS and final NTSC systems were called field-sequential and dot-sequential systems, respectively.
^"Color TV Shelved As a Defense Step".The New York Times. October 20, 1951. p. 1.
^"Action of Defense Mobilizer in Postponing Color TV Poses Many Question for the Industry".The New York Times. October 22, 1951. p. 23.
^ International Telecommunication Union Recommendation ITU-R 470-6 (1970–1998): Conventional Television Systems, Annex 2.
^Society of Motion Picture and Television Engineers (1987–2004): Recommended Practice RP 145–2004. Color Monitor Colorimetry.
^ Society of Motion Picture and Television Engineers (1994, 2004): Engineering Guideline EG 27-2004. Supplemental Information for SMPTE 170M and Background on the Development of NTSC Color Standards, p. 9.
^ Advanced Television Systems Committee (2003): ATSC Direct-to-Home Satellite Broadcast Standard, Doc. A/81, p. 18.
^European Broadcasting Union (1975) Tech. 3213-E.: E.B.U. Standard for Chromaticity Tolerances for Studio Monitors.
^"SMPTE RP 167-1995"(PDF).SMPTE. p. 5 (A.4). RetrievedJuly 15, 2024.The NTSC corrective matrix in a display device is intended to correct any colorimetric errors introduced by the differ- ence between the camera primaries and the display tube phosphors.
^ Hester, Lisa (July 6, 2004). "Mexico To Adopt The ATSC DTV Standard". Advanced Television Systems Committee. Archived from the original on June 6, 2014. Retrieved June 4, 2013. "On July 2 the Government of Mexico formally adopted the ATSC Digital Television (DTV) Standard for digital terrestrial television broadcasting."
A standard defining the NTSC system was published by the International Telecommunication Union in 1998 under the title "Recommendation ITU-R BT.470-7, Conventional Analog Television Systems". It is publicly available on the Internet at ITU-R BT.470-7 or can be purchased from the ITU.