Multiple Sub-Nyquist Sampling Encoding

From Wikipedia, the free encyclopedia
1980s analog high-definition television standard

MUSE (Multiple sub-Nyquist Sampling Encoding),[1] commercially known as Hi-Vision (a contraction of HIgh-definition teleVISION),[1] was a Japanese analog high-definition television system, with design efforts going back to 1979.[2] Traditional interlaced video shows either odd or even lines of video at any one time, but MUSE required four fields of video to complete a single video frame. Hi-Vision also refers to a closely related Japanese television system capable of transmitting video with 1035i resolution, that is, 1035 interlaced lines. MUSE was used as a compression scheme for Hi-Vision signals.

Overview


It used dot-interlacing and digital video compression to deliver 1125-line, 60 field-per-second (1125i60)[2] signals to the home. The system was standardized as ITU-R recommendation BO.786[3] and specified by SMPTE 260M,[4] using a colorimetry matrix specified by SMPTE 240M.[5] As with other analog systems, not all lines carry visible information: MUSE has 1035 active interlaced lines, so the system is sometimes also described as 1035i.[6] MUSE employed 2-dimensional filtering, dot-interlacing, motion-vector compensation and line-sequential color encoding with time compression to "fold" or compress an original 30 MHz bandwidth Hi-Vision source signal into just 8.1 MHz.

MUSE (Multiple Sub-Nyquist Sampling Encoding) differed in that it used a four-field dot-interlacing cycle, taking four fields to complete a single MUSE frame. The interlacing was done on a pixel-by-pixel basis, reducing both horizontal and vertical resolution by half for each field of video, unlike traditional interlacing, which only reduces vertical resolution; as a result, only stationary images were transmitted at full resolution. Moving images were blurred, since MUSE lowered the resolution of material that changed greatly from frame to frame. MUSE used motion compensation, so camera pans maintained full resolution, but individual moving elements could be reduced to only a quarter of the full frame resolution. Because the mix of motion and non-motion was encoded pixel by pixel, the loss was less noticeable than it might sound.[7]
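
The following is a minimal sketch of the four-field dot-interlacing idea, assuming a simple 2x2 offset pattern chosen purely for illustration; the real MUSE sampling lattice and its motion handling are more elaborate than this.

```python
# Minimal sketch of four-field dot-interlacing (offset subsampling).
# The 2x2 offset pattern below is an assumption for illustration only.
import numpy as np

def field_mask(height, width, field):
    """Boolean mask of the pixels carried by one field of the 4-field cycle."""
    y, x = np.mgrid[0:height, 0:width]
    # Assumed offsets: each field takes one pixel of every 2x2 block.
    oy, ox = [(0, 0), (1, 1), (0, 1), (1, 0)][field % 4]
    return (y % 2 == oy) & (x % 2 == ox)

def rebuild_still(frame):
    """A static frame is recovered at full resolution only after four fields."""
    out = np.zeros_like(frame)
    for f in range(4):
        m = field_mask(*frame.shape, f)
        out[m] = frame[m]
    return out

frame = np.arange(16.0).reshape(4, 4)
assert np.array_equal(rebuild_still(frame), frame)   # still image: no resolution loss
print(field_mask(4, 4, 0).astype(int))               # one quarter of the pixels per field
```

The sketch only shows why a still image survives intact after four fields while anything that changes between fields is effectively seen at a quarter of the pixel count.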

Japan began broadcasting wideband analog HDTV signals in December 1988,[8] initially with an aspect ratio of 2:1. The Sony HDVS high-definition video system was used to create content for the MUSE system, but it did not record MUSE signals;[2] it recorded uncompressed Hi-Vision signals. By the time of MUSE's commercial launch in 1991, digital HDTV was already under development in the United States. Hi-Vision MUSE was mainly broadcast by NHK through their BShi satellite TV channel, although other channels such as WOWOW, TV Asahi, Fuji Television, TBS Television, Nippon Television, and TV Tokyo also broadcast in MUSE.[9][10][11]

Later improvements, known as the MUSE-III system, increased resolution in moving areas of the image and improved chroma resolution during motion. MUSE-III was used for broadcasts starting in 1995 and on a few Hi-Vision MUSE LaserDiscs. Early complaints about the large size of the MUSE decoder led to the development of a miniaturized decoder.

On May 20, 1994, Panasonic released the first MUSE LaserDisc player.[12] A number of players were also available from other brands such as Pioneer and Sony.

Shadowing and multipath issues continued to affect this analog transmission mode, and Japan eventually switched to a digital HDTV system based on ISDB. NHK continued broadcasting Hi-Vision in analog until 2007; other channels stopped soon after December 1, 2000, as they transitioned to digital HD signals on ISDB, Japan's digital broadcast standard.[13]

History


MUSE was developed by NHK Science & Technology Research Laboratories in the 1980s as a compression system for Hi-Vision HDTV signals.

  • Japanese broadcast engineers immediately rejected conventional vestigial sideband broadcasting.
  • It was decided early on that MUSE would be a satellite broadcast format, as Japan economically supports satellite broadcasting. MUSE was transmitted at a frequency of 21 GHz[14] or 12 GHz.[15][3]
Modulation research
  • Japanese broadcast engineers had been studying the various HDTV broadcast types for some time.[16] It was initially thought that SHF, EHF or optical fiber would have to be used to transmit HDTV because of the high bandwidth of the signal, and that HLO-PAL would be used for terrestrial broadcast.[17][18] HLO-PAL is a conventionally constructed composite signal (based on Y for luminance and C for chroma, like NTSC and PAL) and uses a phase-alternating-by-line, half-line-offset carrier encoding of the wideband/narrowband chroma components. Only the very lowest part of the wideband chroma component overlapped the high-frequency luminance; the narrowband chroma was completely separated from luminance. PAF, or phase alternating by field (like the first NTSC color system trial), was also experimented with and gave much better decoding results, but NHK abandoned all composite encoding systems. Satellite transmission is power-limited, so frequency modulation (FM) had to be used. FM incurs triangular noise, so if a composite signal with a chroma subcarrier is used with FM, the demodulated chroma signal carries more noise than the luminance. Because of this, other options were examined,[19] and Y/C component emission was chosen for satellite.[17] At one point it seemed that FCFE (Frame Conversion Fineness Enhanced), an I/P conversion compression system,[20] would be chosen, but MUSE was ultimately picked.[21]
  • Separate transmission of the Y and C components was explored. The MUSE format as transmitted uses separated component signalling. The improvement in picture quality was so great that the original test systems were recalled.
  • One more power-saving tweak was made: the eye's lack of response to low-frequency noise allows a significant reduction in transponder power if the higher video frequencies are emphasized prior to modulation at the transmitter and de-emphasized at the receiver.

Technical specifications


MUSE's "1125 lines" are an analog measurement, which includes non-videoscan lines taking place while aCRT's electron beam returns to the top of the screen to begin scanning the next field. Only 1035 lines have picture information. Digital signals count only the lines (rows of pixels) that have actual detail, so NTSC's 525 lines become 486i (rounded to 480 to be MPEG compatible), PAL's 625 lines become 576i, and MUSE would be 1035i. To convert the bandwidth of Hi-Vision MUSE into "conventional"lines-of-horizontal resolution (as is used in the NTSC world), multiply 29.9 lines per MHz of bandwidth. (NTSC and PAL/SECAM are 79.9 lines per MHz) - this calculation of 29.9 lines works for all current HD systems including Blu-ray and HD-DVD. So, for MUSE, during a still picture, the lines of resolution would be: 598-lines of luminance resolution per-picture-height. The chroma resolution is: 209-lines. The horizontal luminance measurement approximately matches the vertical resolution of a 1080 interlaced image when theKell factor and interlace factor are taken into account. 1125 lines was selected as a compromise between the resolution in lines of NTSC and PAL and then doubling this number.[22]

MUSE employs time-compression integration (TCI), a form of time-division multiplexing, to carry luminance, chrominance, PCM audio and sync signals on a single carrier. TCI multiplexes by compressing the content in the time dimension: each transmitted frame is divided into regions, with chrominance time-compressed into the left part of the frame and luminance time-compressed into the right, which must then be expanded and layered at the receiver to create a visible image.[15] This differs from NTSC, which carries luminance, audio and chrominance simultaneously on several carrier frequencies.[23][24] Hi-Vision signals are analog component video signals with three channels, initially RGB and later YPbPr; the Hi-Vision standard aims to work with both RGB and YPbPr signals.[15][25][26]
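
A minimal sketch of the time-compression idea, with made-up proportions (chroma squeezed into the first quarter of a line, luminance into the rest); the real MUSE line format, sync structure and sample counts are considerably more involved than this.

```python
# Illustrative time-compression multiplexing: squeeze chroma and luma for one
# line side by side into a single line period, then re-expand at the receiver.
import numpy as np

def resample(x, n):
    """Linear resampling to n points, standing in for analog time compression/expansion."""
    return np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(x)), x)

def tci_mux(luma, chroma, line_len):
    c_len = line_len // 4                       # assumed split, for illustration only
    return np.concatenate([resample(chroma, c_len), resample(luma, line_len - c_len)])

def tci_demux(line, n_luma, n_chroma):
    c_len = len(line) // 4
    return resample(line[c_len:], n_luma), resample(line[:c_len], n_chroma)

luma = np.sin(np.linspace(0, 2 * np.pi, 640))
chroma = np.cos(np.linspace(0, 2 * np.pi, 240))
line = tci_mux(luma, chroma, 720)               # one multiplexed "line"
y, c = tci_demux(line, 640, 240)
print(len(line), np.max(np.abs(y - luma)) < 0.01, np.max(np.abs(c - chroma)) < 0.01)
```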

Key features of the MUSE system:

  • Scanlines (total/active): 1,125/1,035[5]
  • Pixels per line (fully interpolated): 1122 (still image)/748 (moving)
  • Reference clock periods: 1920 per active line[5]
  • Interlace ratio: 2:1[5]
  • Aspect ratio: 16:9[5]
  • Refresh rate: 60.00 or 59.94 fields per second[5]
  • Sampling frequency for broadcast: 16.2 MHz
  • Motion-vector compensation: horizontal ±16 samples (at the 32.4 MHz clock) per frame, vertical ±3 lines per field
  • Audio: "DANCE" discrete 2- or 4-channel digital audio system: 48 kHz/16-bit (2-channel stereo: 2 front channels) or 32 kHz/12-bit (4-channel surround: 3 front channels + 1 back channel)
  • Audio compression format: DPCM with quasi-instantaneous companding
  • Required bandwidth: 27 MHz;[1] usable bandwidth is one third of this, 9 MHz, due to the use of FM modulation for transmission[15]

Colorimetry


The MUSE luminance signal Y encodes Y_M, specified as the following mix of the original RGB color channels:[3]

$$ Y_M = 0.294\,R + 0.588\,G + 0.118\,B $$

The chrominance signal C encodes the B − Y_M and R − Y_M difference signals. From these three signals (Y_M, B − Y_M and R − Y_M), a MUSE receiver can retrieve the original RGB color components using the following matrix:[3]

$$ \begin{bmatrix} G \\ B \\ R \end{bmatrix} = \begin{bmatrix} 1 & -1/5 & -1/2 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 5/4 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} Y_M \\ B - Y_M \\ R - Y_M \end{bmatrix} = \begin{bmatrix} 1 & -1/4 & -1/2 \\ 1 & 5/4 & 0 \\ 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} Y_M \\ B - Y_M \\ R - Y_M \end{bmatrix} $$
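
A small numerical sanity check of the matrices quoted above. The product identity is exactly what the equation states; recovering R, G, B by feeding the true differences through the left-hand mixing matrix alone is my own illustration, and it assumes the 5/4 gain stage exists to undo a transmit-side chroma scaling, which the text here does not spell out.

```python
# Check that the composite matrix equals the quoted product, and that the
# mixing matrix recovers R, G, B from (Y_M, B-Y_M, R-Y_M) up to coefficient rounding.
import numpy as np

MIX = np.array([[1.0, -0.2, -0.5],
                [1.0,  1.0,  0.0],
                [1.0,  0.0,  1.0]])
GAIN = np.diag([1.0, 1.25, 1.0])
print(MIX @ GAIN)            # -> [[1, -0.25, -0.5], [1, 1.25, 0], [1, 0, 1]], as quoted

def ym(r, g, b):
    return 0.294 * r + 0.588 * g + 0.118 * b

r, g, b = 0.2, 0.5, 0.8
y = ym(r, g, b)
g2, b2, r2 = MIX @ np.array([y, b - y, r - y])
print(round(r2, 3), round(g2, 3), round(b2, 3))   # ~ (0.2, 0.5, 0.8)
```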

The system used a colorimetry matrix specified by SMPTE 240M[5][27][28] (with coefficients corresponding to the SMPTE RP 145 primaries, also known as SMPTE-C, in use at the time the standard was created).[29] The chromaticities of the primary colors and white point are:[28][5]

MUSE colorimetry (SMPTE 240M / SMPTE "C")

| White point x | White point y | CCT (K) | Rx | Ry | Gx | Gy | Bx | By |
|---|---|---|---|---|---|---|---|---|
| 0.3127 | 0.329 | 6500 (D65) | 0.630 | 0.340 | 0.310 | 0.595 | 0.155 | 0.070 |

The luma (E_Y) function is specified as:[5]

$$ E_Y = 0.212\,E_R + 0.701\,E_G + 0.087\,E_B $$

The blue color difference (E_PB) is the amplitude-scaled (E_B − E_Y) difference:[5]

$$ E_{PB} = \frac{E_B - E_Y}{1.826} $$

The red color difference (E_PR) is the amplitude-scaled (E_R − E_Y) difference:[5]

$$ E_{PR} = \frac{E_R - E_Y}{1.576} $$
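
A quick check of why those scale factors are what they are: with the luma weights above, (E_B − E_Y) spans ±0.913 and (E_R − E_Y) spans ±0.788, and 1.826 = 2 × 0.913, 1.576 = 2 × 0.788, so both colour differences are normalised into roughly −0.5 to +0.5.

```python
# Evaluate E_Y, E_PB and E_PR at the eight RGB corner values and confirm the
# colour differences land in about -0.5 .. +0.5.
def e_y(r, g, b):
    return 0.212 * r + 0.701 * g + 0.087 * b

def e_pb(r, g, b):
    return (b - e_y(r, g, b)) / 1.826

def e_pr(r, g, b):
    return (r - e_y(r, g, b)) / 1.576

corners = [(r, g, b) for r in (0.0, 1.0) for g in (0.0, 1.0) for b in (0.0, 1.0)]
print(min(e_pb(*c) for c in corners), max(e_pb(*c) for c in corners))  # ~ -0.5 .. +0.5
print(min(e_pr(*c) for c in corners), max(e_pr(*c) for c in corners))  # ~ -0.5 .. +0.5
```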


Signal and Transmission


MUSE is a 1125-line system (1035 visible) and is not pulse- and sync-compatible with the digital 1080-line system used by modern HDTV. Originally it was an 1125-line, interlaced, 60 Hz system with a 5:3[15] (1.66:1) aspect ratio and an optimal viewing distance of roughly 3.3H. In 1989 this was changed to a 16:9 aspect ratio.[30][31][32]

For terrestrial MUSE transmission a bandwidth-limited FM system was devised; satellite transmission uses uncompressed FM.

Before MUSE compression, the Hi-Vision signal is band-limited: the 30 MHz luminance and chrominance source is reduced to a pre-compression luminance (Y) bandwidth of 20 MHz, while the pre-compression chrominance is carried on a 7.425 MHz carrier.

The Japanese initially explored frequency modulation of a conventionally constructed composite signal. This would create a signal similar in structure to the Y/C composite video NTSC signal, with the Y (luminance) at the lower frequencies and the C (chrominance) above. Approximately 3 kW of power would have been required to obtain a 40 dB signal-to-noise ratio for a composite FM signal in the 22 GHz band, which was incompatible with satellite broadcast techniques and bandwidth.

To overcome this limitation, it was decided to transmit Y and C separately. This reduces the effective frequency range and lowers the required power: approximately 570 W (360 W for Y and 210 W for C) would be needed to obtain a 40 dB signal-to-noise ratio for a separate Y/C FM signal in the 22 GHz satellite band. This was feasible.

One more power saving comes from the character of the human eye: its lack of response to low-frequency noise allows a significant reduction in transponder power if the higher video frequencies are emphasized prior to modulation at the transmitter and then de-emphasized at the receiver. This method was adopted, with crossover frequencies for the emphasis/de-emphasis at 5.2 MHz for Y and 1.6 MHz for C. With this in place, the power requirement drops to about 260 W (190 W for Y and 69 W for C).
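
The sketch below uses a generic first-order pre-/de-emphasis pair to illustrate the mechanism only; it does not model the actual MUSE emphasis networks (whose crossover frequencies are quoted above), the triangular FM noise spectrum, or the transmit power constraint.

```python
# Complementary pre-emphasis (high boost) and de-emphasis (matching roll-off):
# noise added between the two stages is attenuated at high frequencies while
# the low-frequency video content passes through unchanged.
import numpy as np

def pre_emphasis(x, a=0.9):
    y = np.empty_like(x)
    y[0] = x[0]
    y[1:] = x[1:] - a * x[:-1]          # high-frequency boost before the channel
    return y / (1.0 - a)                # unity gain at DC

def de_emphasis(x, a=0.9):
    x = x * (1.0 - a)
    y = np.empty_like(x)
    y[0] = x[0]
    for n in range(1, len(x)):
        y[n] = x[n] + a * y[n - 1]      # exact inverse: complementary roll-off
    return y

rng = np.random.default_rng(0)
video = np.sin(np.linspace(0, 20 * np.pi, 2000))         # low-frequency "video" content
noise = 0.05 * rng.standard_normal(2000)                 # broadband channel noise

plain = video + noise                                     # no emphasis
emphasised = de_emphasis(pre_emphasis(video) + noise)     # emphasis round trip
print(np.std(plain - video), np.std(emphasised - video))  # residual noise drops with emphasis
```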

Sampling systems and ratios

Main article: Chroma subsampling

The subsampling in a video system is usually expressed as a three-part ratio: the number of brightness (luma) Y samples, followed by the number of samples of each of the two color (chroma) components, Cb and Cr, for each complete sample area. Traditionally the value for brightness is always 4, with the rest of the values scaled accordingly.

A sampling of 4:4:4 indicates that all three components are fully sampled. A sampling of 4:2:2, for example, indicates that the two chroma components are sampled at half the horizontal sample rate of luma, so the horizontal chroma resolution is halved; this reduces the bandwidth of an uncompressed video signal by one third.

MUSE implements a similar system as a means of reducing bandwidth, but instead of a static sampling ratio, the actual ratio varies according to the amount of motion on the screen. In practice, MUSE sampling varies from approximately 4:2:1 to 4:0.5:0.25, depending on the amount of movement. Thus the red-green chroma component Cr has between one half and one eighth the sampling resolution of the luma component Y, and the blue-yellow chroma Cb has half the resolution of red-green.
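
A rough reading of what those extremes imply, treating the quoted ratios loosely as relative sample counts rather than the formal 4:a:b chroma-siting notation.

```python
# Relative chroma sampling density implied by the two quoted extremes.
def chroma_density(ratio):
    y, c1, c2 = ratio
    return c1 / y, c2 / y

for label, ratio in [("little motion", (4, 2, 1)), ("heavy motion", (4, 0.5, 0.25))]:
    d1, d2 = chroma_density(ratio)
    print(f"{label}: chroma components at {d1:.4g} and {d2:.4g} of the luma sampling")
# -> 0.5 and 0.25 of luma with little motion; 0.125 and 0.0625 with heavy motion,
#    i.e. one chroma component between one half and one eighth of luma, the other
#    always at half of that.
```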

Audio subsystem


MUSE had a discrete 2- or 4-channel digital audio system called "DANCE", which stood for Digital Audio Near-instantaneous Compression and Expansion.

It used differential audio transmission (differential pulse-code modulation) and was not psychoacoustics-based like MPEG-1 Layer II. It used a fixed transmission rate of 1350 kbit/s. Like the PAL NICAM stereo system, it used near-instantaneous companding (as opposed to the syllabic companding used by the dbx system) and non-linear 13-bit digital encoding at a 32 kHz sample rate.
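
Below is a minimal sketch of near-instantaneous (block) companding of the kind named above: for each short block, a scale factor is derived from the largest sample, the samples are shifted down to a shorter word, and the scale factor is sent alongside. The block size and word lengths here are arbitrary choices for the sketch, not the real DANCE parameters, and the DPCM stage is omitted.

```python
# Block companding: per-block scale factor plus reduced-word-length samples.
import numpy as np

BLOCK = 32        # samples per companding block (assumed)
OUT_BITS = 11     # reduced word length (assumed)

def compand_block(block):
    peak = int(np.max(np.abs(block)))
    shift = max(0, peak.bit_length() - (OUT_BITS - 1))   # bits to drop so the peak fits
    return shift, (block >> shift).astype(np.int16)

def expand_block(shift, coded):
    return coded.astype(np.int32) << shift               # restore the original scale

rng = np.random.default_rng(1)
audio = np.clip(rng.standard_normal(8 * BLOCK) * 8000, -32000, 32000).astype(np.int16)

decoded = []
for i in range(0, len(audio), BLOCK):
    shift, coded = compand_block(audio[i:i + BLOCK])
    decoded.append(expand_block(shift, coded))
decoded = np.concatenate(decoded)
print(int(np.max(np.abs(decoded - audio))))   # quantization error bounded by 2**shift per block
```

Quiet blocks pass through nearly unchanged, while loud blocks are coarsened, which is the sense in which the companding is "near-instantaneous" rather than syllabic.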

It could also operate in a 48 kHz 16-bit mode. The DANCE system was well documented in numerous NHK technical papers and in an NHK-published book issued in the USA called Hi-Vision Technology.[33]

The DANCE audio codec was superseded by Dolby AC-3 (a.k.a. Dolby Digital), DTS Coherent Acoustics (a.k.a. DTS Zeta 6x20 or ARTEC), MPEG-1 Layer III (a.k.a. MP3), MPEG-2 Layer I, MPEG-4 AAC and many other audio coders. The methods of the codec are described in an IEEE paper.[34]

Real world performance issues


Unlike traditional interlaced video, where interlacing is done line by line, showing either odd or even lines at any one time and requiring two fields to complete a video frame, MUSE used a four-field dot-interlacing cycle,[35][15][36][37][38] meaning it took four fields to complete a single MUSE frame.[39][40] Dot interlacing is interlacing done on a pixel-by-pixel basis, halving both horizontal and vertical resolution to create each field of video, rather than only the vertical resolution as in traditional interlaced video. Thus, in MUSE, only stationary images were transmitted at full resolution.[41][37][42][43] Because MUSE lowers the horizontal and vertical resolution of material that varies greatly from frame to frame, moving images were blurred. Since MUSE used motion compensation, whole camera pans maintained full resolution, but individual moving elements could be reduced to only a quarter of the full frame resolution. Because the mix between motion and non-motion was encoded on a pixel-by-pixel basis, it was not as visible as most would think. Later, NHK developed backwards-compatible methods of MUSE encoding/decoding that greatly increased resolution in moving areas of the image and increased the chroma resolution during motion. This so-called MUSE-III system was used for broadcasts starting in 1995, and a very few of the last Hi-Vision MUSE LaserDiscs used it (A River Runs Through It is one Hi-Vision LD that did). During early demonstrations of the MUSE system, complaints were common about the decoder's large size, which led to the creation of a miniaturized decoder.[1]

Shadows and multipath still plague this analog frequency modulated transmission mode.

Japan has since switched to a digital HDTV system based on ISDB, but the original MUSE-based BS Satellite channel 9 (NHK BS Hi-Vision) was broadcast until September 30, 2007.

Cultural and geopolitical impacts

Internal reasons within Japan that led to the creation of Hi-Vision:
  • (1940s): The NTSC standard (as a 525-line monochrome system) was imposed by the US occupation forces.
  • (1950s-1960s): Unlike Canada (which could have switched to PAL), Japan was stuck with the US TV transmission standard regardless of circumstances.
  • (1960s-1970s): By the late 1960s many parts of the modern Japanese electronics industry had gotten their start by fixing the transmission and storage problems inherent in NTSC's design.
  • (1970s-1980s): By the 1980s there was spare engineering talent available in Japan that could design a better television system.

MUSE, as the US public came to know it, was initially covered in the magazine Popular Science in the mid-1980s. The US television networks did not provide much coverage of MUSE until the late 1980s, as there were few public demonstrations of the system outside Japan.

Because Japan had its own domestic frequency allocation tables (which were more open to the deployment of MUSE), it became possible for this television system to be transmitted by Ku-band satellite technology by the end of the 1980s.

In the late 1980s the US FCC began to issue directives that would allow MUSE to be tested in the US, provided it could be fit into a 6 MHz System M channel.

The Europeans (in the form of the European Broadcasting Union (EBU)) were impressed with MUSE but could never adopt it, because it is a 60 Hz system rather than the 50 Hz standard used in Europe and much of the rest of the world outside the Americas and Japan.

The EBU development and deployment of B-MAC, D-MAC and, much later, HD-MAC were made possible by Hi-Vision's technical success. In many ways the MAC transmission systems are better than MUSE because of the total separation of colour from brightness in the time domain within the MAC signal structure.

Like Hi-Vision, HD-MAC could not be transmitted in 8 MHz channels without substantial modification and a severe loss of quality and frame rate. A 6 MHz version of Hi-Vision was experimented with in the US,[8] but it too had severe quality problems, so the FCC never fully sanctioned its use as a domestic terrestrial television transmission standard.

The US ATSC working group that had led to the creation of NTSC in the 1950s was reactivated in the early 1990s because of Hi-Vision's success. Many aspects of the DVB standard are based on work done by the ATSC working group; however, most of the impact is in support for 60 Hz (as well as 24 Hz for film transmission), uniform sampling rates, and interoperable screen sizes.


References

  1. ^abcd"DBNSTJ : Realization of High-Definition Television by MUSE System".dbnst.nii.ac.jp.
  2. ^abcCianci, Philip J. (January 10, 2014).High Definition Television: The Creation, Development and Implementation of HDTV Technology. McFarland.ISBN 9780786487974 – via Google Books.
  3. ^abcd "MUSE system for HDTV broadcasting-satellite services" (PDF). International Telecommunication Union. 1992. ITU-R BO.786.
  4. ^"ST 240:1999 - SMPTE Standard - For Television — 1125-Line High-Definition Production Systems — Signal Parameters".St 240:1999:1–7. November 30, 1999.doi:10.5594/SMPTE.ST240.1999.ISBN 978-1-61482-389-6. Archived fromthe original on January 31, 2022 – via IEEE Xplore.
  5. ^abcdefghijkANSI/SMPTE 240M-1995 - Signal Parameters 1125-Line High-Definition Production Systems(PDF). SMPTE. 1995.
  6. ^Poynton, Charles (January 3, 2003).Digital Video and HD: Algorithms and Interfaces. Elsevier.ISBN 9780080504308 – via Google Books.
  7. ^日本放送協会 (1984).日本放送協会年艦. 日本放送出版協会.ISBN 978-4-14-007135-9.
  8. ^ab"MUSE LaserDisc".ura.caldc.com. Retrieved2022-10-19.
  9. ^"テレビ多チャンネル時代における放送と通信の融合(4) | NDLサーチ | 国立国会図書館".国立国会図書館サーチ(NDLサーチ).
  10. ^ Asahi Shimbun, evening edition, November 28, 1994 (in Japanese).
  11. ^"テレビ多チャンネル時代における放送と通信の融合⑷" [Convergence of Broadcasting and Communications in the Age of Multi-Channel Television (4)](PDF) (in Japanese). Archived fromthe original(PDF) on 2024-09-02.
  12. ^"MUSE HI-DEF LaserDisc Players".LaserDisc UK Web Site. Archived fromthe original on 30 April 2016. Retrieved10 October 2021.
  13. ^"MUSE方式アナログハイビジョン終了の経緯"(PDF).
  14. ^西澤, 台次 (September 10, 1993)."ハイビジョンとその動向".日本写真学会誌.56 (4):309–317.doi:10.11454/photogrst1964.56.309 – via J-Stage.
  15. ^abcdefHigh Definition Television: Hi-Vision Technology. Springer. 6 December 2012.ISBN 978-1-4684-6536-5.
  16. ^Jun-ichi, Ishida; Ninomiya, Yuichi (December 19, 1982)."3. Signal and Transmission Equipment for High-Definition TV".The Journal of the Institute of Television Engineers of Japan.36 (10):882–888.doi:10.3169/itej1978.36.10_882 – via CiNii.
  17. ^abFujio, Takashi (December 19, 1980)."High-Definition Television System for Future : Desirable Standard, Signal Form and Broadcasting System".ITE Technical Report.4 (28):19–24.doi:10.11485/tvtr.4.28_19 – via CiNii.
  18. ^Fujio, Takashi (December 19, 1981)."High Definitional Television".The Journal of the Institute of Television Engineers of Japan.35 (12):1016–1023.doi:10.3169/itej1978.35.1016 – via CiNii.
  19. ^Komoto, Taro; Ishida, Junichi; Hata, Masaji; Yasunaga, Keiichi (December 19, 1979)."YC Separate Transmission of high Definition Television Signal by BSE".ITE Technical Report.3 (26):61–66.doi:10.11485/tvtr.3.26_61 – via CiNii.
  20. ^FUJIO, Takashi (December 19, 1984)."High-Definition Television System".ITE Technical Report.8 (1):33–39.doi:10.11485/tvtr.8.1_33 – via CiNii.
  21. ^FUJIO, Takashi (August 19, 2006)."Rowing a Boat to the HDTV New World".The Journal of the Institute of Electronics, Information and Communication Engineers.89 (8):728–734 – via CiNii.
  22. ^藤尾, 孝 (September 10, 1988)."Hdtv (ハイビジョン) 開発の経緯-システムの最適化とその性能-".テレビジョン学会誌.42 (6):570–578.doi:10.3169/itej1978.42.570 – via J-Stage.
  23. ^Mahmoud, Alaa; Mahmoud, Safwat W.Z.; Abdelhady, Kamal (2015)."Modeling and simulation of dynamics and noise of semiconductor lasers under NTSC modulation for use in the CATV technology".Beni-Suef University Journal of Basic and Applied Sciences.4 (2):99–108.doi:10.1016/j.bjbas.2015.05.002.
  24. ^Radio-Frequency Electronics: Circuits and Applications. Cambridge University Press. 11 June 2009.ISBN 978-0-521-88974-2.
  25. ^"Sony HDVS High Definition Video System. Sony Corporation".
  26. ^"Sony HDVS High Definition Video System General Catalogue 1991. Sony Corporation".
  27. ^"SMPTE-240M Y'PbPr".www5.in.tum.de.
  28. ^ab"Detailed Colorspace Descriptions".www.linuxtv.org.
  29. ^Charles A. Poynton,Digital Video and HDTV: Algorithms and Interfaces, Morgan–Kaufmann, 2003.online
  30. ^"産業のすべて: 1989年版". 山一証券経済研究所. September 4, 1989 – via Google Books.
  31. ^放送とニューメディア. 丸善. September 4, 1992.ISBN 978-4-621-03681-5 – via Google Books.
  32. ^Takashi, Fujio (September 16, 1988)."HDTV (hi-vision): (1). A short history of HDTV (hi-vision)".The Journal of the Institute of Television Engineers of Japan.42 (6):570–578.doi:10.3169/itej1978.42.570 – via cir.nii.ac.jp.
  33. ^NHK (1993).High Definition Television - Hi Vision Technology. Springer.ISBN 0-442-00798-1.
  34. ^Naganawa, K.; Hori, Y.; Yanase, S.; Itoh, N.; Asano, Y. (August 19, 1991). "A single-chip audio signal processor for HDTV receiver".IEEE Transactions on Consumer Electronics.37 (3):677–683.Bibcode:1991ITCE...37..677N.doi:10.1109/30.85585.S2CID 62603128.
  35. ^"NEC Research & Development". Nippon Electric Company. September 17, 1990 – via Google Books.
  36. ^ニューメディア用語辞典. 日本放送出版協会. September 17, 1988.ISBN 978-4-14-011046-1 – via Google Books.
  37. ^ab"帯域圧縮テレビジョン信号の受信装置".
  38. ^中川, 一三夫 (September 17, 1986)."Muse方式高品位テレビ信号の周波数解析".テレビジョン学会誌.40 (11):1126–1132.doi:10.3169/itej1978.40.1126 – via J-Stage.
  39. ^情報通信年鑑. 情報通信総合研究所. September 17, 1994.ISBN 978-4-915724-20-6 – via Google Books.
  40. ^日本放送協会年艦. 日本放送出版協会. September 17, 1984.ISBN 978-4-14-007135-9 – via Google Books.
  41. ^Chiariglione, Leonardo (September 17, 1988).Signal Processing of HDTV: Proceedings of the Second International Workshop on Signal Processing of HDTV, L'Aquila, Italy, 29 February-2 March 1988. North-Holland.ISBN 978-0-444-70518-1 – via Google Books.
  42. ^"Conference Record". IEEE. September 17, 1989 – via Google Books.
  43. ^"GLOBECOM '85: Conference Record". IEEE. September 17, 1985 – via Google Books.
