High-dynamic-range rendering

From Wikipedia, the free encyclopedia
Not to be confused with High dynamic range video.
Rendering a computer graphics scene
A comparison of the standard fixed-aperture rendering (left) with the HDR rendering (right) in the video game Half-Life 2: Lost Coast. The HDR rendering was tone-mapped to SDR for broad compatibility with almost all displays.

High-dynamic-range rendering (HDRR or HDR rendering), also known as high-dynamic-range lighting, is the rendering of computer graphics scenes by using lighting calculations done in high dynamic range (HDR). This allows preservation of details that may be lost due to limiting contrast ratios. Video games and computer-generated imagery in films and visual effects benefit from this, as it creates more realistic scenes than simpler lighting models. HDRR originally required tone mapping the rendered image onto standard-dynamic-range (SDR) displays, as the first HDR-capable displays did not arrive until the 2010s. If a modern HDR display is available, however, the HDR rendering can instead be displayed with even greater contrast and realism.

Graphics processor company Nvidia summarizes the motivation for HDRR in three points: bright things can be really bright, dark things can be really dark, and details can be seen in both.[1]

History


The use of high-dynamic-range imaging (HDRI) in computer graphics was introduced by Greg Ward in 1985 with his open-source Radiance rendering and lighting simulation software, which created the first file format to retain a high-dynamic-range image. HDRI then languished for more than a decade, held back by limited computing power, storage, and capture methods; only with later advances in hardware did the technology come into practical use.[2][3]

In 1990, Eihachiro Nakamae and associates presented a lighting model for driving simulators that highlighted the need for high-dynamic-range processing in realistic simulations.[4]

In 1995, Greg Spencer presented Physically-based glare effects for digital images at SIGGRAPH, providing a quantitative model for flare and blooming in the human eye.[5]

In 1997, Paul Debevec presented Recovering high dynamic range radiance maps from photographs[6] at SIGGRAPH, and the following year presented Rendering synthetic objects into real scenes.[7] These two papers laid the framework for creating HDR light probes of a location, and then using this probe to light a rendered scene.

HDRI and HDRL (high-dynamic-range image-based lighting) have since been used in many situations where inserting a 3D object into a real environment requires light-probe data to provide a realistic lighting solution.

In gaming applications, Riven: The Sequel to Myst used an HDRI post-processing shader in 1997, directly based on Spencer's paper.[8] After E3 2003, Valve released a demo movie of their Source engine rendering a cityscape in high dynamic range.[9] The term gained much wider attention at E3 2004, when Epic Games showcased Unreal Engine 3 and Valve announced Half-Life 2: Lost Coast (released in 2005), alongside open-source engines such as OGRE 3D and open-source games like Nexuiz.

In the 2010s, HDR displays first became available. With their higher contrast ratios, HDRR can reduce or eliminate tone mapping, resulting in an even more realistic image.

Examples


One of the primary advantages of HDR rendering is that details in a scene with a large contrast ratio are preserved. Without HDRR, areas that are too dark are clipped to black and areas that are too bright are clipped to white, represented by the hardware as floating-point values of 0.0 and 1.0 for pure black and pure white, respectively.
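The clipping behaviour can be sketched in a few lines of Python (the function name is illustrative, not from any rendering API):

```python
def clamp_ldr(radiance):
    """An LDR framebuffer clips radiance to the displayable [0.0, 1.0] range."""
    return max(0.0, min(1.0, radiance))

# Two very different bright values collapse to the same pure white:
print(clamp_ldr(1.5))   # 1.0
print(clamp_ldr(50.0))  # 1.0 -> the large difference between them is lost

# An HDR framebuffer (e.g. 16-bit floating point) keeps 1.5 and 50.0
# distinct, so later passes can still recover the detail between them.
```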

Another aspect of HDR rendering is the addition of perceptual cues which increase apparent brightness. HDR rendering also affects how light is preserved in optical phenomena such as reflections and refractions, as well as transparent materials such as glass. In LDR rendering, very bright light sources in a scene (such as the sun) are capped at 1.0. When this light is reflected, the result must then be less than or equal to 1.0. However, in HDR rendering, very bright light sources can exceed the 1.0 brightness to simulate their actual values. This allows reflections off surfaces to maintain realistic brightness for bright light sources.
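A minimal sketch of the reflection case described above, assuming a surface that reflects 30% of incoming light (all values are illustrative):

```python
SUN = 10.0         # HDR radiance of a bright source, well above display white
REFLECTANCE = 0.3  # fraction of incoming light the surface reflects

# LDR: the source is capped at 1.0 before the bounce,
# so the reflection comes out dull.
ldr_reflection = min(SUN, 1.0) * REFLECTANCE

# HDR: the bounce is computed on the uncapped value, so the reflection
# is still brighter than display white, as it physically should be.
hdr_reflection = SUN * REFLECTANCE

print(ldr_reflection)  # 0.3
print(hdr_reflection)  # 3.0
```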

Limitations and compensations


Human eye


The human eye can perceive scenes with a very high dynamic contrast ratio, around 1,000,000:1. Adaptation is achieved in part through adjustments of the iris and slow chemical changes, which take some time (e.g. the delay in being able to see when switching from bright lighting to pitch darkness). At any given time, the eye's static range is smaller, around 10,000:1. However, this is still higher than the static range of most display technology.[citation needed]

Output to displays


Although many manufacturers claim very high numbers, plasma displays, liquid-crystal displays, and CRT displays can deliver only a fraction of the contrast ratio found in the real world, and these figures are usually measured under ideal conditions.[citation needed] The simultaneous contrast of real content under normal viewing conditions is significantly lower.

Some increase in dynamic range in LCD monitors can be achieved by automatically reducing the backlight for dark scenes. For example, LG calls this technology "Digital Fine Contrast";[10] Samsung describes it as "dynamic contrast ratio". Another technique is to have an array of brighter and darker LED backlights, for example with systems developed by BrightSide Technologies.[11]

OLED displays have better dynamic-range capabilities than LCDs, similar to plasma but with lower power consumption. Rec. 709 defines the color space for HDTV, and Rec. 2020 defines a larger but still incomplete color space for ultra-high-definition television.

Since the 2010s, OLED and other HDR display technologies have reduced or eliminated the need for tone mapping HDRR to standard dynamic range.

Light bloom

Main article: Bloom (shader effect)

Light blooming is the result of scattering in the human lens, which the human brain interprets as a bright spot in a scene. For example, a bright light in the background will appear to bleed onto objects in the foreground. This can be used to create an illusion that makes the bright spot appear brighter than it really is.[5]

Flare

Main article: Lens flare

Flare is the diffraction of light in the human lens, resulting in "rays" of light emanating from small light sources, and can also result in some chromatic effects. It is most visible on point light sources because of their small visual angle.[5]

Typical display devices cannot display light as bright as the Sun, and ambient room lighting prevents them from displaying true black. Thus, HDR rendering systems have to map the full dynamic range of what the eye would see in the rendered situation onto the capabilities of the device. This tone mapping is done relative to what the virtual scene camera sees, combined with several full-screen effects, e.g. to simulate dust in the air which is lit by direct sunlight in a dark cavern, or the scattering in the eye.

Tone mapping and blooming shaders can be used together to help simulate these effects.

Tone mapping

Main article: Tone mapping

Tone mapping, in the context of graphics rendering, is a technique used to map colors from high dynamic range (in which lighting calculations are performed) to a lower dynamic range that matches the capabilities of the desired display device. Typically, the mapping is non-linear – it preserves enough range for dark colors and gradually limits the dynamic range for bright colors. This technique often produces visually appealing images with good overall detail and contrast. Various tone mapping operators exist, ranging from simple real-time methods used in computer games to more sophisticated techniques that attempt to imitate the perceptual response of the human visual system.
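As a concrete illustration, the Reinhard global operator is one of the simplest such non-linear curves; a sketch in Python:

```python
def reinhard(luminance):
    """Reinhard global operator: maps HDR luminance [0, inf) into [0, 1)."""
    return luminance / (1.0 + luminance)

# Dark values pass through almost unchanged, bright values are compressed
# progressively harder, and nothing ever clips at 1.0:
for L in (0.05, 1.0, 10.0, 1000.0):
    print(f"{L:8.2f} -> {reinhard(L):.4f}")
```

Note how the curve spends most of its output range on the darker inputs, matching the "preserves enough range for dark colors" behaviour described above.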

HDR displays with higher dynamic range capabilities can reduce or eliminate the tone mapping required after HDRR, resulting in an even more realistic image.

Applications in computer entertainment


HDRR has been prevalent in games, primarily for PCs, Microsoft's Xbox 360, and Sony's PlayStation 3. It has also been simulated on the PlayStation 2, GameCube, Xbox and Amiga systems. Sproing Interactive Media announced that their Athena game engine for the Wii would support HDRR, adding the Wii to the list of systems that support it.

In desktop publishing and gaming, color values are often processed several times over. As this includes multiplication and division (which can accumulate rounding errors), it is useful to have the extended accuracy and range of 16-bit integer or 16-bit floating-point formats. This is useful irrespective of the hardware limitations discussed above.
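The rounding-error accumulation can be demonstrated directly; a sketch assuming an 8-bit buffer that re-quantizes after every operation (the helper name is made up for illustration):

```python
def quantize8(x):
    """Round a [0, 1] colour value to the nearest 8-bit step, as an
    8-bit framebuffer would store it after each processing pass."""
    return round(max(0.0, min(1.0, x)) * 255) / 255

v8 = vf = 0.5
for _ in range(5):
    v8 = quantize8(quantize8(v8 * 0.1) * 10.0)  # darken, then brighten back
    vf = (vf * 0.1) * 10.0                      # same maths in floating point

print(v8)  # has drifted visibly away from 0.5
print(vf)  # still ~0.5: the floating-point round trip is essentially lossless
```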

Development of HDRR through DirectX


Complex shader effects began with the release of Shader Model 1.0 in DirectX 8, which illuminated 3D worlds with what is called standard lighting. Standard lighting, however, had two problems:

  1. Lighting precision was confined to 8-bit integers, which limited the contrast ratio to 256:1. Using the HSV color model, the value (V), or brightness, of a color has a range of 0–255. This means the brightest white (a value of 255) is only 255 levels brighter than the darkest shade above pure black (a value of 1).
  2. Lighting calculations were integer-based, which limited accuracy because the real world is not confined to whole numbers.
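The ceiling in point 1 follows from the ratio between the brightest representable level and the dimmest non-black level; a quick check in Python:

```python
# On an 8-bit scale, brightness spans 256 levels: 0 (black) to 255 (white).
darkest_visible = 1 / 255   # the dimmest value above pure black
brightest = 255 / 255       # pure white
print(round(brightest / darkest_visible))  # 255, commonly quoted as 256:1
```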

On December 24, 2002, Microsoft released a new version of DirectX. DirectX 9.0 introduced Shader Model 2.0, which offered one of the necessary components to enable rendering of high-dynamic-range images: lighting precision was no longer limited to just 8 bits. Although 8 bits was the minimum in applications, programmers could choose up to a maximum of 24 bits for lighting precision. However, all calculations were still integer-based. One of the first graphics cards to support DirectX 9.0 natively was ATI's Radeon 9700, though the effect wasn't programmed into games for years afterwards. On August 23, 2003, Microsoft updated DirectX to DirectX 9.0b, which enabled the Pixel Shader 2.x (Extended) profile for ATI's Radeon X series and NVIDIA's GeForce FX series of graphics processing units.

On August 9, 2004, Microsoft updated DirectX once more to DirectX 9.0c. This also exposed the Shader Model 3.0 profile for High-Level Shader Language (HLSL). Shader Model 3.0's lighting precision has a minimum of 32 bits, as opposed to 2.0's 8-bit minimum, and all lighting-precision calculations are floating-point based. NVIDIA states that contrast ratios using Shader Model 3.0 can be as high as 65535:1 using 32-bit lighting precision. At first, HDRR was only possible on video cards capable of Shader Model 3.0 effects, but software developers soon added compatibility for Shader Model 2.0. Note that what is referred to as Shader Model 3.0 HDR is actually achieved by FP16 blending. FP16 blending is not part of Shader Model 3.0, but is supported mostly by cards also capable of Shader Model 3.0 (exceptions include the GeForce 6200 series). FP16 blending can be used as a faster way to render HDR in video games.

Shader Model 4.0, a feature of DirectX 10 (released with Windows Vista), allows 128-bit HDR rendering, as opposed to 64-bit HDR in Shader Model 3.0 (although 128-bit is theoretically possible under Shader Model 3.0 as well).

Shader Model 5.0, a feature of DirectX 11, allows 6:1 compression of HDR textures without the noticeable loss that was prevalent in the texture-compression techniques of previous DirectX versions.

Development of HDRR through OpenGL


It is possible to implement HDRR through GLSL shaders from OpenGL 1.4 onwards.

Game engines that support HDR rendering



References

  1. ^ Simon Green and Cem Cebenoyan (2004). "High Dynamic Range Rendering (on the GeForce 6800)" (PDF). GeForce 6 Series. Nvidia. p. 3.
  2. ^ Reinhard, Erik; Greg Ward; Sumanta Pattanaik; Paul Debevec (August 2005). High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting. Westport, Connecticut: Morgan Kaufmann. ISBN 978-0-12-585263-0.
  3. ^ Greg Ward. "High Dynamic Range Imaging" (PDF). anywhere.com. Retrieved 18 August 2009.
  4. ^ Nakamae, Eihachiro; Kaneda, Kazufumi; Okamoto, Takashi; Nishita, Tomoyuki (1990). "A lighting model aiming at drive simulators". Proceedings of the 17th annual conference on Computer graphics and interactive techniques. pp. 395–404. doi:10.1145/97879.97922. ISBN 978-0201509335. S2CID 11880939.
  5. ^ a b c Spencer, Greg; Shirley, Peter; Zimmerman, Kurt; Greenberg, Donald P. (1995). "Physically-based glare effects for digital images". Proceedings of the 22nd annual conference on Computer graphics and interactive techniques – SIGGRAPH '95. p. 325. CiteSeerX 10.1.1.41.1625. doi:10.1145/218380.218466. ISBN 978-0897917018. S2CID 17643910.
  6. ^ Paul E. Debevec and Jitendra Malik (1997). "Recovering high dynamic range radiance maps from photographs". Proceedings of the 24th annual conference on Computer graphics and interactive techniques – SIGGRAPH '97. pp. 369–378. doi:10.1145/258734.258884. ISBN 0897918967.
  7. ^ Paul E. Debevec (1998). "Rendering synthetic objects into real scenes: Bridging traditional and image-based graphics with global illumination and high dynamic range photography". Proceedings of the 25th annual conference on Computer graphics and interactive techniques – SIGGRAPH '98. pp. 189–198. doi:10.1145/280814.280864. ISBN 0897919998.
  8. ^ Forcade, Tim (February 1998). "Unraveling Riven". Computer Graphics World.
  9. ^ Valve (2003). "Half-Life 2: Source DirectX 9.0 Effects Trailer (2003)". YouTube. Archived from the original on 2021-12-21.
  10. ^ Digital Fine Contrast
  11. ^ BrightSide Technologies is now part of Dolby – Archived 2007-09-10 at the Wayback Machine
  12. ^ "Rendering – Features – Unreal Technology". Epic Games. 2006. Archived from the original on 2011-03-07. Retrieved 2011-03-15.
  13. ^ "SOURCE – RENDERING SYSTEM". Valve. 2007. Archived from the original on 2011-03-23. Retrieved 2011-03-15.
  14. ^ "The Amazing Technology of The Witcher 3". PC Gamer. 2015. Retrieved 2016-05-08.
  15. ^ "FarCry 1.3: Crytek's Last Play Brings HDR and 3Dc for the First Time". X-bit Labs. 2004. Archived from the original on 2008-07-24. Retrieved 2011-03-15.
  16. ^ "CryEngine 2 – Overview". Crytek. 2011. Retrieved 2011-03-15.
  17. ^ Pereira, Chris (December 3, 2016). "Kojima Partnering With Killzone, Horizon Dev Guerrilla for Death Stranding". GameSpot. CBS Interactive. Archived from the original on December 4, 2019. Retrieved December 3, 2016.
  18. ^ "Unigine Engine – Unigine (advanced 3D engine for multi-platform games and virtual reality systems)". Unigine Corp. 2011. Retrieved 2011-03-15.
  19. ^ "BabylonDoc". Archived from the original on 2015-07-04. Retrieved 2015-07-03.
  20. ^ "MIT Licensed Open Source version of Torque 3D from GarageGames: GarageGames/Torque3D". GitHub. 2019-08-22.

Retrieved from "https://en.wikipedia.org/w/index.php?title=High-dynamic-range_rendering&oldid=1324309649"