GeForce 2 series

From Wikipedia, the free encyclopedia
For GeForce cards with a model number of 2X0, see GeForce 200 series. For GeForce cards with a model number of 20X0, see GeForce 20 series.

Series of GPUs by Nvidia
GeForce 2 series

Top: Logo of the GeForce 2 series
Bottom: Nvidia GeForce 2 GTS (Asus Branded) with its cooler removed, showing the NV15 die
Release date: mid-May 2000[1]
Codename: NV11, NV15, NV16
Architecture: Celsius
Models:
  • GeForce MX series
  • GeForce GTS series
  • GeForce Pro series
  • GeForce Ti series
  • GeForce Ultra series
Cards:
  • Entry-level: MX
  • Mid-range: GTS, Pro
  • High-end: Ti, Ultra
API support:
  • DirectX: Direct3D 7.0
  • OpenGL: OpenGL 1.2 (T&L)
History:
  • Predecessor: GeForce 256
  • Successor: GeForce 3 series
Support status: Unsupported

The GeForce 2 series (NV15) is the second generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in 2000, it is the successor to the GeForce 256.

The GeForce 2 family comprised a number of models. The GeForce 2 GTS, GeForce 2 Ultra, GeForce 2 Pro, and GeForce 2 Ti are based upon the original architecture (NV15), varying only by chip and memory clock speeds. For the low-end segment and OEMs, the GeForce 2 MX series (NV11) was created, from which the GeForce 2 Go was derived for laptops. In addition, the GeForce 2 architecture is used for the Quadro series on the Quadro 2 Pro, 2 MXR, and 2 EX cards, with special drivers meant to accelerate computer-aided design applications.

Architecture

GeForce2 Ultra GPU
Die shot of a GeForce 2 GPU

The GeForce 2 architecture (NV15) is similar to the previous GeForce 256 line but with various improvements. Compared to the 220 nm GeForce 256, the GeForce 2 is built on a 180 nm manufacturing process, making the silicon denser and allowing for more transistors and a higher clock speed. The most significant change for 3D acceleration is the addition of a second texture mapping unit (TMU) to each of the four pixel pipelines. Some say[who?] the second TMU was already present in the original GeForce's NSR (Nvidia Shading Rasterizer) but that dual-texturing was disabled due to a hardware bug; the NSR's unique ability to do single-cycle trilinear texture filtering supports this suggestion. The second TMU doubles the texture fillrate per clock compared to the previous generation and is the reasoning behind the GeForce 2 GTS's naming suffix: GigaTexel Shader (GTS). The GeForce 2 also formally introduces the NSR, a primitive type of programmable pixel pipeline that is somewhat similar to later pixel shaders. This functionality is also present in the GeForce 256 but was unpublicized. Another hardware enhancement is an upgraded video processing pipeline, called HDVP (high definition video processor). HDVP supports motion video playback at HDTV resolutions (MP@HL).[2]
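
The fillrate arithmetic behind the "GigaTexel" name follows directly from the clock speed and pipeline layout. The sketch below is only an illustration of the theoretical peaks, using the pipeline counts described above, the GTS clock from the models table further down, and the commonly cited 120 MHz clock of the GeForce 256; real-world throughput is lower for the bandwidth reasons discussed below.

```python
# Peak texel fillrate = core clock (MHz) x pixel pipelines x TMUs per pipeline,
# giving the result in MTexels/s. Theoretical peaks, not measured rates.

def texel_fillrate_mtexels(core_clock_mhz: float, pipelines: int, tmus_per_pipe: int) -> float:
    return core_clock_mhz * pipelines * tmus_per_pipe

# GeForce 256: four pipelines, one TMU each, at 120 MHz
geforce256 = texel_fillrate_mtexels(120, 4, 1)     # 480 MTexels/s
# GeForce 2 GTS: four pipelines, two TMUs each, at 200 MHz
geforce2_gts = texel_fillrate_mtexels(200, 4, 2)   # 1600 MTexels/s = 1.6 GTexels/s, hence "GigaTexel Shader"

print(geforce256, geforce2_gts)
```

The second TMU alone doubles the per-clock texel rate; the higher core clock then raises the absolute figure further.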

In 3D benchmarks and gaming applications, the GeForce 2 GTS outperforms its predecessor by up to 40%.[3] In OpenGL games (such as Quake III), the card outperforms the ATI Radeon DDR and 3dfx Voodoo 5 5500 cards in both 16 bpp and 32 bpp display modes. However, in Direct3D games running at 32 bpp, the Radeon DDR is sometimes able to take the lead.[4]

The GeForce 2 (NV15) architecture is quite memory-bandwidth constrained.[5] The GPU wastes memory bandwidth and pixel fillrate due to unoptimized z-buffer usage, drawing of hidden surfaces, and a relatively inefficient RAM controller. The main competitor of the GeForce 2 GTS, the Radeon DDR (R100), has hardware functions (called HyperZ) that address these issues.[6] Because of their inefficiency, the GeForce 2 GPUs could not approach their theoretical performance potential, and the Radeon, even with its significantly less powerful 3D architecture, offered strong competition. The later NV17 revision of the NV11 design, used in the GeForce4 MX, was more efficient.
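
A rough back-of-the-envelope calculation illustrates the bandwidth ceiling described above. The sketch below assumes each pixel written costs a 32-bit colour write plus a 32-bit Z read and write (a deliberate simplification that ignores texture fetches and overdraw); under that assumption the GTS's memory system can feed only a fraction of its theoretical pixel fillrate.

```python
# Why the GeForce 2 GTS is memory-bandwidth limited (illustrative only).
# Assumption: 12 bytes of framebuffer traffic per pixel
# (4-byte colour write + 4-byte Z read + 4-byte Z write). Real traffic also
# includes texture fetches, so the practical limit is even tighter.

MEM_CLOCK_MHZ = 166            # DDR memory: two transfers per clock
BUS_WIDTH_BITS = 128

peak_bandwidth_gbs = MEM_CLOCK_MHZ * 2 * BUS_WIDTH_BITS / 8 / 1000       # about 5.3 GB/s
bytes_per_pixel = 12
bandwidth_limited_mpixels = peak_bandwidth_gbs * 1000 / bytes_per_pixel  # about 443 MPixels/s

theoretical_mpixels = 4 * 200  # four pipelines at 200 MHz = 800 MPixels/s
print(peak_bandwidth_gbs, bandwidth_limited_mpixels, theoretical_mpixels)
```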

Releases


The first models to arrive after the original GeForce 2 GTS were the GeForce 2 Ultra and GeForce2 MX, launched on September 7, 2000.[7] On September 29, 2000, Nvidia started shipping graphics cards with 16 and 32 MB of video memory.

Architecturally identical to the GTS, the Ultra simply has higher core and memory clock rates. A niche product, the GeForce 2 Ultra was rumored to have been intended to prevent 3dfx from taking the performance lead with their Voodoo 5 6000, which ended up never being released as 3dfx went bankrupt. The Ultra actually outperforms the first GeForce 3 products in some cases, because initial GeForce 3 cards had significantly lower fillrate. However, the Ultra loses its lead when anti-aliasing is enabled, because of the GeForce 3's new memory bandwidth/fillrate efficiency mechanisms, and the GeForce 3 has a superior next-generation feature set with programmable vertex and pixel shaders for DirectX 8.0 games.

The GeForce 2 Pro, introduced shortly after the Ultra, was an alternative to the expensive top-line Ultra and is faster than the GTS.

In October 2001, the GeForce 2 Ti was positioned as a cheaper and less advanced alternative to the GeForce 3. Faster than the GTS and Pro but slower than the Ultra, the GeForce 2 Ti performed competitively against the Radeon 7500 (RV200), although the 7500 had the advantage of dual-display support. This mid-range GeForce 2 release was replaced by the GeForce4 MX series as the budget/performance choice in January 2002.

On its 2001 product web page, Nvidia initially listed the Ultra separately from the rest of the GeForce 2 lineup (GTS, Pro, Ti). By late 2002, with the GeForce 2 considered a discontinued product line (succeeded by the GeForce4 MX), the Ultra was listed alongside the GTS, Pro, and Ti on the GeForce 2 information page.

GeForce 2 MX

GeForce 2 MX200 AGP
Die shot of the MX400 GPU

Since the previous GeForce 256 line shipped without a budget variant, the RIVA TNT2 series was left to fill the "low-end" role, albeit with a comparatively obsolete feature set. In order to create a better low-end option, Nvidia created the GeForce 2 MX series (NV11), which offered a set of standard features similar to the regular GeForce 2 (NV15), limited only by its lower performance tier. To reduce production costs, the GeForce 2 MX cards had two of the 3D pixel pipelines removed and reduced memory bandwidth. The cards used either SDR SDRAM or DDR SDRAM with memory bus widths ranging from 32 to 128 bits, allowing circuit board cost to be varied, as sketched below. The MX series also provided dual-display support, something not found in the GeForce 256 or GeForce 2. With performance approaching the GeForce 256 while being much more economical to produce, the GeForce 2 MX was successful in the OEM and budget markets.
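
Since memory type and bus width were the main cost levers on MX boards, attainable bandwidth varied widely between configurations. A small sketch, using the memory clocks and bus widths from the models table below, of how some common board layouts compare (the GTS is included only as a reference point):

```python
# Peak memory bandwidth in GB/s for some GeForce 2 MX board configurations.
# bandwidth = memory clock (MHz) x (2 if DDR else 1) x bus width (bits) / 8 / 1000

def bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int, ddr: bool) -> float:
    return mem_clock_mhz * (2 if ddr else 1) * bus_width_bits / 8 / 1000

configs = {
    "MX200, 64-bit SDR": bandwidth_gbs(166, 64, ddr=False),    # ~1.3 GB/s
    "MX, 128-bit SDR":   bandwidth_gbs(166, 128, ddr=False),   # ~2.7 GB/s
    "MX400, 64-bit DDR": bandwidth_gbs(166, 64, ddr=True),     # ~2.7 GB/s
    "GTS, 128-bit DDR":  bandwidth_gbs(166, 128, ddr=True),    # ~5.3 GB/s (for comparison)
}
for name, gbs in configs.items():
    print(f"{name}: {gbs:.3f} GB/s")
```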

The prime competitors in the OEM and budget segment were ATI's Radeon SDR (which, along with all other R100-based cards regardless of clock speed and memory configuration, was later renamed collectively as Radeon 7200), the Radeon VE (RV100, later renamed Radeon 7000), and the 3dfx Voodoo4 4500.[8] Sharing the same R100 GPU as the higher-end Radeon 32MB DDR (US$230), the Radeon SDR (US$150) was equipped with SDR SDRAM instead of the DDR SDRAM found in its more expensive sibling, although this did not bring down costs sufficiently to match the GeForce 2 MX.[9] Released three months after the GeForce 2 MX, the Radeon SDR lacked multi-monitor support but offered faster 32-bit 3D rendering than the GeForce 2 MX.[10] 3dfx's Voodoo4 4500 arrived too late, was too expensive at US$150, yet too slow to compete with Nvidia's or ATI's offerings, and also lacked multi-monitor support. The Radeon VE's RV100 GPU was cut down considerably from the R100 to reduce production costs, so it did not offer hardware T&L, an emerging 3D rendering feature of the day that was the major attraction of Direct3D 7. Further, the Radeon VE featured only a single rendering pipeline, giving it a substantially lower fillrate than the GeForce 2 MX. However, the Radeon VE (US$100) had the advantage of somewhat better dual-monitor display software while matching the GeForce 2 MX on price.[11]

Members of the series include the GeForce 2 MX, MX400, MX200, and MX100. The GPU was also used as an integrated graphics processor in the nForce chipset line and as a mobile graphics chip for notebooks, called the GeForce 2 Go.

The Nvidia GeForce2 MX400 is often considered underwhelming because of its limited capabilities: typically just 32 MB of SDR memory, an architecture based on the aging Celsius design, a low 200 MHz GPU clock, and support for only DirectX 7.0. By today's standards it is outdated and insufficient for most graphics-intensive tasks.[12][13][14]

Successor


The successor to the GeForce 2 (non-MX) line is the GeForce 3. The non-MX GeForce 2 line was reduced in price and saw the addition of the GeForce 2 Ti, in order to offer a mid-range alternative to the high-end GeForce 3 product.

Later, both the GeForce 2 and GeForce 2 MX lines were replaced by the GeForce4 MX.

Models

Main article: List of Nvidia graphics processing units § GeForce2 series
Further information: Celsius (microarchitecture)
  • All models support TwinView Dual-Display Architecture, Second Generation Transform and Lighting (T&L)
  • GeForce2 MX models support Digital Vibrance Control (DVC)
| Model | Launch | Code name | Fab | Transistors (million) | Die size (mm²) | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOps/s / MPixels/s / MTexels/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) | Performance (GFLOPS FP32) | TDP (W) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce2 MX IGP + nForce 220/420 | June 4, 2001 | NV1A (IGP) / NV11 (MX) | TSMC 180 nm | 20[16] | 64 | FSB | 175 | 133 | 2:4:2 | 350 / 350 / 700 | up to 32 (system RAM) | 2.128 / 4.256 | DDR | 64 / 128 | 0.700 | 3 |
| GeForce2 MX200 | March 3, 2001 | NV11 | TSMC 180 nm | 20 | 64 | AGP 4x, PCI | 175 | 166 | 2:4:2 | 350 / 350 / 700 | 32 / 64 | 1.328 | SDR | 64 | 0.700 | 1 |
| GeForce2 MX | June 28, 2000 | NV11 | TSMC 180 nm | 20 | 64 | AGP 4x, PCI | 175 | 166 | 2:4:2 | 350 / 350 / 700 | 32 / 64 | 2.656 | SDR | 128 | 0.700 | 4 |
| GeForce2 MX400 | March 3, 2001 | NV11 | TSMC 180 nm | 20 | 64 | AGP 4x, PCI | 200 | 166 or 200 (SDR); 166 (DDR) | 2:4:2 | 400 / 400 / 800 | 32 / 64 | 1.328 / 3.200 / 2.656 | SDR, DDR | 64 or 128 (SDR); 64 (DDR) | 0.800 | 5 |
| GeForce2 GTS | April 26, 2000 | NV15 | TSMC 180 nm | 25[17] | 88 | AGP 4x | 200 | 166 | 4:8:4 | 800 / 800 / 1,600 | 32 / 64 | 5.312 | DDR | 128 | 1.600 | 6 |
| GeForce2 Pro | December 5, 2000 | NV15 | TSMC 180 nm | 25 | 88 | AGP 4x | 200 | 200 | 4:8:4 | 800 / 800 / 1,600 | 32 / 64 | 6.4 | DDR | 128 | 1.600 | ? |
| GeForce2 Ti | October 1, 2001 | NV15 | TSMC 150 nm | 25 | 88 | AGP 4x | 250 | 200 | 4:8:4 | 1,000 / 1,000 / 2,000 | 32 / 64 | 6.4 | DDR | 128 | 2.000 | ? |
| GeForce2 Ultra | August 14, 2000 | NV15 | TSMC 180 nm | 25 | 88 | AGP 4x | 250 | 230 | 4:8:4 | 1,000 / 1,000 / 2,000 | 64 | 7.36 | DDR | 128 | 2.000 | ? |

  a. ^ Pixel pipelines : texture mapping units : render output units
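
The FP32 performance column above is consistent with counting two floating-point operations per pixel pipeline per clock. This is an inference from the table's own figures rather than an official Nvidia specification; a minimal check:

```python
# FP32 column of the table, assuming 2 FLOPs per pixel pipeline per clock
# (an inference from the table's figures, not an official specification).

def fp32_gflops(core_clock_mhz: float, pixel_pipelines: int) -> float:
    return core_clock_mhz * pixel_pipelines * 2 / 1000

assert fp32_gflops(175, 2) == 0.700   # GeForce2 MX / MX200
assert fp32_gflops(200, 2) == 0.800   # GeForce2 MX400
assert fp32_gflops(200, 4) == 1.600   # GeForce2 GTS / Pro
assert fp32_gflops(250, 4) == 2.000   # GeForce2 Ti / Ultra
```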

GeForce2 Go mobile GPU series

  • Mobile GPUs are either soldered to the mainboard or to some Mobile PCI Express Module (MXM).
  • All models are manufactured on a 180 nm process
| Model | Launch | Code name | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOps/s / MPixels/s / MTexels/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce2 Go 100 | February 6, 2001 | NV11M | AGP 4x | 125 | 332 | 2:0:4:2 | 250 / 250 / 500 | 8, 16 | 1.328 | DDR | 32 |
| GeForce2 Go | November 11, 2000 | NV11M | AGP 4x | 143 | 166 (SDR); 332 (DDR) | 2:0:4:2 | 286 / 286 / 572 | 16, 32 | 2.656 | SDR, DDR | 128 (SDR); 64 (DDR) |
| GeForce2 Go 200 | February 6, 2001 | NV11M | AGP 4x | 143 | 332 | 2:0:4:2 | 286 / 286 / 572 | 16, 32 | 2.656 | DDR | 64 |

  a. ^ Pixel shaders : vertex shaders : texture mapping units : render output units

Discontinued support

Nvidia GeForce2 Ultra

Nvidia has ceased driver support for the GeForce 2 series, ending with the GTS, Pro, Ti, and Ultra models in 2005 and with the MX models in 2007.

Final drivers


GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ti and GeForce 2 Ultra:

GeForce 2 MX & MX x00 Series:

  • Windows 9x & Windows Me: 81.98, released on December 21, 2005; Download; Product Support List Windows 95/98/Me – 81.98.
    • Driver version 81.98 was the last driver Nvidia ever released for Windows 9x/Me; no official releases were made for these systems afterwards.
  • Windows 2000, 32-bit Windows XP & Media Center Edition: 93.71, released on November 2, 2006; Download (products supported list also on this page).
  • Linux 32-bit: 96.43.23, released on September 14, 2012; Download

The drivers for Windows 2000/XP can also be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.

Note: Despite claims in the documentation that 94.24 (released on May 17, 2006) supports the GeForce 2 MX series, it does not; 94.24 actually supports only the GeForce 6 and GeForce 7 series.[18]

Windows 95/98/Me Driver Archive
Windows XP/2000 Driver Archive
Unix Driver Archive

Competing chipsets


See also


References

  1. ^ Ross, Alex (April 26, 2000). "NVIDIA GeForce2 GTS Guide". SharkyExtreme. Archived from the original on August 23, 2004.
  2. ^ Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". Anandtech. p. 2. Retrieved July 2, 2009.
  3. ^ Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". Anandtech. Retrieved June 14, 2008.
  4. ^ Witheiler, Matthew (July 17, 2000). "ATI Radeon 64MB DDR". Anandtech. Retrieved June 14, 2008.
  5. ^ Lal Shimpi, Anand (August 14, 2000). "NVIDIA GeForce 2 Ultra". Anandtech. Retrieved June 14, 2008.
  6. ^ Lal Shimpi, Anand (April 25, 2000). "ATI Radeon 256 Preview (HyperZ)". Anandtech. p. 5. Retrieved June 14, 2008.
  7. ^ "Press Release - NVIDIA". www.nvidia.com. Retrieved April 22, 2018.
  8. ^ Witheiler, Matthew. "ATI Radeon VE 32MB". www.anandtech.com.
  9. ^ Shimpi, Anand Lal. "ATI Radeon 32MB SDR". www.anandtech.com.
  10. ^ FastSite (December 27, 2000). "ATI RADEON 32MB SDR Review". X-bit labs. Archived from the original on July 25, 2008. Retrieved June 14, 2008.
  11. ^ Witheiler, Matthew. "ATI Radeon VE 32MB". www.anandtech.com.
  12. ^ "NVIDIA GeForce2 MX Specs". TechPowerUp. February 12, 2025. Retrieved February 12, 2025.
  13. ^ Witheiler, Matthew. "NVIDIA GeForce2 MX 400 64MB". www.anandtech.com. Retrieved February 12, 2025.
  14. ^ Silvino Orozco (June 29, 2000). "Full Review of NVIDIA's GeForce2 MX". Tom's Hardware. Retrieved February 12, 2025.
  15. ^ "3D accelerator database". Vintage 3D. Archived from the original on October 23, 2018. Retrieved August 30, 2024.
  16. ^ "NVIDIA GeForce2 MX PCI Specs". TechPowerUp. Retrieved August 30, 2024.
  17. ^ "NVIDIA NV15 GPU Specs | TechPowerUp GPU Database". Retrieved August 30, 2024.
  18. ^ "Driver Details". NVIDIA.

External links

Wikimedia Commons has media related to GeForce 2 series.