Glaze3D was a family of graphics cards announced by BitBoys Oy on August 2, 1999, that was claimed to deliver substantially better performance than other consumer[1] products available at the time. The family, which would have come in Glaze3D 1200, Glaze3D 2400 and Glaze3D 4800 models, was supposed to offer full support for DirectX 7, OpenGL 1.2, AGP 4×, 4× anisotropic filtering, full-screen anti-aliasing and a host of other technologies not commonly seen at the time. The 1.5-million-gate[1] GPU would have been fabricated by Infineon on a 0.2 μm eDRAM process,[1] later to be reduced to 0.17 μm, with a minimum of 9 MB of embedded DRAM[2] and 128 to 512 MB of external SDRAM. The maximum supported video resolution was 2048×1536 pixels.

Development history


The Glaze3D family was developed over several generations, beginning with the original Glaze3D "400", which used multi-channel RDRAM instead of internal eDRAM. This design was offered only as licensable IP and found no takers. Bitboys then revised the design and decided to have it manufactured themselves, in cooperation with Infineon Technologies, the chip fabrication arm of Siemens. The result was a new Glaze3D pitched for release in Q1 2000. The card promised extremely high performance compared to contemporary consumer GPUs. As bug-hunting, validation and manufacturing problems delayed the launch, new features became necessary, and a DirectX 7 variant with built-in hardware Transform & Lighting was announced but never appeared.

The GPU was later redesigned under a new codename, Axe, to take advantage of DirectX 8 and keep pace with the emerging competition. The new version added features such as an additional 3 MB of eDRAM, proprietary Matrix Antialiasing and a vastly improved fillrate, and it also offered a programmable vertex shader and a widened internal memory bus. The new card was to have been released as Avalanche3D by the end of 2001.

The third design, codenamed Hammer, entered development as Axe lost viability toward the end of 2001. This new card was to be a high-end DirectX 9 part, offering new features such as occlusion culling, improved rendering performance and various other innovations. This version, like the ones before it, never shipped commercially.

Bitboys then turned to mobile graphics and developed an accelerator that was licensed to, and probably used by, at least one flat panel display manufacturer, although it was intended and designed primarily for higher-end handhelds. ATI later bought Bitboys as an additional research and development unit, and after AMD acquired ATI, Bitboys was, as of 2008, owned by AMD. In 2009, the Bitboys operation was transferred to Qualcomm.

Specifications


Glaze3D chip

  • Fabricated by Infineon on a 0.2 μm eDRAM process[2]
  • Compatible with OpenGL and DirectX[3]
  • Quad-pixel pipeline at 150 MHz[4]
  • 4.5 million triangles per second[4]
  • 10 million triangles per second with multi-chip
  • 1.5 million logic gates[5]
  • 130 mm² die size[5]
  • 304-pin BGA[5]
  • Thor Geometry processor[6]
  • PCI or AGP 2×/4×
  • Fillrate (see the consistency check after this list)
    • 1.2 GigaTexel/s[2]
    • 4.8 GigaTexel/s with multi-chip[6]
    • 0.6 GigaTexel/s (dual-textured)[4]
    • 2.4 GigaTexel/s with multi-chip[6]
  • Memory
    • Embedded RAM
      • 9 MB Embedded framebuffer memory
        • 4 modules of 2.25 MB with 3 banks each[7]
      • 150 MHz[7]
      • 9.6 GB/s memory bandwidth[7]
      • 512-bit interface[7]
    • External RAM
      • Up to 128 MB
  • Texture cache
    • 16 KB for even mipmap levels and surface textures[8]
    • 8 KB for odd mipmap levels and lightmaps[8]
    • Two-way associative[8]
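
The headline figures in this list are mutually consistent. Assuming each of the four pixel pipelines can sample two texels per clock (a common dual-texturing arrangement of the period; the cited pages do not spell this out), both the texel fillrate and the embedded-memory bandwidth follow directly from the 150 MHz clock:

\[ 4\ \text{pipelines} \times 2\ \tfrac{\text{texels}}{\text{clock}} \times 150\ \text{MHz} = 1.2\ \text{Gtexels/s} \]
\[ \frac{512\ \text{bits}}{8\ \text{bits/byte}} \times 150\ \text{MHz} = 64\ \tfrac{\text{bytes}}{\text{clock}} \times 150\times10^{6}\ \tfrac{\text{clocks}}{\text{s}} = 9.6\ \text{GB/s} \]

Applying two textures to each pixel halves the effective rate, which matches the listed 0.6 G dual-textured figure.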

Performance claims

 
A publicity screenshot designed to highlight the realism that Glaze3D cards were supposed to achieve

The Glaze3D family was well known for the bold performance claims associated with it. The low-end 1200 model was purported to achieve a fillrate of 1.2 billion texels per second, with a geometry throughput of 15 million triangles per second. Most notably, the card was originally claimed to achieve over 200 frames per second in id Software's Quake III Arena at maximum visual quality.[9]

The 1200 model's claimed specifications would place it as the rough equivalent of the GeForce FX 5200 Ultra or Radeon 9200 Pro (very low-performance GPUs of 2002 vintage), while its claimed performance would place it at the same level as the GeForce 3 Ti 500 or Radeon 8500 (high-end GPUs of 2001). To compound matters, the cards' specifications were later updated to nearly double their original performance levels.[citation needed]

While the Glaze3D 1200 was supposed to achieve unheard-of performance in video games, the 2400 and 4800 models were each claimed to be substantially more powerful in turn. Using two- and four-GPU configurations respectively, with an additional geometry accelerator on the 4800, the higher-end Glaze3D cards were to be aimed at the very highest end of the video-gaming market.[9]
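
The multi-chip figures in the specification list scale linearly with the number of GPUs, and the model numbers appear simply to encode the combined texel fillrate in megatexels per second. This reading is an inference from the published numbers rather than something Bitboys stated explicitly:

\[ \text{Glaze3D 1200: } 1 \times 1.2\ \text{Gtexels/s} = 1200\ \text{Mtexels/s} \]
\[ \text{Glaze3D 2400: } 2 \times 1.2\ \text{Gtexels/s} = 2400\ \text{Mtexels/s} \]
\[ \text{Glaze3D 4800: } 4 \times 1.2\ \text{Gtexels/s} = 4800\ \text{Mtexels/s} \]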


References

  1. Petri Nordlund. "Glaze3D". Bitboys Oy.
  2. "Introduction". www.graphicshardware.org. Retrieved 2021-02-18.
  3. "Design goals". www.graphicshardware.org. Retrieved 2021-02-18.
  4. "Performance". www.graphicshardware.org. Retrieved 2021-02-18.
  5. "The Glaze3D™ chip". www.graphicshardware.org. Retrieved 2021-02-18.
  6. "Multichip configurations". www.graphicshardware.org. Retrieved 2021-02-18.
  7. "Embedded DRAM". www.graphicshardware.org. Retrieved 2021-02-18.
  8. "Performance". www.graphicshardware.org. Retrieved 2021-02-18.
  9. BitBoys Oy. "BITBOYS OY UNVEILS GLAZE3D PRODUCT FAMILY". Archived from the original on 2007-09-30. Retrieved 2006-06-11.