This article is rated Stub-class on Wikipedia's content assessment scale.
Untitled
edit"the importance of the fillrate as a measurement of performance has declined as the bottleneck in graphics applications has shifted."
This is not true.
Shaders use up a lot of fillrate, especially when they read or write many texture pixels, and a lot of shaders sample multiple maps.
Transparency in general uses up fillrate, since the pixels behind a transparent texture still have to be drawn, so the same screen pixel gets written more than once.
This is extremely critical with particle systems, say 30 layers of dust clouds, or in combination with shaders (a refracting force field or water, for example).
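As a rough illustration (my own back-of-the-envelope numbers, not measurements), the fill cost of stacked transparent layers is just resolution times overdraw times frame rate:

 # Rough worst-case estimate: assumes the dust cloud covers the whole screen.
 width, height = 1280, 1024   # screen resolution
 layers = 30                  # overlapping transparent dust textures
 fps = 60                     # target frame rate
 pixels_per_second = width * height * layers * fps
 print(pixels_per_second / 1e9, "gigapixels/s")  # about 2.36 GP/s for the cloud alone

And that is before the rest of the scene, any shader work, or anti-aliasing is counted.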
Texture sizes are still increasing, using up even more fillrate (if the resolution is high enough to display them).
Additionally, most people will want to run their game at high resolutions like 1280x1024 or 1600x1200, which eats even more fillrate.
Anisotropic Filtering and Anti-Aliasing might use up a lot of fillrate too, depending on implementation.
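To make the multiplication concrete (the overdraw factor and the naive 4x supersampling below are my own worst-case assumptions; real MSAA is usually much cheaper):

 # Illustrative only: resolution, overdraw and AA multiply together.
 width, height = 1600, 1200
 fps = 60
 overdraw = 4        # assumed average number of writes per screen pixel
 aa_samples = 4      # worst case: naive 4x supersampling
 fill = width * height * fps * overdraw * aa_samples
 print(fill / 1e9, "gigapixels/s")  # about 1.84 GP/s before particles and shaders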
However, I agree that nowadays modern high-end cards have enough fillrate to get along nicely in most situations, but it's still easily possible to bring in-game performance to its knees despite a fillrate of 10 GP/s.
A huge problem is the lack of developer education.
I've seen a lot of games that stack so many alpha-channel textures (as in 30 dust textures constituting a small cloud) that performance suffers horribly.
Such problems are easily avoided, yet that is rarely done.
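To be concrete about "easily avoided" (my own example, not necessarily what any particular studio does): approximating the same cloud with six larger, more opaque sprites instead of thirty cuts that overdraw cost by a factor of five, from roughly 2.4 GP/s to roughly 0.5 GP/s in the 1280x1024 / 60 fps example above.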
I will say that anything below 8000 megapixels/s of fillrate (budget and midrange cards) ends up fillrate-bottlenecked, simply because a lot of developers are that ignorant.