Talk:Micro stuttering


Frame-metering


Good article, but it's still missing a lot of information, including frame metering. NVIDIA apparently has a new hardware-based frame-metering chip on the GTX 690, and presumably 670/680 SLI setups use software frame metering to smooth the output of frames and reduce micro-stutter. — Preceding unsigned comment added by 96.232.200.150 (talk) 00:43, 1 August 2012 (UTC)

This is the first time I've heard about it. If you have a verifiable source, or any other source for that matter, I'm happy to look into it. --JayC (talk) 06:24, 1 August 2012 (UTC)

A quote from NVIDIA's page: "The GTX 690 uses hardware based frame metering for smooth, consistent frame rates across both GPU's." Link here (sorry, I don't know how to embed links): http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-690/ I also found some benchmarks showing a slight improvement in the 690's micro-stutter over the 670/680, but overall the 6xx cards are all very good. I can't find any information on frame metering for the 670/680, but an EVGA rep had posted on their forums that they do support frame metering (presumably in software). 96.232.200.150 (talk) 08:22, 1 August 2012 (UTC)

Also, here's the link to the frame-metering and micro-stuttering benchmarks comparing the 690 with the 680 and some AMD cards, but it's in German: http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-690/8/ 96.232.200.150 (talk) 08:24, 1 August 2012 (UTC)

I also found another good quote in the 690's keynote article: "Kepler introduces hardware based frame rate metering, a technology that helps to minimize stuttering. In SLI mode, two GPUs share the workload by operating on successive frames; one GPU works on the current frame while the other GPU works on the next frame. But because the workload of each frame is different, the two GPUs will complete their frames at different times. Sending the frames to the monitor at varying intervals can result in perceived stuttering. The GeForce GTX 690 features a metering mechanism (similar to a traffic meter for a freeway entrance) to regulate the flow of frames. By monitoring and smoothing out any disparities in how frames are issued to the monitor, frame rates feel smoother and more consistent" (http://www.geforce.com/whats-new/articles/article-keynote/). I believe this relates to the 690 only; still not sure about the 670/680. Of course, this all has yet to be verified by outside sources, I suppose, but the benchmarks seem to prove it. 96.232.200.150 (talk) 08:46, 1 August 2012 (UTC)
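For readers unfamiliar with the idea, here is a minimal sketch (in Python, with invented frame times and a made-up smoothing window) of the "traffic meter" behaviour described in that quote: frames that finish early are held back so that frame-to-frame intervals track a short moving average of how fast frames are actually completing. It is only an illustration of the concept, not NVIDIA's actual frame-metering logic, which is in hardware and not publicly documented.

    # Illustration only: invented completion times and smoothing window.
    def meter_frames(completion_times, window=4):
        """Delay early frames so presentation intervals track a moving average."""
        presented = [completion_times[0]]
        for i, t in enumerate(completion_times[1:], start=1):
            lo = max(0, i - window)
            # Average of the last few raw completion intervals.
            recent = [completion_times[j + 1] - completion_times[j]
                      for j in range(lo, i)]
            target = sum(recent) / len(recent)
            # Never present a frame before it is ready, and never earlier
            # than one smoothed interval after the previous presented frame.
            presented.append(max(t, presented[-1] + target))
        return presented

    # Two GPUs finishing alternate frames at uneven times (milliseconds):
    raw = [0, 9, 25, 34, 50, 59, 75, 84]
    print(meter_frames(raw))   # intervals settle near the 12.5 ms average

The point of the max() is the same trade-off the quote hints at: smoothing can only delay fast frames, it cannot speed up slow ones.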

Notability


The descriptions of "micro stutter" in the references are consistent with a deterministic jitter phenomenon as described by people not familiar with signal processing. Given that use of the term is apparently exclusive to computer game enthusiasts, it may make sense to merge its description into a section of the Jitter article and redirect there.

Triskelios (talk) 21:31, 15 February 2013 (UTC)

No. First of all, I think this article is crap, and probably should just become a section in Frame rate. However, it would be gravely misplaced in Jitter. Jitter is "undesired deviation from periodicity", while video game framerates in complex, modern games are jittery by necessity. "Micro stutter" has nothing to do with periodicity. It's possible to describe the issue as "framerate jitter", but it is really about how the "frames per second" metric in video games is not representative of the usability associated with higher framerates (the sensation of motion, reduced input/output latency). More specifically, it is a problem with how frame rate has traditionally been measured.

A measured output of "100 fps" can be perceived as "10 fps" if every 10th frame for some reason takes an extra 100 ms while all the other frames are rendered in the sub-millisecond range. A naive system that simply averages the number of frames over a period will report "100 fps", but knowing how jittery the framerates of modern game engines are, more advanced systems now produce output like "highest 10% latencies average: 100 ms", which is a metric that's a lot more relevant to the user, if less familiar, and it gives developers and hardware manufacturers more opportunity to optimize their designs for maximum usability.
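To make that concrete, here is a small sketch (Python, with invented frame times chosen to match the numbers above) showing how the plain average reports a healthy frame rate while a "highest 10%" frame-time metric exposes the spikes:

    # Invented frame times matching the example above: nine sub-millisecond
    # frames followed by one ~100 ms frame, repeated ten times.
    frame_times_ms = ([0.1] * 9 + [100.0]) * 10

    # Naive metric: average frames per second over the whole run.
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

    # More useful metric: average of the highest 10% of frame times.
    worst_decile = sorted(frame_times_ms)[-len(frame_times_ms) // 10:]
    p90_avg_ms = sum(worst_decile) / len(worst_decile)

    print("average: %.0f fps" % avg_fps)                      # ~99 fps, looks smooth
    print("highest 10%% frame times: %.0f ms" % p90_avg_ms)   # 100 ms, i.e. visible
                                                              # motion updates ~10x/sec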

The causes of micro stutter are typically in the game's engine, which may do something like "update AI pathfinding every tenth frame", which can occasionally take an excessive amount of time because of an algorithm with a high worst-case CPU cost, or may introduce a perceptible delay each time a new texture has to be loaded to the graphics card because of a buggy texture unpacking algorithm. Unfortunately, the article seems to focus on the multi-GPU scenario, which, I find, is to its detriment. 77.58.229.3 (talk) 16:20, 2 July 2013 (UTC)
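As a toy illustration of that kind of cause (invented workload, not taken from any real engine), a loop that runs a heavy periodic task every tenth frame produces exactly this pattern of isolated frame-time spikes:

    import time

    def light_work():
        # Stand-in for normal per-frame work.
        sum(range(10_000))

    def heavy_periodic_work():
        # Stand-in for a worst-case pathfinding update or texture unpack.
        sum(range(5_000_000))

    frame_ms = []
    for frame in range(50):
        start = time.perf_counter()
        light_work()
        if frame % 10 == 0:
            heavy_periodic_work()   # the "every tenth frame" task
        frame_ms.append((time.perf_counter() - start) * 1000)

    print("average frame time: %.2f ms" % (sum(frame_ms) / len(frame_ms)))
    print("worst frame time:   %.2f ms" % max(frame_ms))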

The "microstutter" colloquialism is indeed simply "framerate jitter" (and no, it's not deterministic). It's as simple as that. Don't be confused because the hardware reviewers are slow to pick up on their mistakes. Any perceptible jitter is just that: the jitter was bad enough that it becomes perceptable. Any object moving across physical space will be perceived to stutter once its velocity varies enough. Edit: by framerate in my post I am talking about the duration of each frame, not an average.

Stutter (disambiguation) has a link to Screen tearing, which isn't quite right; the link is labeled Stutter (display), but there is actually no such article. Perhaps we can move this article to Stutter (display) or Frame-rate jitter and proceed from there. ~Kvng (talk) 13:57, 18 August 2021 (UTC)