Flicker-free is a term given to video displays, primarily cathode ray tubes, operating at a high refresh rate to reduce or eliminate the perception of screen flicker. For televisions, this involves operating at a 100 or 120 Hz field rate to eliminate flicker, compared to standard televisions that operate at 50 Hz (PAL, SECAM systems) or 60 Hz (NTSC), most simply done by displaying each field twice, rather than once. For computer displays, this is usually a refresh rate of 70–90 Hz, sometimes 100 Hz or higher. This should not be confused with motion interpolation, though they may be combined – see Implementation, below.
Televisions operating at these frequencies are often labelled as being 100 or 120 Hz without using the words flicker-free in the description.
Prevalence
The term is primarily used for CRTs, especially televisions in 50 Hz countries (PAL or SECAM) and computer monitors from the 1990s and early 2000s – the 50 Hz rate of PAL/SECAM video (compared with 60 Hz for NTSC) and the relatively large computer monitors sitting close to the viewer's peripheral vision make flicker most noticeable on these devices.
Contrary to popular belief, modern LCD monitors are not flicker-free, since most of them use pulse-width modulation (PWM) for brightness control. As the brightness setting is lowered, the flicker becomes more noticeable, because the period when the backlight is lit in each PWM duty cycle shortens. The problem is much more pronounced on modern LED-backlit monitors, because LED backlights react faster to changes in current.
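A minimal sketch of the relationship described above, assuming a backlight dimmed by PWM. The 240 Hz PWM frequency and the function name are illustrative assumptions, not taken from any particular monitor: at a fixed PWM frequency, lowering the brightness setting shortens the lit portion of each duty cycle, widening the dark gap between pulses.

```python
# Sketch: PWM dimming shortens the backlight's lit time in each cycle.
# The 240 Hz PWM frequency is purely illustrative; real monitors vary.

def pwm_on_time_us(pwm_freq_hz: float, brightness_pct: float) -> float:
    """Backlight on-time per PWM cycle, in microseconds."""
    period_us = 1_000_000 / pwm_freq_hz          # length of one duty cycle
    return period_us * (brightness_pct / 100.0)  # fraction of cycle spent lit

for brightness in (100, 50, 20):
    on_us = pwm_on_time_us(240, brightness)
    print(f"{brightness:3d}% brightness: lit {on_us:6.1f} us of each cycle")
```

At 240 Hz each cycle lasts about 4167 µs, so dropping from 100% to 20% brightness leaves the backlight lit for only about 833 µs per cycle, which is why flicker grows more perceptible at low brightness settings.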
Implementation
The goal is to display images sufficiently frequently to exceed the human flicker fusion threshold, and hence create the impression of a constant (non-flickering) source.
In computer displays this consists of changing the frame rate of the signal produced by the video card (and, in sync with this, the image shown on the display). This is limited by the clock speed of the video adapter and the frame rate required of the program: for a given pixel clock speed, higher refresh rates require lower resolution or color depth, and higher frame rates require that the program producing the video recalculate the screen more often. For these reasons, refresh rates above 90–100 Hz are uncommon on computers, as these rates are sufficient to eliminate flicker.
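To make the pixel-clock trade-off concrete, a rough sketch under a simplifying assumption: the required pixel clock is approximately width × height × refresh rate. Real CRT timings add horizontal and vertical blanking intervals (roughly 25–30% overhead), so actual clocks run higher than these figures.

```python
# Rough pixel-clock estimate: clock ~ width * height * refresh rate.
# Blanking intervals are ignored here, so real required clocks are higher.

def approx_pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    return width * height * refresh_hz / 1e6

for width, height, refresh in ((1024, 768, 60), (1024, 768, 85),
                               (1280, 1024, 85)):
    mhz = approx_pixel_clock_mhz(width, height, refresh)
    print(f"{width}x{height} @ {refresh} Hz ~ {mhz:6.1f} MHz")
```

Raising 1024×768 from 60 Hz to 85 Hz pushes the estimate from about 47 MHz to about 67 MHz, showing why a fixed pixel clock forces a choice between resolution and refresh rate.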
On television, this is more involved, as the source material has a fixed frame rate (and is also traditionally interlaced video, in which one half of the scan lines of each frame are broadcast at a time). Most simply, the frame rate can be doubled by displaying the same broadcast image twice in rapid succession, as is done with movie projectors (which show each frame of 24 fps film two or more times) – either displaying each field twice or alternating fields.
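A sketch of this simplest approach, assuming a stream of decoded 50 Hz fields; the function and field labels are hypothetical. Each incoming field is simply emitted twice, yielding a 100 Hz display rate without creating any new image content.

```python
# Field doubling: show each incoming 50 Hz field twice for a 100 Hz display.
# 'fields' stands in for decoded broadcast fields; no new images are created.

def double_fields(fields):
    """Yield each field twice, doubling the display rate."""
    for field in fields:
        yield field  # first showing
        yield field  # repeat of the same field

fields = ["A-odd", "A-even", "B-odd", "B-even"]  # two interlaced frames
print(list(double_fields(fields)))
# ['A-odd', 'A-odd', 'A-even', 'A-even', 'B-odd', 'B-odd', 'B-even', 'B-even']
```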
Alternatively, this can involve motion interpolation, where, rather than displaying the original fields twice, the set creates interpolated images between the original frames. This may be combined with deinterlacing, converting the image to progressive scan (attempting to create a full picture from the two half images).
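A toy sketch of the idea, assuming frames are grayscale arrays; the naive half-way blend below is an assumption for illustration only, as real televisions estimate motion vectors rather than averaging pixels. It is meant only to show where the new in-between images come from.

```python
# Toy motion interpolation: insert a blended in-between image between each
# pair of frames. Real sets use motion-vector estimation, not blending.
import numpy as np

def interpolate_frames(frames: list) -> list:
    out = [frames[0]]
    for prev, nxt in zip(frames, frames[1:]):
        midpoint = (prev.astype(np.float32) + nxt) / 2  # naive half-way blend
        out.append(midpoint.astype(prev.dtype))         # synthesized frame
        out.append(nxt)
    return out

a = np.zeros((2, 2), dtype=np.uint8)       # dark frame
b = np.full((2, 2), 200, dtype=np.uint8)   # bright frame
doubled = interpolate_frames([a, b])       # 2 frames in -> 3 frames out
print(doubled[1])                          # the synthesized middle frame
```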
Higher refresh rates, while they reduce flicker, may cause other problems. Simply redisplaying the fields may cause judder if the display rate is not a whole-number multiple of the source frame rate (e.g. 24 fps × 2 = 48 Hz is, but 24 fps shown at 60 Hz is not), particularly on fast-moving images, as the image is held in the same location for unequal periods rather than moving smoothly. Conversely, interpolation (which avoids judder and may create more fluid motion than in the original video) can instead cause blurring, particularly visible on fast scrolling text. See three-two pull down and motion interpolation for further discussion.
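A small sketch of why non-integer rate conversion judders, using the standard 3:2 pulldown pattern; the function name and frame labels are illustrative. Repeating 24 fps film frames alternately three and two times reaches 60 Hz, but each source frame is held for an unequal time, so motion advances unevenly.

```python
# 3:2 pulldown: 24 fps film -> 60 Hz display by alternately repeating
# frames 3 times and 2 times. The unequal hold times cause judder.

def three_two_pulldown(frames):
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

film = ["F1", "F2", "F3", "F4"]  # 4 film frames = 1/6 second at 24 fps
print(three_two_pulldown(film))
# ['F1', 'F1', 'F1', 'F2', 'F2', 'F3', 'F3', 'F3', 'F4', 'F4']
# 10 display periods at 60 Hz = 1/6 s, but F1 is held 50% longer than F2.
```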
Notes
- Pulse-width modulation – Effects of PWM used in LCD brightness control