Talk:Interlaced video



Interline twitter explanation


The explanation is bogus and does not make sense; please correct it or insert a citation that explains it in the current way.

The defect explained here sounds like the TV strobing.

—Preceding unsigned comment added by 92.87.192.98 (talk) 16:57, 16 February 2009 (UTC)

Strobing? We are not talking about jerkiness here, and the image hardly represents what you call "strobing". Eugene2x-talk 23:15, 5 March 2009 (UTC)

Leon Theremin an inventor of television and the inventor of the 'interlaced' technique?


"The concept of breaking a single video frame into interlaced 'fields' was first demonstrated by Russian inventor Léon Theremin in 1927" (Albert Glinsky: Theramin, University of Illinois Press, 2000)

First of all, I do not doubt that Theremin invented his electronic instrument, the "Theremin", and other things in the 1920s. These are proven facts.

But with all respect, I very much doubt this whole Theremin / television inventor / interlaced inventor stuff. This sounds like one of those classic Russian/Soviet falsifications to me.

The whole story is just based on a single source, which is the book mentioned above - published in 2000. The other source would be Theremin's own few lines about this topic in the book "A.F. Joffe - Memories [my own translation], Academy of Sciences Press [again my own translation], Moscow, 1973", offering a fantastic story of how he invented television devices with a few lines up to 62 and even 120 lines (incorporating an interlaced 'lines' technique) within 2 or 3 (!) years (from 1924 or 1925 to 1927) as part of his academic thesis! Moreover, he was travelling a lot during that period to present his "Theremin", to negotiate licence issues for his "Theremin", etc. His alleged development of television of course had to be temporarily halted during his extensive travels in 1925/26. He also claims to have invented in 1927 a portable (!) camera or whole television system with 100 lines, which could operate outdoors and under daylight conditions! (Meaning without any additional light source - in 1927!!!)

Should these claims be true, Theremin would have been way ahead of all other television pioneers. I think - and 'early television buffs' would agree - that about one decade ahead wouldn't be exaggerated in this case. The story of early television would have to be rewritten.

The stupid thing, now, about all of these fantastic 'achievements' regarding television devices is that there is not a single piece of proof for even one of these claims. No photos, no patent files, no drawings, no working schemes, no technical descriptions, nothing detailed and nothing general, no presentations (he travelled a lot and had his own laboratory in the US during the early 1930s), no contemporary articles. Absolutely nothing. Zero.

However, Theremin himself claims that there had been an article in the magazine Ogonyok [my own translation] in the 1920s. But even if that were true, it wouldn't change anything about the non-existent proof: Theremin may indeed have written a (theoretical) thesis about television and may have done some research. But if one remembers the tons of propaganda which were put out, especially by the early Soviet Union, to show how 'progressive' and 'modern' the largely backward country was, it is not unlikely that Ogonyok somehow 'sexed up' its article a little bit.

There is some literature about Theremin from (communist) East Germany, a Soviet puppet state, which glorified everything Soviet/Russian. Theremin had been a big celebrity in East Germany due to his various achievements. But in none of this literature could I discover anything new about his alleged achievements in the field of television. It's always the same few statements, which I tried to give above in my own words.

So I have good reason to assume that this whole 'Theremin-television story' is nothing more than a huge fake!

I can only hope that it's only Russian chauvinists that spread such allegations via the 'University of Illinois Press' and all over corresponding Wikipedia articles.

I for my part will erase those unproven statements about Theremin and television.

Greets. —Preceding unsigned comment added by 84.157.69.85 (talk) 19:34, 5 August 2009 (UTC)

The mention of Theremin seems to be back, but I agree with this post from 4 years ago. Glinsky's book on Theremin is sensationalist more than anything, and I do not consider it a reliable source on the history of television technology. I cannot say anything about the Russian literature, but I am dealing quite comprehensively with the history of frame rates and have yet to encounter a credible mention of Theremin. In this respect, Schröter's work with interlaced phototelegraphy around 1928 is much more interesting and properly documented. I will remove the mention of Theremin if there are no objections. K37b8e4fd (talk) 01:32, 22 May 2013 (UTC)

Switching between interlaced and progressive scan


How can PAL, NTSC, VGA, SCART switch between progressive (like 240p) and interlaced scan? Is it just timing? Just a single sentence with a reference would suffice (and you know the correct section). It would add hard facts to the article and we could remove some of the advantage/disadvantage stuff. -- Arnero (talk) 17:00, 8 March 2010 (UTC)

Indeed, it is only timing, since you're skipping half of the image. Since the output is analog, it is much easier to interlace compared to digital outputs. 178.255.104.154 (talk) 19:12, 7 November 2023 (UTC)
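A minimal sketch of that point, for illustration only (the function names and 0-based line numbering are my own, not from any standard): the structural difference between a progressive frame and an interlaced frame is simply which lines are emitted in each vertical period.

 # Illustrative sketch in Python: a progressive frame carries every line in one
 # vertical period; an interlaced frame splits the same lines across two fields.
 def progressive_frame(total_lines):
     """All lines in a single vertical period: 0, 1, 2, ..."""
     return list(range(total_lines))
 
 def interlaced_fields(total_lines):
     """Two vertical periods: the even-numbered lines, then the odd-numbered ones."""
     field_1 = list(range(0, total_lines, 2))   # lines 0, 2, 4, ...
     field_2 = list(range(1, total_lines, 2))   # lines 1, 3, 5, ...
     return field_1, field_2
 
 print(progressive_frame(6))    # [0, 1, 2, 3, 4, 5]
 print(interlaced_fields(6))    # ([0, 2, 4], [1, 3, 5])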

Text in lede


I am here in response to a request posted on the NPOV noticeboard.

The question was about the sentence, "Interlace is a technique of improving the picture quality of a video signal without consuming extra bandwidth", which currently is the first sentence of the article. It is my opinion that this is not NPOV. It would be better to say something like, "Interlace is a technique of displaying video." Then subsequently, its quality could be compared to other video display techniques (without saying one is improved over the other) and its bandwidth use could likewise be compared. Thoughts? Blue Rasberry 17:06, 25 April 2010 (UTC)

I think that the original statement is uncontroversially true. Given a fixed analog bandwidth and no digital processing, an interlaced signal would always look better than a progressive one. That is why none of the analog standards are progressive. Once digital compression is involved, that changes the rules, because MPEG does a much better job of reducing bandwidth than interlace. Algr (talk) 17:28, 25 April 2010 (UTC)
I did not speak clearly so I think you misunderstand. The statement says "improved", but does not say improved with regard to what, and is therefore giving a subjective opinion rather than an objective comparison. What you just wrote sounds much better, if it is not controversial that in all circumstances an interlaced signal always looks better than a progressive one. I know absolutely nothing about this topic; I am just commenting on the POV of the grammar and word choice in the statement. Blue Rasberry 21:39, 25 April 2010 (UTC)
MPEG is also often interlaced. Digital encoding changes nothing. Today it is a difference of recorded media: movies are 25 fps progressive, TV is 60 fps broadcast as interlaced, either analog or digital. Carewolf (talk) 00:16, 28 April 2010 (UTC)
Digital encoding (and specifically compression) changes things radically, because with compression you can no longer assume that two images with the same pixel count and bit depth will have the same picture quality. A video DVD running at 1200 kbps will look much better than one running at 300 kbps, for example, even though they are both 480/60i. This is why you can't add interlace to 720p and produce 1440i. Algr (talk) 09:20, 5 May 2010 (UTC)

Interlacing was adopted as a means of increasing the number of scan lines of early TV systems, and thus the vertical resolution, while staying within a 6 MHz TV channel. The field rate of 60 Hz gave interlaced video an effective frame rate of 30 fps, exceeding the 24 fps frame rate of motion-picture film, and allowed TV equipment to synchronize to the AC power-line frequency of 60 Hz. Both persistence of vision (the phi phenomenon) and phosphor persistence reduce apparent flicker. This technology was developed by RCA in the 1930s in and around New York City (NBC) and RCA headquarters in Camden, New Jersey. Interlacing was a way of improving picture quality by increasing the number of scan lines without exceeding the allocated RF spectrum.
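As a quick check of the figures in the paragraph above, here is the arithmetic, assuming the original monochrome NTSC parameters (525 total lines, 60 Hz field rate, 2:1 interlace). This is only a back-of-the-envelope sketch, not taken from a standard:

 # Back-of-the-envelope check (assumed monochrome NTSC numbers).
 total_lines = 525        # total scan lines per frame
 field_rate_hz = 60       # fields per second, locked to the AC line frequency
 fields_per_frame = 2     # 2:1 interlace
 
 frame_rate = field_rate_hz / fields_per_frame      # 30 frames per second
 lines_per_field = total_lines / fields_per_frame   # 262.5 lines per field
 line_rate_hz = total_lines * frame_rate             # 15750 lines per second
 
 print(frame_rate, lines_per_field, line_rate_hz)    # 30.0 262.5 15750.0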

I think that the original statement is uncontroversially true...

Can it then be backed up by a source/citation? --Xerces8 (talk) 13:15, 9 May 2010 (UTC)

User:Algr said some interesting things on the WP:NPOVN. Here is a quote from it which I feel summarizes the content: "But video signals can be designed in a wide variety of ways, so there is no obvious choice as to which two signals to compare." Read the rest here.
The sentence in question is still "Interlace is a technique of improving the picture quality of a video signal without consuming extra bandwidth." I really know nothing about this topic, but if I understand Algr, there was not a point when video was widely used without interlace, and thus there are no established or popular terms for non-interlace video. Could we go another route with this and instead of comparing it to something else, describe the cost of the improvement?
There must be some drawback, otherwise interlaced video could be further interlaced indefinitely. I think qualifying the improvement would be another way of stating a comparison. The picture improves up to a certain point, then some attribute becomes compromised, and so interlacing is only a technique for improving video with less interlacing than that point, right?
And we are only talking about the human perception of the presentation of a video signal, right? The video is actually being made a lot worse for preservation of data within any presented frame, right?
Is it the opinion of anyone here that this line of questioning is meaningful? I do not want to debate the facts, but rather I want to go toward a statement which is less open to interpretation and unlikely to be further questioned on Wikipedia. Blue Rasberry 06:06, 25 May 2010 (UTC)

If you have a problem with the word "improving", say "doubling the apparent vertical resolution". — Preceding unsigned comment added by 172.91.176.10 (talk) 05:05, 3 September 2018 (UTC)

Requested move

The following discussion is an archived discussion of the proposal. Please do not modify it. Subsequent comments should be made in a new section on the talk page. No further edits should be made to this section.

Page moved to Interlaced video. Vegaswikian (talk) 19:33, 14 November 2010 (UTC)

Interlace → Interlace (video) – Relisted. There may be a consensus forming for a different name in the discussion, but if so we need some clear indications of support. Vegaswikian (talk) 22:56, 7 November 2010 (UTC) Especially now the video technique is gradually falling from use, it does not meet the requirement of WP:PRIMARY, & the plain term should go straight to disambiguation. Johnbod (talk) 20:25, 31 October 2010 (UTC)

These "art" searches are meaningless, as a look at the results shows - you get many things like "...interlace. Art..." just in these few results. Neither of the search terms you used exist as terms - there is no "interlace art" though the words do occur together occasionally, same for the other. Removing the "" gives 'interlace' plus 'art' 32,600 and "interlaced" + "art" 77,900, whereas "interlace" + "video" only goes to 33k odd. So I think these searches are proving my case. While the video article has the plain term the number of hits it gets is also meaningless - who is to say what people were looking for? There are also other meanings we don't have articles for, plus a rash of business names etc etc. But your suggestion of moving this to "Interlaced video", so long as plain "interlace" then goes to disam, would be an excellent solution. Can we agree to do this? Johnbod (talk) 16:55, 1 November 2010 (UTC)Reply
Erf, edit conflict lost my reply. Anyway, what I basically said was that yes, the Google Book searches don't work exactly. I used quotation marks specifically to show that the two subjects did have a relationship to each other; I am aware that it is not perfect and that particularly for 'interlace art' this is flawed. Not using them, though, just gives ambiguous results, as the terms can be used anywhere in the book and don't necessarily have to have any relationship other than that they exist within the same text. Nevertheless I support a new move proposal with Interlace being moved to 'Interlaced video'. I suggest you withdraw this one and propose a new request. I also think that 'Interlace (art)' could be moved to Interlace, as 'interlace (bitmaps)' is actually titled Interlacing (bitmaps), which doesn't specifically share the same name (it can still be on the dab however). In terms of the video and graphics usage, 'interlace' is the action (actually more the name of the button you press), not the product that results from the action. In the art term, however, interlacing is the action and interlace is the element or outcome. Just a thought. -France3470 (talk)
In retrospect perhaps Interlace (art) shouldn't appear at Interlace. I don't feel fully committed to the idea of it becoming a dab, but perhaps after all it would be alright if the video term were no longer titled as Interlace. Not sure. Also I think 'interlacing video' is also an appropriate title (this is what the article is actually about, and throughout the article the term interlacing is used). A new proposal would be best for either: a move to interlacing video, interlaced video, or even interlacing (video) (to agree with the bitmaps naming), though this last might be odd. There is also the option of a move to Interlacing, though this just leaves us with the same problem as we have now. Personally I think interlacing video or interlaced video is best. -France3470 (talk) 17:41, 1 November 2010 (UTC)
OK, shall we leave it here for a few days to see if others have a preference between "interlacing video, interlaced video or even interlacing (video)"? I think I prefer your first thought, "interlaced video", but I don't have strong views. One of these options is the way to go anyway. Johnbod (talk) 17:56, 1 November 2010 (UTC)
Sounds good. Just a final thought (at least for today). We also have a page titled deinterlacing, so in theory perhaps this article should actually be titled interlacing to match. Also, not all of this article appears to be about video specifically - see the section on 'Interlace and computers' - which perhaps makes the inclusion of video in the title debatable. -France3470 (talk) 18:02, 1 November 2010 (UTC)
"Interlaced signal" perhaps? That seems to cover everything. Does this one overlap with raster one? I'm not technical enough to tell. Johnbod (talk) 20:06, 1 November 2010 (UTC)Reply
No, I don't think so. "interlaced signal" is not the technique, only a transmission of an application of the technique. Did anyone ask WPTV, WPELECTRONICS, or WPCOMPUTING? 76.66.203.138 (talk) 05:17, 6 November 2010 (UTC)
No, nor the other projects involved. There is rarely a response in my experience. "Interlacing" reduces but does not entirely remove the ambiguity re art; I don't know about the raster one. We seem to be closing in on "interlaced video". Johnbod (talk) 11:08, 6 November 2010 (UTC)
The above discussion is preserved as an archive of the proposal. Please do not modify it. Subsequent comments should be made in a new section on this talk page. No further edits should be made to this section.

Discretion in adding to projects


Adding tags to talk pages is fun and easy. However, not every article belongs in every project. Adding a very low-relevance project to an article's talk page will have no effect in improving the article and will just annoy people concerned about the project's backlog. This article is pretty important to the subject of "television", but just because they watch interlaced broadcast video in Paraguay doesn't mean it should be added to the Paraguay project. --Wtshymanski (talk) 15:04, 28 February 2011 (UTC)

Theremin and interlace


From the description, he developed a scanning television method. But where does interlace come into it? Did his mirror drum already produce interlaced pictures? -- megA (talk) 12:28, 23 June 2011 (UTC)

OLED and the future of interlacing


Just wondering about the future of interlacing. Initially, interlacing was designed for fast-refresh-rate CRT use, but CRTs are no longer produced, so interlacing could disappear altogether.

Does anybody know if OLED refresh rates are up to the job? I don't understand why they should stop using interlacing, along with all the other digital compression mechanisms to use less bandwidth, if future displays could have refresh rates up to the job. I read that OLED could be up to the task with little quality loss even compared to a CRT. Should somebody mention this in the future of interlacing? — Preceding unsigned comment added by 86.27.131.165 (talk) 10:24, 14 October 2014 (UTC)

Doubling perceived vertical resolution vs. doubling perceived frame rate


I think the first line should be corrected to: «Interlaced video is a technique for doubling the perceived vertical resolution of a video display without consuming extra bandwidth». Senbei64 (talk) 11:35, 24 September 2015 (UTC)

That is only one thing it can do. It can also double the frame rate at the expense of vertical resolution without consuming extra bandwidth. Carewolf (talk) 14:04, 24 September 2015 (UTC)
The problem is that you can't really say how interlace improves a picture unless you specifically define which interlaced and progressive standards you are comparing. For example, if you start with VGA 480/60p and "add interlace" you could get:
480/60i - Bandwidth requirement cut in half, small reduction in quality. Basically NTSC.
960/60i - Quality almost doubled, more complex anti-aliasing needed. Bandwidth unchanged.
480/120i - Picture quality and bandwidth unchanged, large-area flicker improved. (60 Hz CRT refresh was too slow for some computer displays.)
All of that is for uncompressed video, right? I challenge you to show the benefits (e.g. half "bandwidth" from VGA 480/60p -> VGA 480/60i) with compressed video. I think that interlacing has exactly no benefits and is an outdated technology that only introduces headaches. It certainly does not give you free benefits. Even if going from 60p to 60i reduces bandwidth, the reduction in quality is not small. 80.108.8.19 (talk) —Preceding undated comment added 01:29, 16 November 2016 (UTC)
Each of these can equally be called "VGA with Interlace", but the benefits and losses of the first two almost directly contradict each other. Algr (talk) 09:33, 28 January 2016 (UTC)
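For what it's worth, the bandwidth claims for the three variants listed above can be checked with a rough calculation. This is only a sketch under a crude assumption: "bandwidth" is approximated as scan lines transmitted per second, ignoring blanking intervals, horizontal resolution and compression.

 # Rough comparison of the three "VGA plus interlace" variants above,
 # counting only scan lines transmitted per second (a simplifying assumption).
 def lines_per_second(total_lines, vertical_rate_hz, interlaced):
     lines_per_vertical_period = total_lines / 2 if interlaced else total_lines
     return lines_per_vertical_period * vertical_rate_hz
 
 base = lines_per_second(480, 60, interlaced=False)   # 480/60p reference
 
 for name, total, rate, inter in [
     ("480/60i", 480, 60, True),     # half the line rate of 480/60p
     ("960/60i", 960, 60, True),     # same line rate, twice the lines per frame
     ("480/120i", 480, 120, True),   # same line rate, double the refresh rate
 ]:
     ratio = lines_per_second(total, rate, inter) / base
     print(name, "needs", ratio, "x the bandwidth of 480/60p")
     # prints 0.5 for 480/60i, 1.0 for 960/60i, 1.0 for 480/120i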

When the HDTV standards were first established in the U.S. in the 1990s, broadcasters had a choice between vertical resolutions of 1080 interlaced or 720 progressive. Digital compression was in its infancy and MPEG-2 was chosen as the compression standard (and remains so as of this writing). In the U.S. there was (and still is) a hard limit of 6 MHz of RF spectrum for each TV channel, which will carry 19.39 Mbps of digital data. Interlace was preserved as a legacy of analog broadcasting. These resolutions were initially offered as a way of delivering 16:9 high-definition pictures.

Twenty years later (as of this writing), digital compression has come a long way, with broadcasters using a portion of their 19.39 Mbps for their main programming service, along with digital subchannels often carrying vintage standard-definition programs and movies. — Preceding unsigned comment added by 172.91.176.10 (talk) 07:54, 18 June 2019 (UTC)
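To illustrate the trade-off being described, here is a toy calculation of how a fixed 19.39 Mbit/s ATSC multiplex might be divided between a main service and subchannels. The individual allocations below are invented for the sake of the arithmetic, not figures from any actual broadcaster.

 # Toy example only: splitting the fixed ATSC payload among services.
 # The per-service bit rates are assumptions, not published figures.
 ATSC_PAYLOAD_MBPS = 19.39
 
 assumed_services = {
     "main 1080i HD service": 12.0,
     "SD subchannel 1": 3.5,
     "SD subchannel 2": 2.5,
 }
 
 used = sum(assumed_services.values())
 print("used:", round(used, 2), "Mbit/s;",
       "left over:", round(ATSC_PAYLOAD_MBPS - used, 2), "Mbit/s")
 # used: 18.0 Mbit/s; left over: 1.39 Mbit/s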

External links modified

Hello fellow Wikipedians,

I have just modified 3 external links on Interlaced video. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 20:04, 14 November 2017 (UTC)

Theremin


Theremin had nothing to do with interlacing. The source used in the article for the claim that Theremin invented interlacing can be found as a Google Book here: [1], and it makes no mention of Theremin having anything to do with interlacing at all, much less having invented it. --2003:71:4E16:4B83:4CBE:CCA1:12C7:4742 (talk) 04:40, 14 January 2018 (UTC)

Progressive vs Interlaced Frame Rate Confusion


The Description Section of this article includes the following text:

"Format identifiers like 576i 50 and 720p 50 specify the frame rate for progressive scan formats, but for interlaced formats they typically specify the field rate (which is twice the frame rate). This can lead to confusion, because industry-standard SMPTE timecode formats always deal with frame rate, not field rate. To avoid confusion, SMPTE and EBU always use frame rate to specify interlaced formats, e.g., 480i 60 is 480i/30, 576i 50 is 576i/25, and 1080i 50 is 1080i/25. This convention assumes that one complete frame in an interlaced signal consists of two fields in sequence."

This text contains several issues: 1. "576i 50 and 720p 50 specify the frame rate for progressive scan formats": 576i50 would be interlaced, not progressive; 2. "To avoid confusion, SMPTE and EBU always use frame rate to specify interlaced formats": I believe this is untrue. I believe this statement is true with respect to the EBU; however, if you refer to the SMPTE UHD-SDI Standards Roadmap (see: https://www.smpte.org/sites/default/files/images/SMPTE%20wallchart%232.6_20_17-JULY%202017.pdf), you can see that the SMPTE nomenclature 1080i50 and 1080i60 actually refers to 25 and 30 frames per second respectively.

It is my understanding that a lot of confusion arises due to the fact that the EBU uses frame rates while SMPTE uses field rates for interlaced designations, whereas both the EBU and SMPTE use frame rates for progressive designations.

Scj242 (talk) 07:27, 22 October 2018 (UTC)

Edit: Further supporting references can be found in the various SMPTE standards documents, such as SMPTE 274M, SMPTE 296M (e.g. 274M, Table 1, System 6, "1920 × 1080/50/2:1, 25 [Frame rate] 2:1 interlace").

Scj242 (talk) 20:01, 22 October 2018 (UTC)
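To make the two readings of the same identifier explicit, here is a small illustrative helper (the function and its parsing are mine, not SMPTE or EBU code): it interprets the trailing number of an interlaced format identifier either as a field rate (e.g. SMPTE-style 1080i50) or as a frame rate (e.g. EBU-style 1080i/25) and reports both.

 # Illustrative only: interpret the number in an interlaced format identifier
 # either as a field rate or as a frame rate (naming and parsing are assumptions).
 def interpret(identifier, number_is_field_rate):
     lines, rate = identifier.lower().split("i")
     rate = float(rate)
     if number_is_field_rate:
         field_rate, frame_rate = rate, rate / 2
     else:
         field_rate, frame_rate = rate * 2, rate
     return "%si: %g fields/s = %g frames/s" % (lines, field_rate, frame_rate)
 
 print(interpret("1080i50", number_is_field_rate=True))    # 1080i: 50 fields/s = 25 frames/s
 print(interpret("1080i25", number_is_field_rate=False))   # 1080i: 50 fields/s = 25 frames/s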

Please correct terminology


Progressive is a relatively new term for non-interlacing, yet it is used here to the extent that the article says "progressive scan" was reintroduced in the 1970s. This is incorrect. Non-interlacing was reintroduced. It wasn't called progressive scan until the first HD televisions. Unless I'm seriously mistaken, monitors were previously advertised as non-interlaced, not progressive. Because of this, I wondered for years when TVs would go non-interlaced. Thetrellan (talk) 17:59, 6 October 2019 (UTC)

IBM 8514 not mentioned


The IBM 8514 should be mentioned as one of the few interlaced computer display standards. What do you think? --RokerHRO (talk) 08:32, 17 October 2020 (UTC)

Interlaced displays were pretty common in the 80s, with home computers just using TVs as screens. Carewolf (talk) 10:49, 17 October 2020 (UTC)
Sure. But there is a separate section for "Interlace and computers", where certain interlaced video modes for computer screens (not TV screens) are also mentioned. But the 8514 is missing there, even though it was – at least in clones – more common than others. :-/ --RokerHRO (talk) 19:42, 19 October 2020 (UTC)

On Flicker


"This enhances motion perception to the viewer, and reduces flicker by taking advantage of the characteristics of the human visual system."

But compared to what? Is it not true that interlaced video often has a lot of flicker? This is what I'm not understanding. If you were, for example, to use a CRT TV that was capable of displaying both 480i and 480p, it would almost certainly be clear to the eye that the 480p picture looks much more stable and less distracting, because the TV wouldn't have to draw the even and odd lines to interlace the video, which is usually very visible. 2601:147:4B80:8A70:E8B0:9DF3:1C60:2D97 (talk) 17:44, 21 June 2024 (UTC)

The way English works is that in the sentence above, "this" stands in for "interlace", and the comparison is implicitly to non-interlaced systems. The weak phrase "characteristics of the human visual system" is an attempt to avoid saying "persistence of vision", which is Wikiwrong, and to avoid a link to some very technical term which isn't clearly cited as the explanation in the descriptions of interlace that I've read. When comparing flicker levels, be careful of the frame rate. "Is it not true that" is always a clue that what follows needs an authoritative citation before we can put it in the article. Why would the designers of NTSC/PAL/SECAM have made an interlaced system if it made flicker worse? Happy Editing! --Wtshymanski (talk) 00:52, 23 June 2024 (UTC)