Talk:MIDI/archive 1


too many directions for this article

Please provide opinions on splitting this article up. Information about MIDI file formats does not share the same purpose or scope as MIDI as a communications protocol.

I think we need individual articles for:
- MIDI Overview and History (this article)
- MIDI Specifications (Electrical, Protocol) (an article already exists for electrical)
- General MIDI (GM) and MIDI File Formats
- Any other MIDI subsets I haven't thought of here
Infindebula 14:53, 11 January 2007 (UTC)

Musical Instrument Digital Interface interface?

This article refers at least once to the "MIDI interface", but surely that's not correct, just like saying Mount Fujiyama. It's either Mount Fuji or Fujiyama, not Mount Fujiyama; likewise, not "MIDI interface". --Sauronjim (talk) 12:46, 4 February 2009 (UTC)

This is a very common type of "misnomer", and is too useful and generally accepted to be considered objectionable. To insist that one discuss the "MID interface" or that "MIDI" never be followed by "interface" would be disconcertingly pedantic, esoteric, and fruitless. Unfree (talk) 05:22, 31 August 2009 (UTC)
  • I disagree, "unfree." Redundancies like "MIDI interface," "PIN number," "VIN number," "DSL line," and "ATM machine" just sound wrong. It's like saying that when the abbreviation was coined, there was no point in including the last word, because people will just say that whole last word anyway. How absurd. Sauronjim is correct. —Preceding unsigned comment added by an unknown user
I think Sauronjim and Unfree are both right. Saying "MIDI interface" is redundant and not, strictly speaking, correct. However, Wikipedia is supposed to be about what is right, not what is "generally accepted". We should, however, keep things understandable for readers. Having the type of object after the name makes it clearer what the object actually is. If you talk to someone and say "I went to the bank and got a pin.", they may think that the bank was giving out free pins for a promotion or something, whereas if you say "PIN number", they connect it to Personal Identification Number. In the same way, if not for the extra "interface" in "MIDI interface", a normal reader might not understand that it is indeed an interface. There is actually a name for this: RAS syndrome, or "redundant acronym syndrome syndrome". I think what we should do is, in places where you need to know that it is an interface, use "MIDI interface" (not "MID interface", as that makes it sound like a different thing altogether). Otherwise, use "MIDI". --vgmddg (look | talk | do) 02:15, 8 July 2010 (UTC)

Is the article referring to an interface for MIDI, or to MIDI in general? The little box which carries MIDI to and from devices would be a "musical instrument digital interface interface", as MIDI is the name of the protocol and the box is an interface for that protocol. — Preceding unsigned comment added by 86.144.127.187 (talk) 07:55, 25 May 2012 (UTC)

MP3 to MIDI

Is it possible to convert MP3 to MIDI format? I read elsewhere that it's not. JJMan 14:39, 17 July 2006 (UTC)

Try the free software Audacity. You can import MIDI files and export as PCM WAV or MP3. Great software, and it's free!
There are some programs that purport to do this, but it is by definition impossible to convert audible sound waveforms to MIDI commands 'perfectly'. Charlie Richmond 14:11, 17 July 2006 (UTC)
The denial above is expressed too weakly. I would say it is, for the foreseeable future, impossible to convert an MP3 (or WAV) file to MIDI with any kind of acceptable result. Only an extremely crude approximation might be feasible. −Woodstone 17:07, 17 July 2006 (UTC)
I think it depends on the recording. I'm sure you could trace, say, a piano solo with pretty high accuracy given the right tools, but a complex layered mix would be FAR harder. Any vocals would also hugely complicate the issue. Plugwash 01:27, 18 July 2006 (UTC)
There is no software available that can do the necessary pitch extraction from polytonal passages (i.e. music with more than one note sounding at a time). If you have a recording of a musical riff, played one note at a time, with no backing accompaniment, there are a number of professional tools which will extract note info into your sequencing environment. It works on unaccompanied vocals. Because piano solos tend to include many notes sounding together, these tools generally don't work on them. Infindebula 14:58, 11 January 2007 (UTC)
One example of actual pitch extraction from polytonal passages is a MIDI guitar; this is made possible by a hexaphonic pickup, in which six separate inputs are used to determine the note of each string. —Preceding unsigned comment added by 68.146.37.30 (talk) 08:48, 26 October 2007 (UTC)
Of course it's possible. You listen to the MP3, then recreate the song in your sequencer. --Komet 12:22, 28 May 2007 (UTC)
Sure... convert the MP3 back to raw audio, convert it to the System Exclusive format for a sampler and send the audio as a sound bank to the sampler using SysEx, then play the sounds back from the sampler using MIDI note-on messages. But that's ridiculous despite being theoretically possible. (Audacity goes the wrong way, from MIDI to MP3; the OP asked about MP3 to MIDI.) Generally speaking, MIDI describes a mere handful of the parameters by which sounds are created (and perhaps affected over time), with the responding synthesizer and the rest of the audio chain determining how to transform that into audio. Most audio recordings consist of almost inextricably intertwined superimpositions of many notes from many sources. While you might be able to infer pitch => note, amplitude => velocity (I have an old device that attempts this for a monophonic source -- a Roland VP-70), and maybe even timbre => channel mappings, to do so for a complex audio recording is not currently within the bounds of computer capabilities (as Plugwash said). Dave Brown 15:23, 27 July 2007 (UTC)
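As a rough, hedged illustration of the monophonic mapping described above (Python; the frequency-to-note formula is the standard equal-temperament relation with A4 = 440 Hz = note 69, while the linear amplitude-to-velocity scaling is an arbitrary assumption made for the sketch):

import math

def freq_to_midi_note(freq_hz):
    # Equal temperament: A4 = 440 Hz corresponds to MIDI note 69
    return int(round(69 + 12 * math.log2(freq_hz / 440.0)))

def amplitude_to_velocity(amplitude, full_scale=1.0):
    # Arbitrary linear mapping of signal amplitude onto the 1..127 velocity range
    return max(1, min(127, int(round(127 * amplitude / full_scale))))

# Example: a 261.6 Hz tone (middle C) at half of full scale
print(freq_to_midi_note(261.6), amplitude_to_velocity(0.5))   # 60 64

This only makes sense for an isolated, monophonic source, which is exactly the limitation the comments above describe.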

The answer is no. Unfree (talk) 05:25, 31 August 2009 (UTC)


Well, as of today, the answer is "Yes, but with conditions".

The main problem in my experience has been separating, and thereby identifying, the crucial pitch of a given audio sample from the inherent harmonics and other components that might exist, and that vary over time. A digital sample of a sound might today be made with much higher accuracy than a few years ago, but it is still only a representation, however close, of a sound that was most likely created in a wholly analog way. It seems to me that, although software can now convert audio to MIDI data more accurately than before, the results continue to fall short of an ideal because they often require "cleaning up" to remove unwanted data. For example, how much audio is so "pure" that it consists of nothing but a single sine wave?
Given that an MP3 version of audio commonly has sample rates, etc., that fall below the highest available accuracy for digital audio, it's reasonable to say that converting MP3 to MIDI is considerably more fraught and more work.
That's not elitist or purist, just an observation. Do I wish the reality were otherwise? Mostly no, because I believe that most musicians find the results of a played instrument easier to create and more rewarding than those built up digitally via MIDI. PårWöet (talk) 17:58, 19 May 2011 (UTC)

To convert audio data to notes (i.e. MIDI) you would detect frequencies in the input file using a DFT. Since the frequency content varies over time, you will need to cut the input file up into several blocks. Now Heisenberg's uncertainty relation makes it difficult, because you cannot know the frequency exactly if you also want to know exactly when the key was pressed... — Preceding unsigned comment added by 90.230.14.73 (talk) 16:50, 6 February 2012 (UTC)
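A minimal sketch of that block-wise approach (Python with NumPy, assuming a monophonic recording already loaded as an array of samples; the 4096-sample block length is an arbitrary choice, and picking the strongest FFT peak is a simplification, since the strongest partial is not always the fundamental):

import numpy as np

def blocks_to_notes(samples, sample_rate=44100, block_size=4096):
    # samples: 1-D NumPy array of mono audio
    notes = []
    for start in range(0, len(samples) - block_size, block_size):
        block = samples[start:start + block_size] * np.hanning(block_size)
        spectrum = np.abs(np.fft.rfft(block))
        peak_bin = int(np.argmax(spectrum[1:])) + 1      # skip the DC bin
        freq = peak_bin * sample_rate / block_size       # bin width = fs / N
        note = int(round(69 + 12 * np.log2(freq / 440.0)))
        notes.append((start / sample_rate, note))        # (time in seconds, MIDI note)
    return notes

With 4096-sample blocks at 44.1 kHz, each FFT bin is about 10.8 Hz wide and each block spans about 93 ms: finer pitch resolution costs coarser timing, and vice versa, which is the uncertainty trade-off mentioned in the comment above.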

All of the above ignores the likelihood that the OP didn't understand that MIDI is not audio. Dementia13 (talk) 04:28, 28 June 2012 (UTC)

dubious addition by anon

An anon recently added the following text, which I just removed:

Daisy chaining of multiple devices on a single port can be accomplished using the MIDI-THRU port. This technique is generally limited to three devices in a row due to latency. The latency is caused by the fact that MIDI-IN ports are optically isolated. Instead of making a physical connection to pass electrical signals, the electrical signals arriving at the MIDI-IN port turn an LED on and off, which is read by a photo sensor. This physical break in the flow of electricity prevents the forming of ground loops. It also introduces a small amount of latency. Generally, after traveling through three devices in a daisy chain, this latency becomes noticeable.

This seems to conflict with the following text from the MIDI-THRU article:

The MIDI-THRU port avoids this delay by linking the THRU port to the MIDI-IN socket almost directly. The MIDI-OUT port is then used only for signals originating in that device.

Plugwash 09:09, 24 May 2005 (UTC)


Also, the digression to discuss opto-isolation is unnecessary, considering there is already an article on optical isolators. -- wackyvorlon
Touching on opto-isolation in relation to the behaviour of MIDI Thru is entirely appropriate, as many people are curious or misinformed about the characteristics of MIDI Thru. It should be noted that since the MIDI spec dictates that the only active component between MIDI In and MIDI Thru is an opto-isolator, there will be no appreciable latency on data that appears at the Thru port. The only counter-effect of a long MIDI daisy chain is in fact distortion of the MIDI signal caused by going through numerous opto-isolators.
This is entirely relevant to MIDI users and even researchers, but beyond the scope of an article on optical isolators. Infindebula 14:35, 11 January 2007 (UTC)

oxymoron

"almost directly responsible" is an oxymoron.


An oxymoron is properly a deliberate rhetorical or literary device, not just any old contradiction in terms. I don't see that your phrase is even a contradiction in terms, however, although it does seem to me excessively equivocal. It would help to have some context. Where in the article does it occur? TheScotch 07:17, 17 February 2007 (UTC)

It doesn't occur anywhere. Unfree (talk) 05:30, 31 August 2009 (UTC)

DIN to D-SUB?

Does anyone know how to connect a 5-pin DIN MIDI port to a PC's 15-pin D-Sub MIDI/joystick port?

-=[ tqn ]=-

Edit: Never mind, found it: http://www.midi.org/about-midi/electrical.shtml

MIDI controller merge

I think MIDI controller should be merged into this article. At the very least it should have a section with a {{main|MIDI controller}}, but compared to other material, such as the specification section, I think it is not too much to merge entirely. Comments?

useless image

As of the current revision [1] (possibly much older), the following image appears at the start of the article:

 
[Image: Note names and MIDI note numbers]

This would be useful if the text inside the table could actually be read. But that's not the case, not even if you click on the image to see the enlarged version on the image page itself. I think that rather than this image, we should put the actual table as wiki-marked-up text somewhere in the article.

131.107.0.106 03:13, 8 June 2006 (UTC)

You can if you blow it up bigger. It's an SVG. See Image talk:NoteNamesFrequenciesAndMidiNumbers.svg. —Omegatron 21:27, 10 June 2006 (UTC)

I think the image should be removed from Wikipedia altogether, and perhaps replaced with one that has more legible text and omits the staffs, which are misleadingly positioned. See the comment on the picture itself. 216.162.240.250 19:45, 29 November 2006 (UTC) Greg

Have you noticed that all the note names and other stuff in the table is as for left-hand piano (lower voices on the right)? :) —Preceding unsigned comment added by T0ljan (talkcontribs) 13:56, 23 April 2008 (UTC)

I agree that it doesn't belong early in the article. A statement identifying the note number of concert A would suffice. Unfree (talk) 06:00, 31 August 2009 (UTC)
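For reference while the table question is sorted out, the mapping that image is meant to convey reduces to a simple formula. A small sketch (Python), assuming the common convention that middle C is C4 = note 60 and concert A is A4 = 440 Hz = note 69 (some manufacturers number the octaves differently):

NOTE_OFFSETS = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def note_name_to_midi(name, octave):
    # With C4 = 60, octave -1 starts at note number 0
    return 12 * (octave + 1) + NOTE_OFFSETS[name]

print(note_name_to_midi("C", 4))   # 60, middle C
print(note_name_to_midi("A", 4))   # 69, concert A (440 Hz)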

Confused

Sorry, I'm not sure I get the "MIDI" concept. Doesn't it just basically mean, it's touch sensitive? :/ TommyBoy76 18:40, 31 July 2006 (UTC)

Where did you get that idea? MIDI is the whole system of sending musical notes in digital form, as described in the article. How can we make it more clear? —Keenan Pepper 20:50, 31 July 2006 (UTC)
I guess something isn't just clicking with me. It happens :-). Anyway, it is "sending musical notes in digital form", eh? Break it down, if I may. Send notes where? And weren't they already in digital form? You know... it's not exactly an acoustic piano that produces fake sounds. TommyBoy76 01:00, 1 August 2006 (UTC)
You can send them wherever you want. MIDI is just the language used to describe them. They weren't necessarily in digital form to begin with: say you have a MIDI keyboard hooked up to a computer. You can press any key whenever you want, as hard as you want, and the keyboard has to digitize the time and the velocity in order to send them to the computer via the MIDI connection. —Keenan Pepper 02:33, 1 August 2006 (UTC)
Ahh, so it's for computer purposes? TommyBoy76 19:22, 1 August 2006 (UTC)
Not necessarily. There's lots of MIDI equipment I wouldn't call "computers". It has microchips, but that should be obvious from the "digital" part. MIDI is a language that different pieces of electronic musical equipment use to talk to each other. How can we improve the article to make that more clear? —Keenan Pepper 22:58, 1 August 2006 (UTC)
Are you asking for my advice or are you wondering why I don't understand... I'd be glad to tell you both, if you want. :-) TommyBoy76 02:41, 2 August 2006 (UTC)
They go hand in hand. You seem like a reasonably intelligent native English speaker, so if you can't understand the article there's something wrong with it. —Keenan Pepper 03:42, 2 August 2006 (UTC)

Well, I consider myself a bit technology savvy, and I have a big vocabulary, but all of these words are preeettyyy big. I dunno, I think it's just me, though. The article is very well put together and well organized. But, if you want my advice, as an individual just introduced to MIDI, I was confused with two things:

One, the words were huge. And two, I found some explanations were somewhat broad; using words such as "information". What information? Where does it go? What does that solve? TommyBoy76 17:12, 2 August 2006 (UTC)

I'd be pleased to help by replacing the 'huge' words with multiple smaller ones and making the broad explanations less so but need you to provide more specific information on exactly what words are huge and what explanations are too broad. I will see what I can do about 'information' now... Charlie Richmond 15:40, 4 August 2006 (UTC)
I mean, even the first sentence is a bit much.

Musical Instrument Digital Interface, or MIDI, is an industry-standard electronic communications protocol that defines each musical note or event in an electronic musical instrument or show device such as a synthesizer, precisely and concisely, allowing electronic musical instruments, computers and other show equipment to exchange data in real time.

Haha, what?? Anyway, I'd be glad to help you, but I'm not sure I have the patience, and I doubt you will either after you get done with me. :? TommyBoy76 18:20, 4 August 2006 (UTC)
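To make the "What information?" question above concrete: when a key is pressed, a MIDI keyboard sends just a few bytes describing the event. A hedged sketch of decoding one such message (Python; the example bytes are hypothetical, but the three-byte note-on layout, a status byte plus note number plus velocity, is from the MIDI 1.0 protocol):

def describe_note_on(msg):
    status, note, velocity = msg
    if status & 0xF0 != 0x90:
        return "not a note-on message"
    channel = (status & 0x0F) + 1                # channels are conventionally numbered 1-16
    return f"note-on: channel {channel}, note {note}, velocity {velocity}"

# Hypothetical message from a keyboard: middle C (note 60) struck fairly hard
print(describe_note_on(bytes([0x90, 60, 100])))

Where that message goes is up to the user: into a computer, a sound module, a sequencer, or anything else with a MIDI IN.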

Latency

Once and for all, it would be great if this article addressed the never-ending question of latency between a key being pressed on a MIDI keyboard and the final software sound being produced. My question is: where does this latency come from, and is it true that cheap on-board cards produce higher latency than expensive sound cards? If yes, why is that so? Thanks. --Abdull 16:19, 25 August 2006 (UTC)

Almost certainly the latency in a MIDI setup is smaller than the latency between depressing a key on an acoustic piano and the hammer striking the string. −Woodstone 19:50, 25 August 2006 (UTC)
In a studio-level setup this statement may be true, but many amateur musicians complain about lags in, as you describe, pressing a button and hearing the sound. So the question remains: where does the latency that some setups exhibit come from? --Abdull 09:22, 26 August 2006 (UTC)
Lags like that are not necessarily due to MIDI itself, and maybe problems like this could be better covered in other related articles. Computer, bus and disk speeds, auxiliary hardware that offloads some of the CPU's work, availability of multiple CPUs (and software that can use them) and so on can all be factors, and some may be out of reach of most home users' budgets. After you press that key, your software needs to interpret the MIDI signal, look up the appropriate samples or wave forms, apply the envelope matching the keypress, synchronise it with any tracks already recorded, apply any effects like reverb or EQ that have been enabled, perhaps record what just happened, and mix it all together and send it to the sound card, which may have a little processing of its own to do. --71.162.89.207 08:07, 26 November 2006 (UTC)
To answer the original question: Yes, it's true, and the reason has everything to do with audio buffer sizes and pretty much nothing at all to do with MIDI. Cheap sound cards aren't 'slower' than better sound cards (they all have to work in real time), but cheap on-board sound is handled through some OS service (such as DirectSound on Windows), whereas pricier sound cards for more serious applications typically use a special driver that is optimized for low latency (see the article on ASIO, for example). These serious applications (e.g. most virtual musical instruments and all DAWs [meaning Cubase, ProTools etc.], as opposed to MP3 playback or a computer game) let you freely set the buffer size, so you can set it as small as you can go without the sound breaking up.
Consider this: when your sound card is running at a sampling rate of 44100 samples per second (CD quality), and you feed audio data to the sound card in chunks of 128 samples (a common buffer size in serious applications), then after handing over one chunk you have less than three milliseconds until that reservoir of data is depleted and the next one is required! Every time a chunk gets to the sound chip late, you get a nasty glitch in the audio. Therefore a garden variety audio solution (with 'audio solution' I mean the combination of sound API, OS, sound driver, sound chip, and all the bits in between) will employ a larger buffer size, which results in higher latency.
As I said, this issue has nothing to do with MIDI per se (MIDI messages travel at light speed, pretty much) and I wouldn't put it in the article. There are more minor latency issues that would be relevant to the topic, once you get the separate issue of audio processing (which the question above refers to) out of the picture, but these are subtle details that would belong in a much more specific article. 86.59.11.23 12:42, 6 November 2007 (UTC)
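The buffer arithmetic above is easy to check; a quick sketch (Python; the buffer sizes are examples rather than fixed values, and the real round-trip latency of a system will be somewhat higher than the single-buffer figure):

sample_rate = 44100                       # samples per second (CD quality)
for buffer_size in (128, 512, 2048):
    latency_ms = 1000.0 * buffer_size / sample_rate
    print(f"{buffer_size:5d} samples -> {latency_ms:.1f} ms per buffer")
# 128 samples is about 2.9 ms, 512 about 11.6 ms, 2048 about 46.4 ms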
The problem is that latency is not a simple topic to cover. MIDI itself produces latency and delay because the transmission medium is rather slow. A MIDI message takes on the order of 0.96 ms to send. Most people have no idea what 0.96 ms means. It is considered to be below the level that can be perceived; it is on the order of the delay created by moving the listener a foot or so. It is much harder to hear the latency of individual notes relative to each other than the latency between two simultaneous notes. Professionals tend to develop much more acute perceptions, most likely because someone like a drummer spends so much time listening for small differences in latency. However, MIDI itself is unlikely to be a major source of latency in current-generation systems; it is much more likely that it is caused by drivers, the OS, the lack of real-time support in most OSes used for music, and delays created by recording software, virtual instruments, and plug-ins. That makes it a tough topic to cover in a MIDI article without delving into unrelated topics.
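The 0.96 ms figure quoted above follows directly from the MIDI 1.0 transmission rate; a quick check (Python; 31,250 bits per second and 10 bits per byte, i.e. one start bit, eight data bits and one stop bit, are from the MIDI 1.0 electrical specification):

baud_rate = 31250            # bits per second on a standard MIDI DIN connection
bits_per_byte = 10           # 1 start + 8 data + 1 stop bit
message_bytes = 3            # e.g. a note-on: status, note number, velocity
latency_s = message_bytes * bits_per_byte / baud_rate
print(f"{latency_s * 1000:.2f} ms per 3-byte message")   # 0.96 ms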
I have just edited the USB section with regard to latency and jitter.

MIDI over USB has a pretty bad effect on timing. Usually there is a latency of a few milliseconds, although this can become as bad as 15+ milliseconds. Added to that is a jitter component that is usually contained to ±5 ms or so. This is all due to how USB packetizes things. It is not really designed for real-time applications that matter to the millisecond. For people who think that it's too small to be noticeable, think of this: if I regularly nudge musical elements by fractions of a millisecond to get a certain feel, how do you think a jitter of ±5 ms sounds to me? -grin- The piano and hammer argument is only partially valid because at least you know exactly what the latency will be. There will certainly be no jitter introduced. —Preceding unsigned comment added by 83.87.229.8 (talk) 14:16, 11 September 2010 (UTC)

Even a latency of 20ms is noticeable to me when playing a reasonably good quality sound card like my old SoundBlaster Live 5.1 Digital with my old cheap half-size 3-octave keyboard (which has no velocity) through the MIDI interface, but when playing you can easily compensate for the delay. A latency of 100ms or more, which is usual with onboard sound chips and even with that sound card using the manufacturer-supplied drivers (I've used the kX Project EMU10K-chip drivers for that reason), is nearly unplayable. MIDI itself is not to blame, as noted, but rather the sound chip in the computer, the drivers used to control the chip, and the cheap electronics in the keyboard (or any other MIDI device you might have). Jitter would be more annoying than latency, if latency were not an issue. When it is, it does not really matter much anymore, except in other applications like controlling external MIDI devices, where it must become a nightmare. 91.155.202.202 (talk) 20:25, 20 September 2010 (UTC)

MIDI and game port

Following MIDI's official design for DIN-MIDI-on-game-port adaptors on this page and comparing it to this and this pinout, I'm surprised that MIDI OUT is sent through pin 12 (ground) and MIDI IN through pin 15 (+5 VDC), which in my opinion means the data is sent to nirvana. Why does it still work? --Abdull 09:22, 26 August 2006 (UTC)

Because the two pages you refer to only deal with joystick applications and pay no attention to MIDI, and they both get it wrong by labeling pin 12 as ground and pin 15 as +5V. No doubt the people who wrote those pages knew that those pins were not ones that any joystick connects to, so they measured the voltages with nothing connected, got those values and assumed that's what they were. Such is the accuracy one generally finds in random web articles.... Charlie Richmond 18:17, 26 August 2006 (UTC)
Thanks for your answer. So it's better not to use these diagrams to build your own adaptor if you don't want to toast your equipment. --Abdull 11:47, 2 September 2006 (UTC)
Those diagrams give no information whatsoever that is useful for building a joystick-MIDI port adapter, since the MIDI connections are neither identified nor labeled correctly, so there's no way anyone could even start to build such a device using those drawings. Because pins 12 and 15 are actually mislabeled, someone might run into difficulty building a joystick application, but it's unlikely anything would be 'toasted'. Charlie Richmond 15:04, 2 September 2006 (UTC)

I removed this external link:

I agree with you completely. Charlie Richmond 18:23, 29 August 2006 (UTC)
  • There seem to be other links to MIDI file sites that keep getting put on the page repeatedly as well. Perhaps we could work on a consensus policy that only links about the technical side of MIDI be put on the page and post it informationally at the top. Charlie Richmond 22:21, 3 September 2006 (UTC)

I've added this as a note in the HTML at the top of the external links section: Note: Any links added to this section should contain material about the technical aspects of MIDI. Please do not add links to sites offering MIDI files. --Ianmacm 06:33, 4 September 2006 (UTC)

I noticed there are a few redundant sites in this section. I've tried to put them in some order and I've deleted a few spam sites with no relevance to this topic. I have also added an exhaustive link with professional texts from the Grove encyclopedia. Alegreen 18:24, 16 October 2006 (UTC)

Midi Compression

Can anyone add some information about how MIDI is compressed? Timmah01 02:35, 13 November 2006 (UTC)

MIDI is already inherently extremely compressed, so are you asking how it is done now by definition, or how it can be compressed even further? The article basically describes the current method, by which a very small amount of MIDI data completely defines entire musical pieces through a standard language of compressed commands. Charlie Richmond 08:28, 13 November 2006 (UTC)
Actually, MIDI data isn't "inherently extremely compressed" at all, from an information-theoretical point of view. A typical musical performance or score stored in a MIDI file has a fair measure of redundancy, so a utility such as gzip can compress such a file by a much greater factor than, say, an MP3 file (which is already compressed and thus has little redundancy). It's just that MIDI data has a very low bit rate to begin with, so it's practically never compressed because it's just not worth the trouble. The MIDI standard doesn't include anything regarding compression of a sequence of MIDI messages. However, it does include provisions for making such a sequence less redundant in the first place. For example, there is 'running status', which means you can leave out the status byte for a message if it would be the same as the status of the previous message. To make this more effective, you can send a note-on message with a velocity of 0 instead of a note-off message. Thus, when a couple of notes are played (and released) on just one channel and nothing else happens, every single message can have the same status (note-on for the channel in question) and the status byte needs to be sent/stored just once. 86.59.11.23 11:33, 6 November 2007 (UTC)
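A byte-level illustration of the running-status point above (Python; the three notes and the velocity are a made-up example, while the encoding rules themselves are from the MIDI 1.0 protocol):

# Three notes (C4, D4, E4) played and released on channel 1, velocity 100.
# Without running status, alternating note-on (0x90) and note-off (0x80):
plain = bytes([0x90, 60, 100, 0x80, 60, 0,
               0x90, 62, 100, 0x80, 62, 0,
               0x90, 64, 100, 0x80, 64, 0])
# With running status, sending the 0x90 status once and using velocity 0 as note-off:
running = bytes([0x90, 60, 100, 60, 0,
                 62, 100, 62, 0,
                 64, 100, 64, 0])
print(len(plain), len(running))   # 18 13 -- about a 28% saving in this toy case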
Let's not confuse the interface with the data it sends and receives. MIDI data may be compressible, but it makes no sense to compress the conceptual framework itself. Unfree (talk) 06:06, 31 August 2009 (UTC)

History

Maybe someone with a good historical knowledge of MIDI can create a History section.

Windows XP users and MIDI playback

OK, I'm a Windows XP user, and in the Windows Control Panel, under the 'Sounds and Audio Devices' icon, on the 'Audio' tab, at the very bottom there is a "MIDI music playback" option. Here it gives me a default device to use.

Should there be a section of the article on such integrated devices that are widely used for MIDI? They are present even on modern operating systems. Which ones are commonly found? I for one am trying to look up more information about them and how to load or change to new ones. Search engines like Google do little to help, and I thought I might find the information here on Wikipedia, but alas I cannot. Nagelfar 04:51, 7 December 2006 (UTC)

PC sound cards have supported MIDI playback for a long time because in the early days of PC sound, software synthesis would have been either impossible or at least an unacceptable CPU load for things like games. Many sound cards still do, and the Windows API and control panels for using them are still there. There are also software synths that look to Windows like a MIDI device but really do the synthesis in software, and it looks like MS has decided to include one with XP (there are also third-party ones, but I don't know the names of any off the top of my head). Plugwash 02:14, 8 December 2006 (UTC)


This facility can also be used to redirect MIDI output to any software or hardware synthesizer. For example, if you had a hardware synthesizer attached to your PC, you could force Windows to use it as your default MIDI synth, which is what most media players will try to use to play back a MIDI file. —Preceding unsigned comment added by 68.146.37.30 (talk) 15:17, 26 October 2007 (UTC)

'Sound Samples' section is superfluous

How are the drum samples at the end of this page relevant to the topic? They have little to do with MIDI other than that they could potentially be controlled by MIDI (but so could anything). They are likely to confuse beginners by reinforcing the misconception that MIDI is sound. I think they should be removed. Brentt 22:14, 25 February 2007 (UTC)

They seem to be audio recordings of standard MIDI playback on standard MIDI voices. That makes sense to me. --Cheeser1 21:58, 12 October 2007 (UTC)
No! These are .mid files, not audio recordings. Describing them as 'sound samples' is indeed confusing. The section would be relevant to the 'General MIDI' article (if something has 'standard MIDI voices', it's GM; certainly not MIDI itself), but, like Brentt, I don't see the relevance here. If there are editors who would like to keep this material in the article, maybe someone could fix the section by providing an accurate description of what these files actually are and how they relate to the topic? Otherwise I'll remove the section (or I hope someone else will remove it in case I forget); as it is, it doesn't belong here.
For now I'll change the section heading to "Example MIDI files" ("Sound Samples" is simply factually wrong) and remove the "description page" links, as the descriptions themselves have to be corrected (they also state, wrongly, that the files are "sound samples"). 86.59.11.23 11:47, 6 November 2007 (UTC)

Dave Smith? What about Jim Miller???

Personal Composer, the world's first music/MIDI software for a desktop computer, was written by Jim Miller, who set up for business in his garage in 1983. Jim advanced the DOS-based program, through several releases, to Version 3.3. He was working on a Windows version at the time of his death in 1991. —The preceding unsigned comment was added by 75.71.66.14 (talk) 02:51, 6 March 2007 (UTC).

What is the relevance of MIDI software if there is no MIDI itself? (infindebula - i'm not signed in, 2007/4/25) —The preceding unsigned comment was added by 204.138.85.6 (talk) 13:57, 25 April 2007 (UTC).

Give credit where it is due...

I understand that Yamaha (associated with Dave Smith) sued Jim Miller for the rights to Personal Composer's software and didn't succeed. I see Dave Smith listed as "the father of MIDI" ... I have done research that informs me that isn't true:

Personal Composer, the world's first music/MIDI software for a desktop computer, was written by Jim Miller, who set up for business in his garage in 1983. Jim advanced the DOS-based program, through several releases, to Version 3.3. He was working on a Windows version at the time of his death in 1991.

Is there anyone who knows about this too or is everybody convinced that DS is "the father" or is this partially BS? —The preceding unsigned comment was added by 75.71.66.14 (talk) 03:01, 6 March 2007 (UTC).


And how do you think one can design sequencing software without knowledge of the underlying messaging protocol and actual electronic musical instruments to connect with? --91.76.189.195 17:35, 2 April 2007 (UTC)
Then Miller is the nephew of MIDI. MIDI predates him. Erudecorp ? * 06:59, 30 October 2007 (UTC)
Properly sourced references would clear up this and many other disputes on this page. The problem with the original statement is this line: "the world's first music/MIDI software for a desktop computer". MIDI was present in professional musical instruments before home desktop computers were commonly available. Dave Smith was involved with the development of the MIDI standard. By 1983, musical instrument manufacturers had finalized the MIDI standard, and Sequential Circuits and Yamaha already had MIDI-capable synthesizers on the market. Dementia13 (talk) 04:23, 28 June 2012 (UTC)

Splitting/Archiving

I have split off MIDI usage and applications and The MIDI 1.0 Protocol into separate articles. This has gotten rid of the "this article is getting large" warning. Kc4 17:17, 7 April 2007 (UTC)

I have also archived older discussion threads. Kc4 17:24, 7 April 2007 (UTC)

Output

On a computer, it's even possible to change what MIDI music will sound like. What's the term for this (a SoundFont?)? Should it go in this article? Erudecorp ? * 09:51, 30 October 2007 (UTC)

MIDI is a communications protocol, which the article explains. It's not at all concerned with how the audio output will sound. You're probably thinking of General MIDI, which the article mentions. However, I guess the distinction isn't really made clear by the article ... I'm not an expert in this area, so I'm not particularly inclined to attempt to fix this myself, but I have a suggestion for a more daring editor who might want to help clarify the matter: As this is such a common misconception, I'd tackle it right at the top, in the paragraph starting "MIDI does not transmit an audio signal or media ..." and add something to the effect that the same sequence of MIDI messages can be interpreted by synthesizers in many different ways, producing different sounds, and that just from the MIDI messages per se there's no way to tell what exactly you'll hear. 86.59.11.23 12:00, 6 November 2007 (UTC)
Basic concepts and common misconceptions definitely need to be made clearer right in the introduction. Something like this (but in prose):
But then why does .mid redirect to Musical Instrument Digital Interface instead of General MIDI? I'll fix it. Erudecorp ? * 03:18, 4 December 2007 (UTC)
You misunderstood -- a .mid file may be interpreted according to the GM standard for playback, but then again it might not. .mid files you find on the web are generally meant to be played that way, but other .mids can just as well be intended to be interpreted in a different way, e.g. controlling a specific synthesizer at specific settings, or even lighting instead of devices that make sound. But either way, it's supposed to conform to the MIDI standard, thus the original redirect was correct. I've reverted your change. That aside, you're right that better addressing common misconceptions would be great. 86.59.11.23 (talk) 14:39, 29 December 2007 (UTC)
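One concrete way to see the distinction drawn above: a MIDI program change message only carries a number, and it is the General MIDI convention, not MIDI itself, that attaches an instrument name to that number. A hedged sketch (Python; program 0 = Acoustic Grand Piano and program 40 = Violin are from the GM 1 sound set, and what a non-GM device does with the same bytes is entirely up to that device):

GM_PROGRAMS = {0: "Acoustic Grand Piano", 40: "Violin"}    # tiny excerpt of the 128-entry GM 1 list

def describe_program_change(status, program, assume_gm=True):
    channel = (status & 0x0F) + 1
    name = GM_PROGRAMS.get(program, "unknown") if assume_gm else "device-defined"
    return f"program change on channel {channel}: program {program} ({name})"

print(describe_program_change(0xC0, 40))                   # a GM player would pick a violin sound
print(describe_program_change(0xC0, 40, assume_gm=False))  # any other device: whatever it likes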

Merger

I just merged the content of MIDI Composition as suggested. It seemed to fit well within the scope of this page, and I placed it in what I thought was the most appropriate location. I followed the appropriate steps as laid out by Wikipedia. —Preceding unsigned comment added by Deep-fried twinkie (talkcontribs) 16:26, 15 November 2007 (UTC)

Wanted

Article on pitch to MIDI. -Zahd (talk) 00:33, 26 July 2008 (UTC)

Is MIDI on the way out?

Just looking around at the various electronic home keyboards available now, I'm amazed to discover that very few actually have MIDI ports. In fact, it looks like none of the current Yamaha or Casio models have MIDI sockets. Does this suggest that MIDI is on the way out? Or is it still widely used in professional circles, and just vanishing from home kit? 93.96.93.12 (talk) 08:45, 6 September 2008 (UTC)

Reply: Almost all of these keyboards still use MIDI 1.0 protocol messages, and send/receive them via the USB port (which more and more keyboards sport all the time). This illustrates that MIDI 1.0 technology has two different aspects, the message protocol vs. the physical transport, and the protocol aspect is increasingly being used with a different transport aspect. Perhaps even an invisible transport -- MIDI 1.0 message protocol over UDP/IP/WiFi may do away with all external evidence of the physical transport. —Preceding unsigned comment added by 67.101.151.13 (talk) 10:32, 10 December 2008 (UTC)

I'll second that. I think it's a safe bet to say that the MIDI message protocol is here to stay, because what would you replace it with? Anything proprietary is not going to have industry support. Given how ubiquitous USB is now, how many ports are available on a typical PC (12 ports on many computers), and how we are increasingly seeing keyboards with USB MIDI ports, my bet is that USB MIDI is the wave of the future. At 480 Mbit/s, any delay and latency currently being encountered must be due to software overhead. Current-generation processors with low-latency drivers can send and receive MIDI messages over USB with delay and latency of less than half that possible with the native MIDI transmission medium (due to the speed limits of the MIDI transport). The delay and latency will continue to shrink with each new generation of processors. While the USB cable length limit has the potential to be a problem, extenders using CAT5 cable have demonstrated the ability to reach 50m lengths. —Preceding unsigned comment added by 216.165.139.218 (talk) 00:11, 3 July 2010 (UTC)

MIDI THRU edits

I don't think the information just put in about MIDI THRU and MIDI being based on a ring network topology is correct. The original MIDI spec did not require that equipment echo messages not intended for itself to its own MIDI OUT port (essentially acting as a merger). Nor did the spec say that a network should be created with a ring of connections going from OUTs to INs. Finally, the MIDI THRU port was part of the hardware spec right from the start as I recall, not something that some manufacturers added later. In essence the whole MIDI concept was much more free-form, with users allowed to connect things in any way that seemed logical and that would work for them, with no requirement to network anything. The simplest example is the simplest form of keyboard sending MIDI to a synthesiser or computer to generate the sound desired: one cable, OUT to IN, no network required. —Preceding unsigned comment added by Charlierichmond (talkcontribs) 02:44, 22 September 2008 (UTC)

Reply: I agree with Charlierichmond, ring technology is a misleading reference which the MIDI 1.0 specs do not themselves make, and to which MIDI 1.0 doesn't really correspond. No receiver has any idea what other receivers down the line will or will not respond to, in fact it is very common to set multiple receivers to the same channel so that one sender reaches all of them with a single message on that channel; as a result, no receiver is able to (nor should it) remove anything from the stream merely because it plans to respond to it. Further, in MIDI 1.0 the ring is basically never closed, all you have for sure is one or more point-to-point connections which may or may not be daisy-chained in some user-determined, ad hoc, non-normative manner. I recommend removing the new material in that passage, probably rolling back to the immediately previous version. Of course I could be wrong, but if the author of the ring topology changes thinks so, then I would surely appreciate a supporting citation or two -- and if so, then thanks in advance for that...? —Preceding unsigned comment added by 67.101.151.13 (talk) 10:45, 10 December 2008 (UTC)

Restored history section

Since this article seemed to be curiously lacking in facts about the history and development of MIDI (too many details to be the result of mere omission or oversight), I checked through the revision history. Turns out the problem was related to a vandalism spree on 23 September 2008 by 204.85.12.6 (talk), an IP address of the North Carolina Research and Education Network in the Research Triangle Park. The relevant edits were made at 16:55, 16:57, 16:58, 16:58, and 17:03. Despite subsequent efforts to fix the resulting mess, the entire history section remained missing. I'm restoring it now, complete with cleanup tag. It should at least be easier to cleanup the section than to rebuild it from scratch. —Error -128 (talk) 00:02, 7 May 2009 (UTC)

EWI

"EWI, which stands for Electronic Wind Instrument, is designed for performers who want to play saxophone, clarinet, oboe, bassoon, and other wind instrument sounds with a synthesizer module"

No it isn't! It is designed for woodwind players who want to be able to play synthesised or sampled sounds using the playing and expressive techniques which they already know. An EWI doesn't have to be used with a realistic or imitative sound, and I have heard remarkable "guitar solos" played on a Yamaha WK woodwind controller! —Preceding unsigned comment added by 79.78.46.138 (talk) 13:15, 30 June 2009 (UTC)

The Spacing and Screen Issue.

Why does this article have to have gross errors with regard to spacing? Also, there is a part that causes the screen to become very large. Please, someone fix it. It is an eyesore.

Some bot-like edit added spaces to start paragraphs. This is not acceptable for wiki formatting. Removed now. −Woodstone (talk) 16:38, 1 July 2009 (UTC)

Files

Any discussion related to files ought to be removed from this page. MIDI refers to an interface, not a file format, and the standardization of MIDI files occurred independently of the standardization of MIDI itself. Unfree (talk) 05:13, 31 August 2009 (UTC)

Messages

Because of their fundamental role and conceptual importance, introducing MIDI messages ought to occur very early in the article, perhaps in the introductory paragraph, along with the idea of sending and receiving them. As it stands, such a discussion occurs too late. Also, "MIDI channel message" is discussed without distinguishing it from "MIDI message" itself -- a mistake, in my opinion. Unfree (talk) 05:57, 31 August 2009 (UTC)

I agree that MIDI messages are the heart of what MIDI is about ... especially for people trying to work with it practically speaking. The article seems pretty overloaded with details already, and one way to get more detailed is to move the bulk of that coverage to separate (existing?) articles. The existing page is already 50K, and covering messages in any depth could easily go twice that. Right now the article needs a whole lot of work on citations, because it hasn't got many. Twang (talk) 04:32, 3 July 2010 (UTC)

Overhaul of history section

I just overhauled the history section a bit because it seemed overloaded and unfocused on the essentials.

  • Moved the opening section on the MMA to the top of the Usage section where it flows well into what follows there. The MMA is important in a regulatory sense but a long discussion at that point distracts from the problems that MIDI was created to solve. I inserted a link to the new MMA location.
  • There was a long segment on MIDI's small file size; I left the part about how important that was in the early days of computing and slow net speeds -- essential to MIDI's success on the net. Rather than toss the rest, I moved it to the bottom of the Standard MIDI (.mid or .smf) section. That probably needs more work.
  • I rewrote the section on how MIDI 'sounds bad' on some hardware. MIDI doesn't specify timbres, it controls them. The limitation in hardware is the cost of producing sophisticated timbres. In the 90s, most computers were unable to handle dozens of streams of samples (which is exactly what MIDI is good at controlling). I briefly explain how General MIDI was developed to introduce SOME standardization in what sounds to expect; the resulting timbral quality was, and is, still up to manufacturers.
  • I think the result leaves the section less cluttered, and leaves room for some more development. For example, several people besides Smith were involved in the early days!! Twang (talk) 04:22, 3 July 2010 (UTC)

Other applications

The text mentions theatre lighting as a purpose of MIDI. I guess this is a mistake, isn't it? In general, DMX is used to control lighting, especially since it is a bus system instead of a point-to-point connection. What do you say? —Preceding unsigned comment added by 93.209.226.232 (talk) 18:08, 17 August 2010 (UTC)

It is NOT a mistake. Since MIDI is just a bunch of bytes sent down a cable, nothing prevents anyone from creating a device that interprets those bytes in other ways. A friend and I designed the very first MIDI-controlled light dimmer in 1985-1987. It was called the IDP 612, had 6 channels at 1200 watts per channel, and was demoed in our booth at the Chicago CES show. Our financial partner had major financial problems (not related to IDP) that brought the startup to a halt. We built 1 to 2 dozen dimmer packs, and some were used for a very long time by a local band. See MIDI SysEx; USA manufacturer ID number 02 was our startup, IDP. http://www.hinton-instruments.co.uk/reference/midi/protocol/pg04.htm
Sbmeirow • Talk • 16:43, 20 January 2011 (UTC)
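For readers unfamiliar with the mechanism described here: a System Exclusive message is an arbitrary, manufacturer-defined payload bracketed by 0xF0 and 0xF7, with the manufacturer ID as the first data byte. A hypothetical sketch (Python; the 0x02 ID is the one mentioned in the comment above, but the dimmer payload bytes are invented for illustration and are not the actual IDP format):

def build_sysex(manufacturer_id, payload):
    # SysEx frame: 0xF0, manufacturer ID, data bytes (each < 0x80), 0xF7
    assert all(b < 0x80 for b in payload), "SysEx data bytes must have bit 7 clear"
    return bytes([0xF0, manufacturer_id]) + bytes(payload) + bytes([0xF7])

# Hypothetical: "set dimmer channel 3 to level 90" for manufacturer ID 0x02
msg = build_sysex(0x02, [0x03, 0x5A])
print(" ".join(f"{b:02x}" for b in msg))   # f0 02 03 5a f7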

RTP-MIDI transport protocol

The article states the following about Apple's RTP-MIDI implementation:

Since the session protocol uses a UDP port different from the main RTP-MIDI stream port, the two protocols do not interfere (so the RTP-MIDI implementation in Mac OS X fully complies to the IETF specification).

This is not true. As far as I know there is no public information on the implementation. However, if you use Wireshark (with the built-in applemidi dissector) to inspect the UDP packets, you will notice that the session-protocol packets are sent on both ports. Synchronization packets ("CK") are shared with the RTP port; session invitation, acceptance, and refusal ("IN", "OK", "NO") happen on both ports. Implementors have to look at the first two bytes of the packet to find out which protocol to use. The sentence should either be removed, or a reliable source should be cited to prove me wrong :) 80.149.209.52 (talk) 11:34, 12 January 2011 (UTC)
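Following the observation that implementors must inspect the first two bytes, here is a hedged sketch of that check (Python; the 0xFF 0xFF marker for session-protocol packets reflects how the AppleMIDI control messages are commonly described, and should be verified against an actual packet capture before being relied upon):

def classify_packet(data):
    if len(data) >= 4 and data[0] == 0xFF and data[1] == 0xFF:
        command = data[2:4].decode("ascii", errors="replace")   # e.g. "IN", "OK", "NO", "CK"
        return f"session protocol packet, command {command}"
    return "assume RTP-MIDI data packet"

# Hypothetical invitation packet: 0xFF 0xFF marker followed by the "IN" command
print(classify_packet(bytes([0xFF, 0xFF]) + b"IN" + bytes(12)))
print(classify_packet(bytes([0x80, 0x61]) + bytes(10)))   # an RTP header starts with the version bits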

Alesis Quadrasynth and subsequent synthesizers MIDI connectivities

The Alesis QS range of synths, which goes back to 1993, had (has?) two methods for MIDI connectivity. As well as the usual 5-pin DIN method for connection to other MIDI devices (including PC and Mac computers), there is also a serial port reserved for connection to a computer's serial port, switchable between PC and Mac specifications, which operates not at 31.5 but at either 31.25 or 38.4 kilobaud. I mention this because there doesn't seem to be a mention of it in the article, but as far as I know it is still implemented. The synthesizer interface is DIN-8. It means you don't require an additional MIDI interface for the link, and it avoids the latency issue that some people have mentioned USB has. 184.41.39.44 (talk) 01:43, 16 March 2011 (UTC)

Definition of MIDI only partly correct!

The MIDI specification is not only about the protocol.

The MMA Complete MIDI Specification makes it quite clear that it covers both hardware and software. i.e. the DIN 5 pin interface, asynchronous transfer, start/stop bits, 5v current loop, bit rate, cable length as well as the data format, or protocol if you will. There is an extremely good introduction in the MIDI spec. which I'm prepared to paraphrase and submit for approval, if you wish. BTW I have a copy of the complete MIDI spec (including GM) as well as the GM2 spec. legally acquired.

The article IMO should make clear that the MIDI spec. is created and owned by the MMA as opposed to being an open standard.

Whilst USB may have become the favourite interface for transmitting MIDI data, nowhere is it mentioned in the MIDI specification. I recall reading about encapsulating MIDI messages within the USB frame structure some time ago.

One part of the MIDI spec. is a section on MIDI Time Code. I don't recall anything in the article about timing. Am I wrong?

The structure of SMFs is part of the spec. and IMO should be included in the main article on MIDI, or at least a description of their structure should be. It is, after all, just a way of capturing MIDI data (or the protocol, if you prefer) with timing messages for playback.

Likewise MIDI Show Control and MIDI Machine Control are part of the spec.

Comments welcome. SysExJohn. SysExJohn (talk) 14:27, 29 August 2011 (UTC)

SysExJohn, in Wikipedia you don't need prior approval to improve articles. Just be bold and make changes or add what you think needs to be included. Comments may follow if people disagree. −Woodstone (talk) 15:16, 29 August 2011 (UTC)

Multiple issues

Additional citations

Why and where does this article need additional citations for verification? What references does it need and how should they be added? Hyacinth (talk) 02:35, 4 December 2011 (UTC)

Copy editing

Why and where does this article need copy editing? How should it be cleaned up? Hyacinth (talk) 02:39, 4 December 2011 (UTC)

Personal reflection or essay

Why and where is this article written like a personal reflection or essay? How should it be cleaned up? Hyacinth (talk) 02:39, 4 December 2011 (UTC)

Wikipedia:Writing better articles is a good place to start your investigation. Dementia13 (talk) 22:23, 2 July 2012 (UTC)

Chiff lead

What is a chiff lead? [[JesusGTAFAN]] — Preceding unsigned comment added by 190.22.236.126 (talk) 02:05, 31 December 2011 (UTC)