Musical Instrument Digital Interface, Polyphonic, and Multi-Timbral Instruments

Summary
This paper "Musical Instrument Digital Interface, Polyphonic, and Multi-Timbral Instruments” reviews the basics of Musical Instrument Digital Interface. It also captures the related aspects of MIDI-related musical notation software and its repertoire of MIDI messages…

Table of Contents
1.0. Musical Instrument Digital Interface (MIDI)
1.1. Message structures in MIDI
2.0. Polyphonic and Multi-Timbral Instruments
3.0. MIDI data
3.1. Parsing MIDI data
Figure 1: MIDI data and MIDI instrument transfers
Figure 2: MIDI data represented as a list of instruments, each with its own 'piano roll'
Table 1: Groups of MIDI data

1.0. Musical Instrument Digital Interface (MIDI)

MIDI electronically links the different entities involved in the creation, storage, or performance of music. As the MIDI Manufacturers Association (2002) explains, MIDI is the centerpiece of the whole paradigm of 'note-oriented digital representation of performance of music' (p. 25). This section reviews the basics of MIDI. It also covers the related aspects of MIDI-based musical notation software and its repertoire of MIDI messages.

Based on the definition given by the MIDI Manufacturers Association (2002), the implication is that the interface incorporates a standard physical, logical, and syntactic specification which can link a MIDI controller to one or several MIDI sound modules so that automated performance of music becomes possible. The definition also helps in understanding the MIDI keyboard controller. Two classical units make up the MIDI controller category. The first is the MIDI interface-equipped musical keyboard; this means just the keyboard proper, contrary to the notion that it is the entire instrument comprising both a music synthesizer and a keyboard. Rumsey and McCormick (2002) refer to this simply as the 'keyboard'. The second unit is the MIDI sequencer. According to Dixon (2001), this unit can automatically emit, over a MIDI interface, a stream of information (in most cases instructions) describing a musical performance; the stored performance is thus the MIDI sequence. It has to be noted that a sequencer can be any hardware device able to read such a sequence from storage, for example from a floppy disk. Although it is common practice to generalize the function of the MIDI interface in terms of its definition and these two classical units (where a performance is transmitted from the controller to the sound module), Dixon (2001) argues that MIDI stretches beyond that: it also applies in settings such as transmitting a performance from a keyboard to a sequencer so that the performance can be kept for later automatic rendition.

The transmission format is another area that bears directly on the understanding of MIDI. As McKay and Fujinaga (2006) put it, the format is asynchronous (it works on a start-stop basis) and runs at 31.25 kbaud. If characters are not being sent at the maximum rate, the stop bit of a given character is prolonged (that is, the line remains in the idle condition) until the next character begins.
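The timing implied by this format can be made concrete with a short sketch. The following Python fragment is a minimal illustration only: it assumes the usual framing of one start bit, eight data bits, and one stop bit per character, and its names are illustrative rather than drawn from any library.

```python
# Minimal sketch: timing of asynchronous MIDI transmission at 31.25 kbaud.
# Assumes 1 start bit + 8 data bits + 1 stop bit = 10 bits per character.

BAUD_RATE = 31_250        # bits per second on a standard MIDI link
BITS_PER_CHARACTER = 10   # start bit + 8 data bits + stop bit

def transmission_time_s(n_characters: int) -> float:
    """Seconds needed to send n_characters back to back at the full rate."""
    return n_characters * BITS_PER_CHARACTER / BAUD_RATE

if __name__ == "__main__":
    print(f"One byte:           {transmission_time_s(1) * 1e6:.0f} microseconds")
    print(f"Three-byte message: {transmission_time_s(3) * 1e3:.2f} milliseconds")
```

At this rate a single byte takes about 320 microseconds, so a full three-byte message occupies just under a millisecond of the link.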
1.1. Message structures in MIDI

MIDI information is carried in the form of MIDI messages. A message consists of up to three related bytes: the first is the status byte, which indicates the type of message; the second is a data byte, which carries the parameters of the message; and a third byte, if present, is also a data byte serving the same function. Based on this message structure, it is possible to predict what the future holds for MIDI. First, the original MIDI specification has changed considerably since it was first created, partly because some of its original bytes were not defined. Secondly, the architecture of MIDI does not allow for any expansion without first redesigning the system, so any future design should be backward compatible and operate with legacy MIDI hardware. On the other hand, as McKay and Fujinaga (2006) have noted, the major enhancement to General MIDI is the ability to send sample data together with a standard MIDI file in the Downloadable Sounds (DLS) format. Certainly, not all sound modules or cards are equipped to deal with this information.

2.0. Polyphonic and Multi-Timbral Instruments

The best way to understand polyphonic and multi-timbral instruments is to first understand two operational concepts of MIDI. The first is that most MIDI messages are tagged with a channel number. The second is that many MIDI sound modules can emulate, in most cases simultaneously, a wide range of musical instruments with different timbres (tones). As a result, the channel tag system allows the note messages created for these different virtual instruments, all travelling across the same interface, to be separated when they reach their destinations. Linking these concepts to polyphonic and multi-timbral instruments: any musical instrument (including the MIDI sound module discussed above) that can sound only one note at a time, such as an actual trumpet, is described as monophonic. An instrument that can sound more than one note at a time, such as an actual piano, is categorized as polyphonic, or as capable of polyphony (McKay and Fujinaga 2006). A sound module that can contemporaneously (at the same time) act as several different instruments, each with a different timbre, is categorized as multi-timbral, or as having the property of multi-timbrality.
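The status byte and data bytes described in section 1.1, together with the channel tag discussed above, can be illustrated with a minimal sketch. The following Python fragment decodes a single three-byte channel voice message; the helper names are illustrative only and are not taken from any MIDI library.

```python
# Minimal sketch: decoding a three-byte MIDI channel voice message.
# The status byte carries the message type in its upper nibble and the
# channel number in its lower nibble; the remaining bytes are data bytes.

MESSAGE_TYPES = {
    0x80: "Note Off",
    0x90: "Note On",
    0xC0: "Program Change",
    0xE0: "Pitch Bend",
}

def decode_channel_message(msg: bytes) -> dict:
    """Split a channel voice message into type, channel, and data bytes."""
    status = msg[0]
    return {
        "type": MESSAGE_TYPES.get(status & 0xF0, "Other"),
        "channel": status & 0x0F,   # channel tag, 0-15
        "data": list(msg[1:]),      # e.g. note number and velocity
    }

if __name__ == "__main__":
    # Note On, channel 0, note 60 (middle C), velocity 100.
    print(decode_channel_message(bytes([0x90, 60, 100])))
```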
3.0. MIDI data

The definition provided earlier helps in understanding that MIDI is a technology that permits electronic musical instruments to transmit data to and from each other. These data are transmitted by sending or receiving performance information, and it is within the context of performance information that MIDI data can be conceptualized. Performance information is, in effect, the MIDI data that can be transmitted between MIDI-compatible musical instruments or saved as a file for later playback. Fujishima (2008) argues that regardless of the computer or instrument used, MIDI data will reproduce the original performance information, although the actual sound of the performance tends to differ from one instrument to another; for instance, the acoustic piano sound on one instrument may sound richer than on another.

Since MIDI data is understood to be saved performance information, it can be changed or edited; for instance, one can correct an incorrectly played note or change the tempo of a song. This characteristic of MIDI data is valuable when one wants to create or practise with a MIDI instrument (discussed in the subsequent section). Additionally, when multiple MIDI instruments (possibly including computers) are connected with a cable, MIDI data can be transmitted and received between them. For instance, when one uses a keyboard as a MIDI instrument to record a performance (obtaining MIDI data), the recorded data can be transmitted to another instrument and played back on it. MIDI data can also be obtained elsewhere as computer files; favourite tunes or songs acquired online can be listened to by playing the MIDI data back on any supporting MIDI instrument. Figure 1 illustrates MIDI data and its transfer between MIDI instruments.

Figure 1: MIDI data and MIDI instrument transfers

MIDI data can be divided into two groups: channel messages/data and system messages/data (Fujishima 2008). Beginning with channel data, different messages are generated when one performs on a MIDI instrument; Table 1 summarizes this group.

Table 1: Groups of MIDI data (channel messages)
- Playing the keyboard: Note On/Off (when the key was played or released), Note Number (which key was played), Velocity (how hard the key was struck)
- Selecting a voice: Program Change
- Changing the volume, pressing the sustain pedal, and similar actions: Control Change
- Moving the pitch bend wheel: Pitch Bend
- Pressing a key down after the note has sounded: After Touch

On the other hand, Dixon (2001) argues that system data or messages are the data used by the MIDI system as a whole. While system data are varied, they include system exclusive messages for transferring data specific to each instrument's manufacturer, as well as messages for controlling MIDI devices. In conclusion, two types of devices are responsible for the generation of MIDI data: MIDI controllers and MIDI instruments. MIDI instruments (in some instances known as synthesizers) come in many types and shapes.

3.1. Parsing MIDI data

Viewed naively, the explanations above present MIDI data as a bitwise representation of a musical score in which different bit sequences indicate different musical events. Because of this, parsing (interpreting) MIDI data is challenging and manipulating it directly is cumbersome. It is for this reason that Dixon (2001) points out that a number of software libraries have been made available for programmatically parsing MIDI data and producing high-level descriptions of it. Dixon (2001) adds that the existing software represents MIDI data either at a lower level (corresponding directly to the bit-level representation) or, to some extent, at a higher level (as musical features). In practice, this can make what would otherwise be a simple manipulation or analysis require a great deal of source code and expertise. To place this argument in the context of the python-midi module, shifting the pitch of every note in a MIDI file up by two semitones may take only a few lines of code, as the sketch below illustrates. Constructing a piano-roll representation is different, however: it may take some hundreds of lines of code, since MIDI ticks must be converted to time in seconds, note-on events must be paired with note-offs, drum events must be ignored, and so on.
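The following is a minimal sketch of the few-line pitch shift described above. It uses the mido library purely as an assumed stand-in, since the exact python-midi calls are not given in the text; the file names are placeholders.

```python
# Minimal sketch: shift every note in a MIDI file up by two semitones.
# Uses the mido library as a stand-in for python-midi; 'input.mid' and
# 'shifted.mid' are placeholder file names.
import mido

SHIFT = 2  # semitones

mid = mido.MidiFile("input.mid")
for track in mid.tracks:
    for i, msg in enumerate(track):
        # Only note messages carry a pitch; channel 10 (index 9) is
        # reserved for drums in General MIDI and is left untouched.
        if msg.type in ("note_on", "note_off") and msg.channel != 9:
            track[i] = msg.copy(note=min(127, msg.note + SHIFT))
mid.save("shifted.mid")
```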
It is for the same reason that Sleator and Temperley (2002) created a Python module aimed at creating, manipulating, and analysing MIDI data. In practice, the module aims to make the most common operations applied to MIDI data straightforward. As Figure 2 below shows, the module represents MIDI data in a hierarchical manner, and the figure illustrates the module used for parsing, creating, and manipulating MIDI data.

Figure 2: MIDI data is represented as a list of instruments, each of which has its own 'piano roll'

The figure represents MIDI data as a hierarchy of classes. At the top of Figure 2 is a class comprising global information, such as tempo changes and the MIDI resolution; this top level also contains a number of instrument class instances. Every instrument is specified by a program number and a flag indicating whether or not it is a drum instrument. In terms of data I/O, the top-level class can be instantiated with a path to existing MIDI data, in which case the class is populated by parsing that data. The last aspect to explain about Figure 2 is the extraction of information. The classes represented above can perform analysis of the data they contain, some of which has corresponding functions in a given instrument class. While the module represented in Figure 2 provides functions for analysing and extracting information from MIDI data, some elements relevant to understanding MIDI data are still missing; analysis of that type exists in other software such as MATLAB MIDI or the Melisma Music Analyzer.
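To make the hierarchy described around Figure 2 concrete, the following is a minimal sketch of such a class layout. The names and fields are illustrative only and do not reproduce the actual module discussed above.

```python
# Minimal sketch of the hierarchy around Figure 2: a top-level container
# holding global information plus a list of instruments, each of which
# owns its own notes (its 'piano roll'). Names are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Note:
    pitch: int        # MIDI note number, 0-127
    velocity: int     # how hard the note was struck, 0-127
    start: float      # onset time in seconds
    end: float        # release time in seconds

@dataclass
class Instrument:
    program: int              # General MIDI program number
    is_drum: bool = False     # flag distinguishing drum instruments
    notes: List[Note] = field(default_factory=list)

@dataclass
class MidiData:
    resolution: int                                            # ticks per beat
    tempo_changes: List[float] = field(default_factory=list)   # tempo values in BPM
    instruments: List[Instrument] = field(default_factory=list)

    def note_count(self) -> int:
        """Example of an analysis helper operating across all instruments."""
        return sum(len(inst.notes) for inst in self.instruments)
```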
References

Dixon, S. (2001). Automatic extraction of tempo and beat from expressive performances. Journal of New Music Research, 30(1), 39-55.
Fujishima, T. (2008). Realtime chord recognition of musical sound: a system using Common Lisp Music. In Proceedings of the International Computer Music Conference, pp. 464-467.
McKay, C. and Fujinaga, I. (2006). jSymbolic: a feature extractor for MIDI files. In Proceedings of the International Computer Music Conference, pp. 302-305.
MIDI Manufacturers Association (2002). The Complete MIDI 1.0 Detailed Specification. www.midi.org.
Rumsey, F. (1994). MIDI Systems and Control. Focal Press. Chapter 2: Introduction to MIDI Control.
Rumsey, F. and McCormick, T. (2002). Sound and Recording: An Introduction. Focal Press. Chapter 13: MIDI.
Sleator, D. and Temperley, D. (2002). The Melisma Music Analyzer. http://www.link.cs.cmu.edu/music-analysis.