MIDI is short for Musical Instrument Digital Interface, and it allows electronic instruments and computers to communicate with one another. Information is carried between devices by a MIDI cable (a 5-pin DIN connector), and a single connection can carry data for up to 16 separate channels. A MIDI signal carries what are called "event messages" - these specify several things, including which note should be played (its pitch), velocity (how hard the note should be played) and note length. So that all connected instruments respond in the same way, General MIDI was created; this is a standard specification for any device that responds to MIDI signals.
MIDI messages, like all digital data, are sent as binary bits (either 0 or 1). Each message is built from bytes, of which there are two types: status bytes always begin with a 1, and data bytes with a 0, leaving the other 7 bits to carry the message itself. Every MIDI message begins with a status byte; the first three of its remaining 7 bits identify the type of message, while the other 4 bits indicate which channel number the data applies to.
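As a rough illustration of that bit layout (my own sketch in Python, not part of the coursework), a Note On message can be assembled from one status byte and two data bytes:

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Return the three bytes of a MIDI Note On message."""
    # Status byte: a leading 1, then 3 bits for the message type
    # (Note On = 001), then 4 bits for the channel (0-15).
    status = 0b1001_0000 | (channel & 0x0F)
    # Data bytes always start with 0, leaving 7 bits (values 0-127).
    return bytes([status, note & 0x7F, velocity & 0x7F])

print(note_on(0, 60, 64).hex())   # middle C on channel 1 -> '903c40'
```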
There are several types of MIDI connection. Whilst some MIDI interfaces (such as the one I chose to use) use USB, others use the standard 5-pin MIDI connectors. MIDI In and MIDI Out ports are the most common; these respectively receive data from another device and send data to another device. MIDI Thru is less common; this connection copies whatever arrives at the MIDI In port and passes it on to another device, allowing several devices to be chained together and controlled at once.
We are able to manage MIDI data in our DAW (digital audio workstation) in several ways. In Cubase, there are two methods. The first is the key editor, allowing us to select which keys to play using a virtual 'keyboard'; the second is the list editor, which displays each MIDI event as an entry in a list.
Within a DAW, there are also several types of MIDI controller message. These are event messages that affect a selection of notes in some way, and are known as 'continuous controllers'. In Cubase there are many to choose from; one of them, for example, affects the overall volume of the track.
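As a small illustrative sketch (mine, not from the coursework): continuous controller messages use the same byte layout as notes, and controller number 7 is the standard channel volume controller, so a volume change can be written out like this:

```python
def control_change(channel: int, controller: int, value: int) -> bytes:
    """Return the three bytes of a MIDI Control Change message."""
    # Control Change messages use type bits 011 (status bytes 0xB0-0xBF).
    status = 0b1011_0000 | (channel & 0x0F)
    return bytes([status, controller & 0x7F, value & 0x7F])

# Controller 7 is channel volume; set channel 1 to roughly half level.
print(control_change(0, 7, 64).hex())   # -> 'b00740'
```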
MIDI files are generally saved in the .mid format. However, there are two variants of this format: MIDI Type 0 merges the information for all of the tracks into a single track, whilst MIDI Type 1 keeps separate information for each track.
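The format type is stored in the file's MThd header chunk, so it can be checked directly; here is a minimal sketch (the file name is just a hypothetical example):

```python
import struct

def midi_format(path: str) -> int:
    """Read the format word (0, 1 or 2) from a .mid file's MThd header chunk."""
    with open(path, "rb") as f:
        chunk_id, length, fmt, ntrks, division = struct.unpack(">4sLHHH", f.read(14))
    if chunk_id != b"MThd":
        raise ValueError("not a standard MIDI file")
    return fmt

# print(midi_format("rather_be.mid"))   # 0 = merged single track, 1 = one track per part
```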
Information on Computer Systems
To sequence in MIDI, a DAW (digital audio workstation) is required. A DAW is essentially a replacement for a conventional mixing desk and analogue tape for today's recording engineers and producers. Just as there are several different mixing boards, each with its own 'kinks' and workflow, there are many DAWs available for use; the main difference between them is how well their workflow suits different situations.
Here is a list of several DAWs:
- Steinberg Cubase - this is the DAW we are using (version 6). It is equally suitable for both recording audio and sequencing MIDI files.
- Ableton Live - this DAW originally started as a performance instrument for electronic musicians and DJs to be able to sequence samples and backing tracks live, but has since evolved to become just as good for home recording, within its 'Arrangement' view. In comparison, Ableton Live's 'Session' view is perfect for live users, as it is able to fit onto a laptop screen perfectly.
- Avid Pro Tools - this DAW is considered to be the gold standard for professional recording software, and is used in many professional studios worldwide. Traditionally, Pro Tools' strength is within mixing/mastering audio files, although it still comes with a large array of virtual instruments to use.
- Image Line FL Studio - this DAW initially started out as 'Fruity Loops' but was forced to change its name due to legal action from a certain famous breakfast cereal company. FL Studio has become extremely popular among electronic composers for its easy workflow and high-quality piano roll.
- Apple Logic Pro X - since Apple purchased this DAW, it has become highly regarded among Mac users as an extremely useful composition and recording tool, both for MIDI and audio. The latest version of Logic Pro X has been praised for its drum kit builder and wide variety of Audio Unit plugins.
Whilst DAWs can run on both Windows and Mac OS X, I sequenced my song on an iMac running Mac OS X Lion 10.7.5. The iMac is one of Apple's more powerful desktop computers, so it has more than enough processing power and memory to handle a DAW.
Samplers and Keyboards
Throughout the 1980s, samplers became common. These were devices that allowed the user to play back sounds from instruments (both live and digitally recreated) from a sampling library. Devices of this type include the Synclavier (1978) and the Fairlight CMI (1979); the Fairlight is extremely well known for its drum sounds, which have been used on many hit pop songs.
As well as pop, the Fairlight was also used on many hit rock records, notably Def Leppard's Hysteria. Producer Robert John "Mutt" Lange employed the Fairlight to aid with the recording of Rick Allen's drum tracks, Allen having lost his arm in a car accident.
These days, hardware samplers are no longer required. Software samplers, such as Groove Agent One (included within Cubase), allow samples to be assigned to separate MIDI notes; these can then be played back via a MIDI interface or the key/list editor within a DAW.
In recent years, the dedicated hardware synthesiser has largely been replaced by MIDI interfaces (such as the M-Audio Keystudio 49i) paired with virtual software instruments. MIDI messages are used to trigger notes within the software.
Synthesisers themselves have also changed in modern times. One of the first compact synths was the Minimoog (developed by Robert Moog and released in 1970); it quickly became one of the most popular musical devices of its time, notably being used by Gary Numan.
Even earlier, keyboards had been invented that played back saved sounds rather than generating them. An example of this is the Mellotron, a device that was able to play back pre-recorded sounds from tapes inside it. The Beatles and Genesis adopted the device throughout the 60s and 70s, and, despite being outdated, it was still used in the 80s by Orchestral Manoeuvres in the Dark.
Even so, some will still prefer to use analogue synthesisers. An example of this is the MiniBrute, made by Arturia.
This generates sound using three processes (a rough code sketch of these steps follows the list):
- Generating a wave (whether this is a sine, triangle, sawtooth or square)
- Running this wave through an ADSR envelope (Attack, Decay, Sustain, Release), which shapes the sound using those four parameters
- Using an LFO (Low-Frequency Oscillator) to create a sweep
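As a rough sketch of those three steps (my own illustration in Python/NumPy, and not how the MiniBrute itself is built), a sawtooth oscillator can be shaped by an ADSR envelope and swept by an LFO:

```python
import numpy as np

SR = 44100                      # sample rate in Hz
t = np.arange(SR) / SR          # one second of time values

# 1. Generate a wave: a 110 Hz sawtooth.
saw = 2.0 * ((110 * t) % 1.0) - 1.0

# 2. Shape it with an ADSR envelope (attack/decay/release in seconds, sustain as a level).
def adsr(n, attack=0.05, decay=0.1, sustain=0.6, release=0.2, sr=SR):
    a, d, r = int(attack * sr), int(decay * sr), int(release * sr)
    return np.concatenate([
        np.linspace(0, 1, a),            # attack: ramp up to full level
        np.linspace(1, sustain, d),      # decay: fall to the sustain level
        np.full(n - a - d - r, sustain), # sustain: hold
        np.linspace(sustain, 0, r),      # release: fade out
    ])

# 3. Use an LFO to create a sweep - here a slow 2 Hz wobble of the amplitude.
lfo = 0.5 * (1 + np.sin(2 * np.pi * 2 * t))

signal = saw * adsr(len(t)) * lfo        # one second of shaped, swept sawtooth
```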
Diary
Lesson 1 - 17/11/2015
In today's lesson, I was introduced to the unit via the assignment brief, and given a score of the song (Clean Bandit's Rather Be) that I was to sequence. After setting up a blank project in the DAW (Cubase), I quickly created two new instrument tracks; both used the HALion VST synth plugin on a violin setting.
To create a new track, I used the project menu drop down, as shown:
In order to test whether the sound was suitable, I used a MIDI interface (an M-Audio Keystation 49i) to play random notes, tweaking the settings to taste.
(A VST plugin will take live audio and/or MIDI data and simulate real equipment. In this case, the plugin simulated a violin.)
Using the pencil tool in Cubase, I then set to work reading the bars where strings are included (bars 1-7) and 'drew' the notes into Cubase. After this was complete, I repeated the process for the notes in the bass clef. Where notes were marked staccato (shown by a dot above the note), I reduced the note length; for example, if the note was a quarter note, I reduced its length by 75%.
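Reading "reduced by 75%" as shortening the note to a quarter of its written length, the arithmetic in MIDI ticks works out like this (a quick sketch of my own, assuming a resolution of 480 ticks per quarter note):

```python
TICKS_PER_QUARTER = 480   # assumed MIDI resolution (PPQ)

def staccato(length_ticks: int, reduction: float = 0.75) -> int:
    """Shorten a note by the given fraction of its written length."""
    return int(length_ticks * (1 - reduction))

print(staccato(TICKS_PER_QUARTER))   # quarter note: 480 ticks -> 120 ticks
```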
Next, I created a new synth track, designed to play notes from bar 9 onwards. As the tempo changes at this point from 115bpm to 121bpm, I edited the tempo track in Cubase, keeping the slower tempo for bars 1-7 then increasing it from bar 8 onwards. For this instrument, I used a Sci-Fi synth sound, completely removing any reverb to accentuate the staccato in the score.
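As an aside (my own note, not from the lesson), a MIDI tempo event actually stores microseconds per quarter note rather than bpm, so the two tempi convert as follows:

```python
def bpm_to_tempo(bpm: float) -> int:
    """Convert beats per minute to the microseconds-per-quarter-note value used by MIDI tempo events."""
    return round(60_000_000 / bpm)

print(bpm_to_tempo(115))   # 521739 microseconds per quarter note
print(bpm_to_tempo(121))   # 495868 microseconds per quarter note
```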
Lesson 2 - 24/11/2015
In today's lesson, I continued with my work on the synth track, adding in the lower register. In addition to this, I also added the vocal track for the bars I had done so far; I chose to use an alto saxophone, as I felt it was one of the only instruments that actually sounded like a real voice.
In addition to this, I was introduced to the Groove Agent One sampler. This allowed me to trigger drum samples from Cubase's MediaBay by playing the corresponding keyboard note. This week, I programmed in a kick and hi-hat sound, then wrote those parts out on the drum track.
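Conceptually, the sampler simply maps MIDI note numbers to audio files; here is a toy sketch of that idea (the file names are made up, and the note numbers come from the General MIDI drum map):

```python
# Hypothetical drum map: MIDI note number -> sample file.
drum_map = {
    36: "kick.wav",          # 36 is Bass Drum 1 in the General MIDI drum map
    42: "closed_hihat.wav",  # 42 is Closed Hi-Hat
}

def on_note_on(note: int, velocity: int) -> None:
    """Pretend to trigger the sample assigned to an incoming Note On."""
    sample = drum_map.get(note)
    if sample and velocity > 0:
        print(f"trigger {sample} at velocity {velocity}")

on_note_on(36, 100)   # -> trigger kick.wav at velocity 100
```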
In order to program using the Groove Agent, I loaded up the relevant control panel, then dragged in the samples that I wished to use from the MediaBay:
Lesson 3 - 01/12/2015
In today's lesson, I also researched MIDI and computer systems. I found a brief history of both MIDI and DAWs; my information can be found at the start of this blog.
Lesson 4 - 05/01/2016
In today's lesson, I extended all of the other tracks up to the first chorus. In addition, I also set up the remaining blank tracks ready for next lesson. This included the background vocals, piano solo and percussion.
To add to this, I also added extra screenshots. These mostly showed how to set up a blank project, create the first instrument tracks and set them up with either HALion or Groove Agent.
Lesson 5 & 6 - 12/01/2016 and 19/01/2016
In both of these lessons, I focused on completely finishing all parts until the end of the first chorus.
As well as this, I gave the track a basic mix - firstly I added compression to the drums in order to tame their peaks and keep their level under control. I then gave the strings, piano and synths a small amount of reverb, and finally added an EQ to both the lead and backing vocals. To ensure both vocal tracks could be heard over the top of the instrumentation, I gave them both a large boost in the mid frequencies.
Lesson 7 - 02/02/2016
In this lesson, I chose to replace both sequenced vocal tracks with live vocals. I recorded three different vocal tracks:
- The lead vocal, which lasts the length of the recording
- A second lead vocal, used in the chorus as a doubletrack
- Backing vocals, used in the chorus
Ultimately, I chose not to use the chorus double-track part, as it suffered from some very slight clipping in the higher frequencies. I then gave the track a mix and exported it using Cubase's Audio Mixdown feature.
Here is a link to my final mixdown:
Below is a track list of my final mixdown, along with what effects were used:
| Track No. | Name/Purpose of Track | Effects Used |
| --- | --- | --- |
| 1 | Lead Vocals (Panned Center) | EQ with a boost in the mid frequencies |
| 2 | Backing Vocals (Panned Left 17) | EQ with a boost in the mid frequencies |
| 3 | Strings – treble clef (Panned Left 11) | A small amount of reverb |
| 4 | Strings – bass clef (Panned Right 15) | A small amount of reverb |
| 5 | Synth – treble clef (Panned Center) | A small amount of reverb |
| 6 | Synth – bass clef (Panned Center) | A small amount of reverb |
| 7 | Piano solo (Panned Center) | A small amount of reverb |
| 8 | Piano – treble clef (Panned Center) | A small amount of reverb |
| 9 | Piano – bass clef (Panned Center) | A small amount of reverb |
| 10 | Bass (Panned Right 28) | |
| 11 | Drums (Panned Left 17) | Compression |
| 12 | Percussion (Panned Center) | |
| Stereo Out | Stereo Mixdown | |