Song a Day: “Part of Your World” (Alan Menken and Howard Ashman)

The songs from Walt Disney Pictures’ 1989 release, The Little Mermaid, are as good as any you’ll hear in a movie. The lyrics are effective, the tunes are singable and melodic, and the orchestration is appropriately dramatic. Among these songs, “Part of Your World” stands out for its impact, meaning, and appropriateness to the story. While many factors contribute to the excellence of this song, we’re going to look at it largely from a single perspective: how Howard Ashman’s lyrics give words the power to reinforce emotion and meaning.

“Part of Your World” is really two songs in one. The introduction, the part that comes before “I wanna be where the people are,” is extremely long. This section consists of two verses followed by a bridge, which leads into the main, or second, song. If this first section were all there was to the song, not only would it be very disappointing, but we probably would not think much of Ariel as a person. In this section Ariel brags about her possessions, yet declares that it isn’t enough; she comes across as a teenager who longs to go to the mall to get more stuff.

The materialistic nature of this section is established by words that describe and embody possessions: stuff, collection, everything, trove, treasures, wonders, gadgets, gizmos, whozits, whatzits, and thingamabobs. Notice that as the list goes on, we start to see alliterative pairs: trove & treasures, gadgets & gizmos, and whozits & whatzits. The way this first song ends after the bridge works effectively, as a bridge traditionally ends on the dominant, leading back into the verse or chorus, or in this case the main body of the song. The last word of the section is more, which is both held and crescendoed for emphasis. Initially, we may be tempted to interpret more as meaning more stuff, a reading suggested in part by the rhyming of galore with more. However, the main body of the song that follows clarifies that Ariel wants more than material items.

The main body of the song is filled with words related to mobility: dancin’, jumpin’, strollin’, walk, and run. On a general level, we can interpret this to mean that Ariel wants to be active. That is, she wants to do things. On a metaphoric level, we can frame this mobility in terms of upward mobility. That is, she wants to go places and advance herself. Both of these interpretations, especially the latter, are reinforced by the lyric “flippin’ your fins you don’t get too far.” The alliterative Fs, combined with the dismissive use of the term “flippin’,” indicate that life under the sea is too inactive for Ariel and offers her little opportunity for advancement or self-betterment.

Word choice in the lyrics further contrasts the world on land with undersea living. The lyrics use the words sun, warm, and burn to contrast with the implied coldness of the undersea world. This descriptive warmth also reads as emotional warmth, an implication further suggested through rhyme. The lyric “warm on the sand” has several internal rhymes with the line “betcha on land, they understand, that they don’t reprimand their daughters.” While this lyric oozes teenaged drama, it also suggests that Ariel interprets her father’s stern nature as coldness.

What we hear as the first verse is really, structurally, two verses. We find this out retroactively when we encounter the final verse (either that, or the last verse is a half verse). Each individual verse contains no rhyming; rather, the rhymes occur between the two verses. Thus, dancin’ in the first verse is rhymed with dancin’ in the second verse, while feet at the end of the first verse rhymes with street at the end of the second.

In the chorus we get much more rhyming. Run rhymes with sun, while free rhymes with be. The first pair connects mobility with emotional warmth (happiness). The second pair connects freedom with the nature of being.

The bridge of the song is longer than either the verse or the chorus, and thus does a bit more than its share of the storytelling. It commences by bringing up the idea of sacrifice, which will become central to the character’s narrative (“what would I give if I could live out of these waters?”). It is the very end of the bridge that truly establishes Ariel as a character interested in self-betterment. “Bright young women, sick of swimmin’, ready to stand,” is certainly a self-description from our protagonist. Here stand, which rhymes with the aforementioned sand, land, understand, and reprimand, serves a dual purpose: stand is both literal and figurative. If Ariel gets legs, she will literally be able to stand up, but more importantly we understand her as meaning that she will figuratively be able to stand up for herself, take a stand, and stand with pride.

In the final verse, Ariel communicates a desire for knowledge through questions and answers. She also again invokes warmth imagery through the words fire and burn. We understand that in a literal sense fire and burning are foreign concepts to someone who lives under water, and that these ideas would likely be fascinating to Ariel. However, we realize that fire and burn are being used as a metaphor for passion. Taken in this context, I argue that burn is second only to stand among the important words in this song.

Given that there is no rhyming within the verse, how will the final, isolated verse deal with a lack of rhyming? The solution the songwriters developed was to use rhymes to lead into and out of the final verse, linking the verse to the bridge and final chorus. The final word of the bridge, stand, rhymes with the first word of the third verse (and). The final word of the third verse, burn, rhymes with the first half of the first line of the chorus (when’s it my turn?).

The final chorus demonstrates Ariel’s intellectual curiosity through the word explore, while sneaking in a rhyme with shore. In rhyming love with above we have a linkage between her passion and her fascination with the world on land. The use of rhyme and alliteration as tools for highlighting meaningful words in this song is masterful, and is well worth any aspiring songwriter’s attention.

Announcement: Landscapes

Hello All: I’m pleased to announce that I have received a professional development grant to work on the multichannel recordings for Landscapes. I suspected that I would be awarded this grant, so I worked on the project a bit between applying and being informed of the decision. In that time I recorded a bass part for Landscape 1: Forest and Landscape 4: Sand Dunes, as well as a synthesizer part for Landscape 7: Mountains. I used my Moog Mother 32 for the synthesizer recording. I hope to start on a bass recording for Landscape 7 soon. As in previous years, I will post a monthly update to keep y’all apprised of my progress. I will leave you with an updated recording of Landscape 7: Mountains.

Pure Data: Seventh Chord Stingers

In the previous post we looked at a random arpeggiator that uses diatonic chord progressions. In this entry we will use the same technique for creating diatonic chord progressions, but apply it to create block seventh chords that repeat in a sequencer-like fashion. Again we use the same code from the scale sequencer to translate tempo from beats per minute to time per beat (expressed in milliseconds). As mentioned, we use the same table, ; triads 0 0 4 7 11 2 5 9, from the previous post to denote the notes of C Major, arranged as stacked thirds.

As we did in the previous patch, we can make a list of index numbers that relate to the triads table, which can be used to define the roots of a chord progression. In this case we are using the table ; progression 0 4 0 3 6 2. This results in the progression Dm7, CMaj7, Bm7(b5), Am7, G7. The other new element of this patch is a rhythmic pattern. This is accomplished using the table rhythm, where 1 indicates a chord and 0 indicates a rest. The table includes 16 numbers, representing a single measure of sixteenth notes. The resulting rhythm starts out syncopated, with the first three chord jabs occurring once every three sixteenth notes (a dotted eighth note). The final two chords occur on the off beats of beats three and four, yielding a pleasantly funky rhythm.

We use the rhythm table in a very simple manner. We mod the counter by 16, resulting in a sixteenth-note cycle that repeats every measure. We then read the rhythm table. Multiplying that number, which will be a zero or a one, by 120 gives us a velocity. A velocity of zero results in makenote not generating a note, while the chord stabs will be reasonably loud at 120.
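The gating arithmetic can be sketched in Python. Note that the 16-entry pattern below is my reconstruction from the description above (jabs on steps 0, 3, and 6, then the off beats of beats three and four), not the patch’s literal table:

```python
# Hypothetical 16-step pattern reconstructed from the prose description.
rhythm = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0]

def velocity_for_step(counter):
    """Mod the running counter by 16, then scale the 0/1 entry to a velocity."""
    step = counter % 16          # the % 16 object
    return rhythm[step] * 120    # 0 = rest, 120 = chord stab
```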

The number from tabread rhythm is then also passed to a sel statement. Remember that this number will only be a zero or a one. Thus, by using sel 0 1 and only using the outlet for 1, we pass a bang to the rest of the algorithm only when a chord is supposed to occur. We then have a counter for the current chord; modding it by 5 gives us an index for reading the progression table.

The output of tabread progression then in turn feeds four similar parallel algorithms that generate the specific notes of the given chord. These four algorithms are laid out left to right, and correspond to the root, third, fifth, and seventh of the given chord. In the case of the root, the output of tabread progression is used directly as the input to tabread triads, which yields the root of the chord. This is then added to one of two random octave offsets, 36 or 48, which places the note in the bass clef.

The other three notes add a number to the output of tabread progression. These numbers, one, two, and three, correspond to the third, fifth, and seventh of the chord. Modding that sum by seven wraps any index that goes beyond the length of the table back to the beginning. The output of those expr statements then feeds tabread triads, yielding specific pitches. These pitches are added to one of three random octave offsets, 60, 72, or 84, to get random voicings. All four outputs of the expr statements, which give the transposed MIDI note numbers of the root, third, fifth, and seventh, are fed to makenote, which creates the chord when it is fed a velocity of 120. The output of this patch sounds like this . . .
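Put together, the four parallel calculations can be sketched in Python (this is a sketch of the patch’s arithmetic, not Pure Data itself; random.choice stands in for the random-octave objects):

```python
import random

triads = [0, 4, 7, 11, 2, 5, 9]   # C major in stacked thirds: C E G B D F A
progression = [4, 0, 3, 6, 2]     # Dm7, CMaj7, Bm7(b5), Am7, G7

def seventh_chord(chord_counter):
    """Return MIDI note numbers for the root, third, fifth, and seventh."""
    root_index = progression[chord_counter % 5]
    notes = []
    for offset in range(4):                        # 0 = root, 1 = 3rd, 2 = 5th, 3 = 7th
        pitch = triads[(root_index + offset) % 7]  # wrap past the table with mod 7
        if offset == 0:
            pitch += random.choice([36, 48])       # root lands in the bass clef
        else:
            pitch += random.choice([60, 72, 84])   # random voicing for upper tones
        notes.append(pitch)
    return notes
```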

Pure Data: Chord Arpeggiator

In the previous post, we had an introduction to patches in Pure Data using a patch that plays a scale in quarter, eighth, and sixteenth notes in three different octaves. In this post we’ll be looking at a way to generate diatonic triads using a chord progression. Again, these patches are intended to teach concepts of music theory along with concepts of music technology.

Some portions of this patch are similar to portions of the previous patch, so we’ll give them only a brief mention. For instance, the portion (in the upper left) that translates tempo (in this case 104 beats per minute) into time per beat (expressed in milliseconds) is essentially the same. Here it is multiplied by .25 to yield a constant sixteenth-note rhythm. Likewise, the portion of the patch that actually makes the notes and outputs them to MIDI (middle left) is essentially the same.

We have previously introduced loadbang and tables. Here we use tables to define diatonic triads in C Major. Using C as zero, triads 0 0 4 7 11 2 5 9 presents the notes of C major in stacked thirds (C, E, G, B, D, F, A respectively). If we pull out three consecutive numbers from this table, we will get a root, third, and fifth of a triad. We can wrap the table around to the beginning using modular mathematics (in this case mod seven) to yield thirds and fifths of the A chord, as well as the fifth of the F chord.
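As a quick illustration, here is that wrap-around lookup in Python (a sketch of the table arithmetic, not a Pure Data patch):

```python
triads = [0, 4, 7, 11, 2, 5, 9]   # C, E, G, B, D, F, A as half steps above C

def triad(root_index):
    """Pull three consecutive entries (root, third, fifth), wrapping with mod 7."""
    return [triads[(root_index + i) % 7] for i in range(3)]
```

Index 6 (the A chord) wraps for both its third and fifth, and index 5 (the F chord) wraps for its fifth, just as described above.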

We can define a diatonic chord progression by noting the table position of the root of each chord in the triads table. Accordingly, progression 0 0 2 6 5 gives us the roots C, G, A, and F. Given the layout of Major and minor thirds within a Major scale, this gives us the specific harmonies C Major, G Major, A minor, and F Major. Fans of popular music will recognize this progression from numerous songs, including “Don’t Stop Believin’,” “Can You Feel the Love Tonight,” and “Country Roads.”

The metronome used in the patch ticks off sixteenth-note increments, so the % 64 object beneath the counter reduces the counter to a four-measure sequence (four groups of 16 sixteenth notes adds up to 64 notes). This number is used in the object div 16, which yields the whole-number (non-fractional) portion of the number divided by 16. This will result in the values 0, 1, 2, or 3, which is essentially the current measure number within the progression. Feeding this to tabread progression will give the index value for the root of that measure’s chord, to be used with the triads table.
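The counter arithmetic can be sketched in Python:

```python
progression = [0, 2, 6, 5]    # triads-table indices for C, G, A, and F

def measure_root(counter):
    """Reduce the sixteenth-note counter to a measure number, then look up the root."""
    step = counter % 64       # four measures of 16 sixteenth notes
    measure = step // 16      # the div 16 object: 0, 1, 2, or 3
    return progression[measure]
```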

The value from the tabread progression object is sent to the right inlet of the expr statement in the segment above. A random number between 0 and 2 inclusive is fed to the left inlet of the same expr statement. The random number represents whether the note created will be a root (0), a third (1), or a fifth (2). By adding these two values together we get the index of the specific pitch in the triads table. Using mod 7 (% 7 in the expr statement) ensures that if we go beyond the end of the triads table we wrap around to the beginning of the table. This index is then passed to tabread triads, which returns the numeric value of the specified note.

Note that in the previous segment a second outlet from the number object beneath the expr statement is sent to a bang. This activates the random code in the segment above, namely the selection of a random number between 0 and 2 inclusive. This number is passed to a sel statement, specifically sel 0 1 2. This object activates one of the three leftmost outlets, depending upon whether it is passed a 0, 1, or 2 (respectively, left to right). The rightmost outlet activates if anything besides 0, 1, or 2 is encountered. In this case we pass the three leftmost outlets to three messages: 60, 72, and 84. These numbers are three different octaves of C (middle C, C5, and C6 respectively). Those messages are fed to a number object, which in turn is fed to the right inlet of an expr statement. The left inlet of this expr statement comes from the output of tabread triads. Thus, in expr ($f1+$f2) the pitch is added to one of three octaves, yielding a random arpeggiation across three octaves of pitch space. Let’s listen to the results of this patch below.
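The whole selection step can be sketched in Python (a sketch of the patch logic; note that the same random number picks both the chord tone and its octave):

```python
import random

triads = [0, 4, 7, 11, 2, 5, 9]   # C major in stacked thirds

def arpeggiate(root_index):
    """Pick a random chord tone and place it in the matching octave."""
    tone = random.randrange(3)              # 0 = root, 1 = third, 2 = fifth
    octave = [60, 72, 84][tone]             # sel 0 1 2 routes to 60, 72, or 84
    pitch = triads[(root_index + tone) % 7] # the expr with % 7
    return pitch + octave                   # expr ($f1+$f2)
```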

Pure Data: Scale Sequencer

Inspired by the book Learning Music Theory With Logic, Max, And Finale by Geoffrey Kidde, I have decided to revise my curriculum in my entry-level theory course. However, rather than use Max, I’ve opted to teach Pure Data, due to its low price ($0). Pure Data is just different enough from Max that you can’t really use teaching materials for the two programs interchangeably. Thus, teaching Pure Data is forcing me to learn it, which is something I’ve wanted to do for quite a while. I hope to put up occasional posts that share Pure Data patches I have developed for my teaching.

The first of these is a patch that plays major scales in three different octaves, at three different speeds.

Let’s look at this patch in a little detail. For those who are new to Pure Data, loadbang is used to run part of your patch when that patch is loaded. This loadbang routine sets up a table called scale, and defines the scale. Note that I’m using numbers of half steps to define a major scale (0 2 4 5 7 9 11 12). Notice as well that there is seemingly an extra 0 at the beginning. However, that first 0 indicates where in the table you begin loading material, so if we were to write this as text, we’d say begin at position 0 (the start of the table), and load in 0, 2, 4, 5, 7, 9, 11, and 12. This table data is included in a message object, and starts with a semicolon followed by a return character.  We would change the data in this message to change the mode or type of scale desired. If you want to update the patch after adding or changing information in the message that defines the scale, all you have to do is click on the message object when not in edit mode.

The following segment of the patch translates a tempo, measured in beats per minute to a time per beat measured in milliseconds. The equation expr (60/$f1)*1000 casts the number in the inlet (120) to a float. It divides 60 by that number, resulting in half a second. Multiplying that by 1000 translates that time per beat to milliseconds.
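That conversion is simple enough to verify by hand; in Python:

```python
def beat_ms(bpm):
    """expr (60/$f1)*1000: seconds per beat, scaled to milliseconds."""
    return (60 / bpm) * 1000
```

At 120 beats per minute this yields 500 milliseconds per beat, the half-second mentioned above.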

Directly beneath this segment there are three segments that instantiate metronomes at the quarter-, eighth-, and sixteenth-note levels respectively. The quarter-note metronome is passed the outlet of the time per beat. For the eighth notes, that same output is halved using expr (.5*$f1). Likewise, the sixteenth-note durations result from multiplying by .25, using expr (.25*$f1). In each case, a duration is also sent (dur1, dur2, dur3).

Below the metronomes are counters. The top two objects are very commonly used in Pure Data. The object on the left creates and stores a floating point number (a number with a decimal). To the right, we have an object that increments that number by adding one. This is accomplished by feeding the outlet of the float to the inlet of “+ 1”, and feeding the outlet of “+ 1” into the right inlet sets a new value for the float.

Since a scale has eight notes, any number higher than this is fairly useless for generating a scale. Thus, the outlet of the float also feeds “% 8”. The percentage sign means mod (modular arithmetic). Technically speaking, the number fed to “% 8” is divided by eight, and the remainder of that division (the number left over in whole-number division) is sent to the outlet. This will result in a number between zero and seven.

This number is then used to generate both a pitch and a velocity. It is used as an index to select a value out of the scale table, which is then added to a base pitch to determine the octave register. The quarter notes use note number 36 as the base pitch. Since middle C (C4) in MIDI (Musical Instrument Digital Interface) is 60, 36 is two octaves beneath middle C, otherwise known as C2, or Cello C. The eighth notes use middle C (60) as their base, and the sixteenth notes use two octaves above middle C (C6, or 84) as theirs. The pitch is then sent via the send command (s for short) using the variables note1, note2, and note3 for the quarters, eighths, and sixteenths respectively.
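The pitch calculation can be sketched in Python:

```python
scale = [0, 2, 4, 5, 7, 9, 11, 12]   # major scale in half steps

def pitch(counter, base):
    """Read the scale table with the wrapped counter, then add the base pitch.
    base is 36 for the quarters, 60 for the eighths, and 84 for the sixteenths."""
    return scale[counter % 8] + base
```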

Key velocity in MIDI is a measurement of how quickly a key is pressed down. Traditionally it is used to indicate how loud a note is. That is, a key that is pressed quickly will be louder than a key that is depressed slowly. MIDI is largely a seven-bit system, so velocity values run between zero (which can also be used to turn a note off) and 127 (the loudest a note can be played). The equation expr (($f1*10)+50) results in the notes getting louder as the pitch of the scale goes up. For instance, when the index is zero, the velocity will be 50, and when the index is seven, the velocity will be 120.
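The velocity ramp, written out in Python:

```python
def velocity(index):
    """expr (($f1*10)+50): the scale gets louder as it ascends."""
    return (index * 10) + 50
```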

These values, the notes (note1, note2, note3) and the velocities (vel1, vel2, vel3), are then sent to the output stage. The object makenote receives in its inlets (left to right) MIDI note number, velocity (0-127), and duration (in milliseconds). The outlets of makenote then feed the two leftmost inlets of noteout. The rightmost inlet of noteout receives a MIDI channel. The original specifications for MIDI, which was released in 1983, allow for 16 MIDI channels. Here, the notes are being sent on the first channel. Three different instances of makenote are being used here to allow different velocities and durations to be happening simultaneously.

You may check out this patch in action below, using a piano sampler from Apple’s Logic Pro to realize the sound.

Landscape Update: December 12th, 2020

I’m a bit ahead of where I was last month at this time, but I could still be better positioned. In November I managed to revise the third phrase of the pedal steel part for Landscape 11: Farmland. Yesterday I recorded the synthesizer part for Landscape 1: Forest using my Moog Mother 32. This month I will leave you with an update of Landscape 10: Rocky Coast, which includes more orchestral samples, some trombone chords that I played, and a musique concrete part.

Landscape Update: November 15th, 2020

Well, I’m a little bit behind where I’d like to be. Last weekend I finally finished some revisions of Landscape 10: Rocky Coast. I added more string, horn, and trombone samples, as well as a musique concrete part for one of the phrases. The trombone samples were actually played by me. I will likely share that recording next month.
For this month I will be sharing some revisions I made to Landscape 4: Sand Dunes in August, where I added more orchestral samples and took away some of the musique concrete in the process.

Landscape Update: October 4th, 2020

I reached my goal for September by revising the piano part for the third phrase. I wrote an orchestral part for Landscape 13: River, which still needs a bit of work on the articulations before I’ll call it finished. I hope to book a recording session for it sometime before the end of 2020. Finally, I recorded a synthesizer part for Landscape 3: Pond, which I include here.

Landscape Update: September 6th, 2020

I met my goal for August with plenty of time to spare. I revised the pedal steel part for the eighth phrase of Landscape 8: Palm Glade. I was also able to mix the audio files from my Musiversal reading on July 16th. During the editing and mixdown process, I also added more orchestral samples to Landscape 4: Sand Dunes, taking away some of the musique concrete in the process. I leave you with the current realization of Landscape 10: Rocky Coast, which contains the orchestral recordings I mixed this past month.