Band in Your Hand: De-blackboxing GarageBand

Abstract

GarageBand is a music application for Apple devices such as the iPhone and iPad. It offers a library of sound effects and can be used to create songs with multiple audio tracks. In this paper, I discuss the design principles embodied in GarageBand, such as modularity, affordance, and constraint. I also examine whether GarageBand fulfills Alan Kay’s vision of the meta-medium.

1. Introduction

GarageBand is a music application for OS X and iOS. It enables users to create multi-track songs with a wide range of pre-made virtual instruments such as keyboards and guitars, and its library of sound effects contains thousands of loops. It can even serve as a DJ machine. Projects created in GarageBand can be exported in many formats such as WAV. It provides amateur musicians with powerful tools for playing and composing music.

For this paper, I focus on the iOS version of GarageBand for iPad. I will examine how design principles apply to GarageBand’s user interface design and programs of action. I will also demonstrate some of these principles by re-creating Daft Punk’s song Give Life Back to Music in GarageBand. Here is the video I made for this song.

Give Life Back to Music, re-created by Jieshu Wang with GarageBand, originally by Daft Punk.

Here is the link to the GarageBand file that you can download and import into your GarageBand app.

2. Modularity

Modularity is a method by which designers divide systems into subsystems in order to manage complexity. Each module hides its own complexity inside and interacts with other modules through interfaces. A module can in turn be divided into sub-modules. In this way, the overall system complexity is reduced[1].

With years of development and updates, GarageBand has become more and more complex. However, as a user, I have never found it complicated to use. That’s because its designers apply the principle of modularity very well. Improvements in one module do not influence other modules, so users don’t need to change their existing habits much to adapt to new functions. Here I will examine modularity in GarageBand to see how it helps improve the user experience and manage system complexity.

2.1. Modules in GarageBand

2.1.1. Sections and tracks

The basic function of GarageBand is to create your own music. The songs or projects you create do not affect one another unless you import one project into another. So each song can be seen as a module. This is the topmost level of modularity for users.


Each project is a module.

Inside a project, there are two dimensions of modules, like the two coordinate axes of an XY plane: the vertical axis is for audio tracks, while the horizontal axis is for sections (time).

Sections and tracks in GarageBand as modules. Video/Jieshu Wang

The first dimension of modularity is audio tracks. Within one project, you can add up to 32 audio tracks, more than enough for most amateur musicians. Each track serves as a module that hides its complexity—its timbre, chords, loops, melodies, and other properties. When you are editing one track, you can play back the sound of the other tracks in order to synchronize your beats without affecting them.

The second dimension of modularity is song sections. Each section is made up of several bars; the default is eight bars per section, but you can easily increase or decrease the number as you wish. A song can consist of any number of sections, and each section is a module that can hold up to 32 tracks. Within one section, each audio track can easily be moved, trimmed, looped, cut, and copied, and your actions in one section have no impact on other sections—except adding or deleting tracks, which automatically adds a blank track with the same instrument to, or deletes the same track from, the other sections. To edit another section, you can press any area in the current section and drag it left or right to move to the next or previous section.

Dividing a song into sections has another advantage. A song normally lasts several minutes, and given the size constraint of the iPad touchscreen, squeezing the whole song into the limited width of the screen would make each bar extremely short—too small to recognize. Any small variation in the sound wave would be very hard to locate. A user would have to zoom in many times to find a specific bar, then zoom out and zoom in again to reach another bar. Sections resolve this problem perfectly, providing users with a navigation system like longitude and latitude. Only three numbers are needed to locate a specific bar in a song: the ordinal numbers of the section, the track, and the bar within the section. If GarageBand had no sections, or just one section for the entire song, it would be very difficult to locate one bar among hundreds if not thousands of bars in a single interface.

In general, if you create a song with five sections, and each section has eight bars and four tracks, then you get 5 × 4 = 20 modules that you can edit separately. For Give Life Back to Music, I created 7 sections and 21 tracks, 147 modules in total. Although the modules can be edited independently, they combine organically. When you finish your project, you can export the whole song with the tracks perfectly mixed together and the sections seamlessly connected one by one. If there’s a mistake, or a sound effect you’d like to add or change, all you have to do is find the right module and modify it accordingly.
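This section-and-track grid can be sketched as a simple data model. The Python below is my own illustration of the idea, not GarageBand’s internal design; the class and method names are hypothetical:

```python
# A toy model of GarageBand's two-dimensional modularity:
# a project is a grid of (section, track) cells, each an independent module.

class Project:
    def __init__(self, n_sections, n_tracks, bars_per_section=8):
        self.bars_per_section = bars_per_section
        # each (section, track) cell holds its own bars and can be
        # edited without touching any other cell
        self.cells = {(s, t): [None] * bars_per_section
                      for s in range(n_sections)
                      for t in range(n_tracks)}

    def module_count(self):
        return len(self.cells)

    def locate(self, section, track, bar):
        """Address a single bar with just three ordinal numbers."""
        return self.cells[(section, track)][bar]

    def edit(self, section, track, bar, content):
        # editing one module never affects the others
        self.cells[(section, track)][bar] = content

song = Project(n_sections=5, n_tracks=4)
assert song.module_count() == 5 * 4  # the 20 modules mentioned above
```

The point of the sketch is the `locate` method: three small numbers are enough to navigate to any bar, which is exactly the longitude-and-latitude effect described above.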

2.1.2. Modularity of sound effects

As discussed above, a song project in GarageBand is divided into modules according to time and tracks. Inside each module, GarageBand provides a large number of sound-effect options. These sound effects are divided into two main modules—Tracks and Live Loops.

In short, Tracks are mainly sound effects that imitate real instruments such as pianos and guitars, while Live Loops are pre-edited loops, each of which consists of multiple tracks in a particular genre or style such as EDM and Hip Hop.

Two modules of sound effects that you can add to audio tracks: Live Loops and Tracks

2.1.2.1. Live Loops

Both modules (Live Loops & Tracks) have many sub-modules organized by instrument or genre. In the Live Loops module, there are eleven pre-edited loop modules in different styles—EDM, Hip Hop, Dubstep, RnB, House, Chill, Rock, Electro Funk, Beat Masher, Chinese Traditional, and Chinese Modern. In each module, there are even smaller sub-modules. For example, the EDM module’s default setting includes eleven mixed tracks with nine pre-edited loops each—11 × 9 = 99 editable modules in total.

The EDM Live Loop has 99 pre-edited modules. Users can add more as they wish.

The basic loop units all come from the 1,638 so-called Apple Loops stored in GarageBand. Users can choose from these 1,638 loops to mix their own loops, as well as import other audio files as loops. 1,638 is a large number—how can we find a loop that fulfills our need? For convenience, the designers labeled the loop units with three types of properties—instruments, genres, and descriptions—forming a three-dimensional selection network. In this way, they programmed the user’s action of selecting loops into three modules. For example, if I’d like to use two or three bars of a country-music loop played on guitars that would relax my audience, I would choose the keyword “Relaxed” under descriptions, “Country” under genres, and “Guitars” under instruments. That leaves seven items in the list, labeled with names like “Cheerful Mandolin”, “Down Home Dobro”, and “Front Porch Dobro”—exactly what I need.
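This three-keyword selection amounts to filtering on three independent tag dimensions. Below is a minimal Python sketch of the idea; the data structure is my own assumption, and the third entry (“Dream Trance Beat”) is an invented example, while the first two names come from the search described above:

```python
# Each loop carries one tag per dimension; a search keeps only the
# loops that match every chosen keyword (an intersection of tag sets).
loops = [
    {"name": "Cheerful Mandolin", "instrument": "Guitars",
     "genre": "Country", "description": "Relaxed"},
    {"name": "Down Home Dobro", "instrument": "Guitars",
     "genre": "Country", "description": "Relaxed"},
    # hypothetical entry, invented for illustration:
    {"name": "Dream Trance Beat", "instrument": "Beats",
     "genre": "Electronic", "description": "Dark"},
]

def find_loops(catalog, **criteria):
    """Return names of loops matching every given keyword."""
    return [entry["name"] for entry in catalog
            if all(entry.get(key) == value
                   for key, value in criteria.items())]

hits = find_loops(loops, description="Relaxed",
                  genre="Country", instrument="Guitars")
assert hits == ["Cheerful Mandolin", "Down Home Dobro"]
```

Because each dimension narrows the list independently, a user can apply the three keywords in any order and reach the same small set of candidates.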


The 1,638 Apple Loops are categorized along three dimensions: 16 instruments, 14 genres, and 18 descriptions.

2.1.2.2. Tracks

In the Tracks module, there are thirteen options or sub-modules:

  • Keyboard: Play an on-screen keyboard with piano, organ, and synth sounds.
    • There are seven types of timbre for users to choose from—keyboards, classics, bass, leads, pads, FX, and other—133 timbres in total that you can play on a virtual keyboard on the touchscreen.
    • Depending on the timbre, there are many sound properties you can play with. For example, for a timbre called “Deep House Bass”, you can modify filter attack, cutoff, resonance, filter decay, and pitch.
  • Drums: Tap on drums to create a beat. There are eight drum kits and eight drum machines.
  • Amp: Plug in your guitar and play through classic amps and stompboxes. Basically, it’s a virtual guitar amplifier and effects unit. There are four categories (clean, crunchy, distorted, and processed) of guitar amps—32 guitar amps and 16 bass amps altogether.
  • Audio Recorder: Record your voice or any sound. There are nine effects you can choose, such as large room and robot.
  • Sampler: Record a sound, then play it with the onscreen music keyboard.
  • Smart Drums: Place drums on a grid to create beats.
  • Erhu: Tap and slide on strings to bow a traditional Chinese violin.
  • Smart Strings: Tap to play orchestral or solo string parts.
  • Smart Bass: Tap strings to play bass lines and grooves.
  • Smart Keyboard: Tap chords to create keyboard grooves.
  • Pipa: Tap the string to pluck a traditional Chinese lute.
  • Smart Guitar: Strum an onscreen guitar to play chords, notes, or grooves. There are four styles (acoustic, classic clean, hard rock, and roots rock).
  • Drummer: Create grooves and beats using a virtual session drummer.

In general, the options for one track are shown in the image below.


credit: Jieshu Wang

2.1.3. Modularity of action

Under this modular organization, the user’s actions in creating a song are also divided into modules. Users divide a song into several sections and edit each track in each section separately. For example, a song of 96 bars can be divided into 12 sections of 8 bars each. Say it is a simple pop song with 5 tracks—drums, two guitars, bass, and vocals. There are 12 × 5 = 60 modules in total that can be edited separately, so the user can divide her work into 60 sub-actions. First she edits section A—first the drum module, then the two guitar tracks, then the bass track, and then the vocal track. While she is editing the bass track of section A, she can play back the three tracks (drums and two guitars) she has already edited in order to synchronize the beats. This playback function is an interface between modules of action. Similarly, when she is recording her vocal for the fifth track of section A, she wears earphones to listen to the instrumental accompaniment of the first four tracks, in order to follow the beat and tune of the existing modules. If she needs backing vocals, she can add an additional vocal track and sing the harmony all by herself.

There are also interfaces between different sections. For example, many pop songs use conventional chord progressions such as I-VI-IV-V. In this case, users can simply copy and paste the repeating tracks into new sections. In addition, the drum part doesn’t vary much over the course of a song, so users can copy and paste earlier drum tracks into later sections, or just loop them to fill the whole song. In Give Life Back to Music, I copied and pasted many tracks, such as the drum and keyboard tracks, in order to save time.

This modularity of action simplifies the process of creating songs, making it easy for amateur musicians to manage the complexity of the music.

2.2. GarageBand as a module for other systems

The music industry is a complex sociotechnical system. A lot of technologies, organizations, individuals, commercial companies, and academic institutions are involved in this global system. GarageBand is a part of it, serving as a module for many larger systems.

GarageBand is a module of the iLife software suite, which contains iMovie, iPhoto, iWeb, and other media software. Each of these applications has its own functions and purposes. For example, GarageBand is music software, iPhoto is used to edit images, and iWeb is a website creation tool. Meanwhile, they interact with one another through interfaces. For example, song projects created in GarageBand can be imported into iMovie to serve as background music for videos, which in turn can be imported into iWeb as part of a web page. For my video of Give Life Back to Music, I exported the song from GarageBand into iMovie.


GarageBand projects can be imported into iMovie

Moreover, as part of Apple’s ecosystem of software and hardware, GarageBand projects can be transferred very easily between Apple devices through AirDrop, a feature that uses Bluetooth and Wi-Fi, as shown below.


A GarageBand project created on an iPad, transmitted to a MacBook using AirDrop. It can be edited further using the OS X version of GarageBand or other software such as Logic Pro.

In addition, GarageBand can interact with systems outside the Apple ecosystem through interfaces. For example, Voice Synth is a virtual vocoder for iPad. Since GarageBand has no vocoder function, users who want one have to turn to third-party applications such as Voice Synth, as shown in the upper panel of the image below. Here is the interface between GarageBand and Voice Synth: I used the “Robot” effect in Voice Synth to record myself singing “let the music come tonight, we are gonna use it; let the music come tonight, give life back to music”, exported it as a WAV file, uploaded the audio file to my iCloud Drive, and imported it into an audio track in GarageBand, where I could edit it further and mix it with the other tracks. With third-party applications as modules, GarageBand doesn’t need to build its own vocoder module, which might cost a lot of money, and users who don’t want to distort their voice don’t need to install Voice Synth. The interfaces involved here include protocols shared by the audio-processing community, such as audio formats, cloud storage, and data-transmission methods. Conversely, GarageBand projects can also be exported into other apps such as Logic Pro for further manipulation, partly because they share the same audio engine.

The interaction between GarageBand and Voice Synth.

GarageBand can also be used as a module in a hardware system. With an audio interface such as the Apogee Jam, users can use GarageBand as a virtual amp for guitars and basses.

3. Affordance

Affordance is a “property in which the physical characteristics of an object or environment influence its function”[1]. As Donald A. Norman noted in his book The Design of Everyday Things[2], affordances provide us with clues about how things “could possibly be used”. The design of GarageBand’s user interfaces demonstrates this principle, too.

Many of the virtual instruments’ interfaces imitate the interfaces of real instruments. For example, there is a virtual keyboard in the keyboard module. This imitation follows people’s existing mental model, so users know how to play the keyboard at first glance.

Interfaces of some keyboards

Icons on the interface follow people’s existing mental models, too. For example, the green triangle means “play”, while the red dot means “record.” The virtual wheels and rotary knobs afford rotating, and the black and white keys that imitate a piano afford pressing. When you press a key, its hue darkens, imitating the shadow of a real key and indicating that you are “pressing down” the key.


The shadow of the key that users are pressing.

The drum interfaces also imitate real drums. There are several virtual drumheads that afford striking. Tapping different areas of the same drumhead can produce different sound effects, just as on real drums. For example, tapping the center of the biggest drum in the Chinese drum kit produces a deep hit, while tapping the rim sounds like a clear knock. Moreover, the harder you press the touchscreen, the louder the sound. Different gestures produce different effects, too. For example, in the Chinese drum kit, if you drag your finger around the rim of the biggest drum, it sounds like a stick sweeping across a rough surface—a “rattle” sound.


Interfaces of some drums

However, in the Smart Drums function, things are different. There are no virtual drums in the interface, but an 8 × 8 matrix whose two dimensions are “Simple-Complex” and “Quiet-Loud”. There is no such thing as a “drum matrix” in real life, but users know how to use it as soon as they see the interface: there are 64 squares in the matrix, and drum components of a similar size are arranged to its right, as if waiting to be dragged in. So the components afford dragging. There is a dice icon on the lower left. Physical dice afford rolling, so the perceived affordance of the icon is rolling to get a random result. Indeed, when you tap the dice icon, it “rolls” in place, and the drum components randomly “roll” into the matrix, forming a random beat pattern within a metric framework according to your tempo and time signature.


Interface of smart drum

Another example of affordance is the guitar interface. There is a switch icon at the upper right of the screen labeled “chords” and “notes,” which you can tap to toggle between chords mode and notes mode. The notes mode imitates the interface of a real guitar, with six strings that you can tap to play or drag to bend the pitch slightly. The interface differs from a real guitar, though: a real player uses the left hand to hold chords and the right hand to pluck or strum the strings, but in GarageBand you only see the left part of the neck. Even so, it is very easy for a guitar player to realize how to play the virtual strings—by tapping a string between the frets, which afford tapping.

The chords mode imitates nothing in the real world, but it provides a perceived affordance of tapping as well. As you can see in the gif below, there is a rotary knob at the upper center labeled “Autoplay”, with which you can choose from four pre-made chord progressions or turn autoplay off. There are eight vertical bars, each with a chord name determined by your key. For example, in the key of C major, the eight bars are labeled Em, Am, Dm, G, C, F, Bb, and Bdim—all common chords in C major. If autoplay is off, the six strings remain on the screen, affording tapping. If you tap the chord label at the top of a vertical bar, all six strings are “played” at once, imitating a strum. If you tap individual strings in the bar labeled Em, each plays the sound of the corresponding string as if your left hand were holding the Em chord. If you turn autoplay on, all you have to do is tap the chord name, and GarageBand plays a pre-made chord progression.

Interface of the Hard Rock guitar in GarageBand

Other instruments like Smart Strings, Pipa, Erhu, and Smart Bass also have many well-designed affordances.

In short, the designers of GarageBand are very good at using affordance. They imitate real instruments and use icons, switches, and rotary knobs to integrate many complex functions into a limited screen space.


The interface for amps is full of virtual rotary knobs.

However, many of these designs are not unique to GarageBand. There are a lot of music applications that imitate guitars and pianos, and many of them use interfaces similar to GarageBand’s. But few apps combine keyboards with guitars in one app, and most don’t offer GarageBand’s flexibility. Some professional apps such as Logic Pro provide a massive library of sound effects and enormous freedom to manipulate music, but they usually cost a lot of money and storage space. Logic Pro X is powerful but costs $199 on OS X, and there is no iOS version. By contrast, GarageBand cost me just ¥30 (approximately $5) five years ago, and now it’s free for all iPad users!

4. Constraints

The iOS version of GarageBand has many constraints.

First of all, the app size is limited by the maximum size for iOS apps—4GB. The limit was set by Apple and was increased from 2GB to 4GB in 2015[3]. The current iOS version of GarageBand is 1.28GB, which leaves users enough space to store their projects.

Second, GarageBand’s interface area is restricted by the physical size of the iPad touchscreen. The most common iPad sizes are 7.9-inch (iPad Mini) at 2048 × 1536 resolution, 9.7-inch at 2048 × 1536, and 12.9-inch at 2732 × 2048. An iPad is bigger than a cell phone but smaller than a laptop, so it needs a different design. Everything must fit on the touchscreen—that is one reason sections were designed. If we had a screen two meters wide, perhaps we could work without sections.

Furthermore, many musical instruments are physically long, such as the piano, guitar, and erhu. A common piano has 88 keys, and a common guitar has 18 frets. How can they fit on a small screen? The designers of GarageBand had many good ideas. For the keyboards, for example, the default view spans two octaves, from C2 to C4; you can scroll the keyboard left or right to play lower or higher pitches, covering 10 octaves in all. There is also a double-row mode with which you can play four octaves on the screen, as shown below.

The double-row keyboard mode shows four octaves at once.
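The scrolling keyboard can be thought of as a two-octave window sliding over the full pitch range. The sketch below is my own illustration using standard MIDI note numbering, not GarageBand’s actual implementation:

```python
# Map on-screen keys to pitches: a scrollable window over the full range.

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_note(name, octave):
    """Standard MIDI convention: C-1 = note 0, so C4 (middle C) = 60."""
    return (octave + 1) * 12 + NOTES.index(name)

def visible_keys(lowest_octave, octaves_shown=2):
    """MIDI notes shown in the current scroll window, low to high."""
    start = midi_note("C", lowest_octave)
    # include the top C that closes the last octave, as a keyboard does
    return list(range(start, start + 12 * octaves_shown + 1))

# the default window described above, C2 up to C4
window = visible_keys(2)
assert window[0] == midi_note("C", 2) == 36
assert window[-1] == midi_note("C", 4) == 60
assert len(window) == 25  # two octaves of 12 keys plus the top C
```

Scrolling the keyboard simply changes `lowest_octave`, and the double-row mode is equivalent to showing two such windows stacked vertically.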

The third constraint is that the gestures used in GarageBand are limited by the capabilities of the touchscreen. Today’s iPad multi-touch screen is very powerful: it can sense how hard your fingers strike and respond accordingly. For example, the harder you tap the virtual drums in GarageBand, the louder they sound. But GarageBand cannot respond to finger pressure below the lower limit or above the upper limit of what the touchscreen can recognize. Besides simple tapping, it also supports other gestures, such as dragging. The designers of GarageBand have to choose gestures that are available on the iPad; otherwise, the gestures will fail. Other versions of GarageBand have their own constraints depending on their platforms. For example, the OS X version of GarageBand doesn’t support multi-touch gestures because the MacBook doesn’t have a touchscreen, but it has a much bigger sound library because the MacBook’s processing capacity is more powerful than the iPad’s.

5. Does GarageBand fulfill Alan Kay’s vision?

Alan Kay envisioned a universal media machine with which people could remediate all kinds of media and create their own media with unlimited freedom[4]. Does GarageBand fulfill his vision? I don’t think so.

First of all, GarageBand doesn’t provide a flexible enough programming environment—in fact, it doesn’t provide any programming environment at all. It gives us a library of sound effects and pre-made loops, but it’s not easy to create your own, and it doesn’t allow you to edit the properties of sound. For example, if I want to edit my voice, there are only nine effects to choose from; I can’t modify the acoustic characteristics as I wish. It’s like a coloring book with pre-printed line drawings: you can fill them with colors, but you don’t really “create” the art, nor does it improve your creativity. You are restricted by the line drawings. It produces an illusion of “creativity.” Most of the time, when we talk about “creating” music in GarageBand, we are just remixing pre-existing sound effects stored in GarageBand in pre-made ways. The same goes for my “re-creation” of Give Life Back to Music—there is nothing creative in it; all the creativity came from Daft Punk.

Second, GarageBand cannot be used to edit media other than music. It has nothing to do with videos, texts, paintings, and so on. It is not a meta-medium.

However, I think GarageBand to some degree democratizes music. For example, I never succeeded in playing the F chord on guitar, but I can play it in GarageBand. I can’t sing harmony with myself, but I can record harmonies on different tracks in GarageBand and play them together as if I were singing with myself. I don’t know how to write a song, but when I re-create other people’s songs in GarageBand, I can learn about arrangement and composition by decomposing them.

6. Conclusion

GarageBand is a music application with which amateur musicians can create songs on Apple devices. In this paper, I discussed design principles in the iPad version of GarageBand, such as modularity, affordance, and constraint. In particular, I argued that GarageBand doesn’t fulfill Alan Kay’s vision of the meta-medium, but it does simplify the process of creating music for amateur musicians.


References

[1] Lidwell, William, Kritina Holden, and Jill Butler. Universal Principles of Design. Gloucester, Mass: Rockport, 2003.

[2] Norman, Donald. The Design of Everyday Things. Basic Books, 2002. http://proquestcombo.safaribooksonline.com.proxy.library.georgetown.edu/9780465003945.

[3] Kumparak, Greg. “iOS Apps Can Now Be Twice As Big.” TechCrunch. Accessed December 18, 2016. http://social.techcrunch.com/2015/02/12/ios-app-size-limit/.

[4] Manovich, Lev. Software Takes Command. International Texts in Critical Media Aesthetics, volume 5. New York; London: Bloomsbury, 2013.
