ChordBank for Alexa
ChordBank for Alexa brings ChordBank into your living room. Tune your guitar, look up chords, or jam out to backing tracks, without taking your fingers off the strings.
Forget a chord while you’re practicing or learning something new? You can use ChordBank to learn the chord from across the room:

Want to practice scales or solos? Try playing along with a backing track, from rock to salsa to gospel or punk. Play in any key you’d like:

Guitar out of tune? ChordBank can help:

Just need a reference pitch? No problem:
ChordBank for Alexa. The perfect practice companion.
How it’s made
This was an interesting product to build, both because Alexa represents a completely new kind of interaction in a domain I’ve spent so much time in, and because it coincides with a project I’ve been working on for the past six months or so: pulling much of ChordBank’s functionality out of the iOS app that has helped millions of people learn guitar over the years, and into a series of portable tools that can be used on any platform.
Here is a complete walkthrough of what that means for one piece of ChordBank’s Alexa skill, and how the music theory that drives ChordBank gets from ChordBank’s toolbox into Alexa’s.
Giving Alexa the Answers
When a user asks Alexa, “How do I play D major?”, that’s a question that ChordBank for iOS and ChordBank.com, for example, already know how to answer:
So, how do we get that answer into Alexa?
And how do we get Alexa to actually play the chord so we can hear how it sounds?
Well, it starts with the command line and an in-house tool called `cbnk`, ChordBank’s custom CLI:

`cbnk` has a library of thousands of hand-curated guitar chords. It knows multiple fingerings for each type of chord in each key, it knows which fingers to use, and it knows the proper musical spelling for each note.

Ask `cbnk` for a chord, and it returns it in a custom format called `tabml`: human-readable tab markup language.
Here’s one entry for D major:
```````
xxo``` @0 D x,x,D,A,D,F#
------
``````
```1`2
````3`
``````
``````
``````
_
```````
You should see that, in this little piece of plain text, you have the placement of each finger, the starting fret, and the proper note spellings. (A rough parsing sketch below makes that structure explicit.)
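None of `cbnk`’s internals are public, so this is only a minimal sketch of how an entry in this shape could be read into a structure; the function, the field names, and the rows-are-frets / columns-are-strings reading are my assumptions, not part of `tabml`’s spec:

```python
from dataclasses import dataclass

@dataclass
class Chord:
    name: str          # chord name, e.g. "D"
    start_fret: int    # fret the grid starts at, from the "@0" marker
    notes: list[str]   # per-string spelling, e.g. ["x", "x", "D", "A", "D", "F#"]
    markers: str       # header per string: x = muted, o = open, ` = fretted
    grid: list[str]    # assumed: one row per fret, digits = finger numbers

def parse_tabml(entry: str) -> Chord:
    """Parse one tabml entry of the shape shown above (hypothetical reading)."""
    lines = entry.strip().splitlines()
    header, _nut, *rows = lines          # the "------" line separates header from grid
    if rows and rows[-1] == "_":
        rows.pop()                       # a lone "_" appears to end the entry
    markers, fret, name, spelling = header.split()
    return Chord(
        name=name,
        start_fret=int(fret.lstrip("@")),
        notes=spelling.split(","),
        markers=markers,
        grid=rows,
    )
```

Run on the D major entry above, that returns the chord name, a starting fret of 0, and the six per-string spellings.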
But a tabml entry is still just text. To get nice, clear audio for each chord, we need to turn it into an mp3 file that Alexa can play. For that, we use another in-house tool called `cbmidi`.
Turning txt into music
`cbmidi` works with `cbnk` to get a chord, and then inflates a MIDI template with that chord. The MIDI template defines the strum pattern, and `cbnk` supplies the pitches for each string. Here it is in action:
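Since `cbmidi` isn’t public either, here’s a minimal sketch of the idea: writing the D major pitches into a MIDI file with the mido library. The note numbers, velocity, and strum timing are all my assumptions:

```python
import mido

# MIDI note numbers for the sounded strings of open D major (D, A, D, F#).
PITCHES = [50, 57, 62, 66]  # D3, A3, D4, F#4 (assumed octaves)

mid = mido.MidiFile()
track = mido.MidiTrack()
mid.tracks.append(track)

# A simple downward "strum": stagger each string's attack by a few ticks.
for i, note in enumerate(PITCHES):
    track.append(mido.Message("note_on", note=note, velocity=80,
                              time=0 if i == 0 else 30))

# Let the chord ring for two beats (960 ticks at the default 480/beat),
# then release every string together.
for i, note in enumerate(PITCHES):
    track.append(mido.Message("note_off", note=note, velocity=0,
                              time=960 if i == 0 else 0))

mid.save("d-major.mid")
```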
`cbmidi` produces a bunch of `.mid` files:
Which, unfortunately, Alexa couldn’t care less about.
Alexa has two ways to process sound: you can use the AudioPlayer API, which essentially exits your skill, or you can use SSML tags to play snippets of low-fi mp3 audio inline with other speech.
For teaching users how to play chords, these snippets were the way to go.
And that’s where things get a little weird: to generate mp3s, I used GarageBand to create tracks from the MIDI files. But with twelve tracks per type of chord, bouncing each one to disk by hand is… tedious.

Enter: AppleScript.
Old-school automation
Yes. AppleScript. That little ghost that takes control of your computer by clumsy syntactical magic.
Here’s a small taste of the craziness:
```applescript
on exportTrack(idx, filename)
    get_window()
    checkAllMuteBoxes()
    tell application "System Events"
        tell process "GarageBand"
            set _exportInstrumentsGroup to group idx of group 1 of scroll area 1 of splitter group 2 of splitter group 1 of group 3 of _window
            click _exportInstrumentsGroup
            delay 0.1
            set _muteBox to checkbox 1 of _exportInstrumentsGroup
            set checkboxStatus to value of _muteBox as boolean
            if checkboxStatus is true then click _muteBox
        end tell
    end tell
    bounceToDisk(filename)
end exportTrack
```
Essentially, what’s going on here:
- Each key is arranged as its own track, from C down to B.
- Using AppleScript, our computer helper checks all the mute boxes.
- Then, one by one, we uncheck the mute box of the track we’re interested in.
- Then, we bounce to disk.
Done live, AppleScript becomes dizzying and beautiful. Here’s what it looks like:
One more processing step, and we’re left with a folder full of nice, readable `mp3` files for Alexa:
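That final step isn’t shown here, but as a sketch, a batch re-encode along these lines would do it; the paths are illustrative, and the target format, a stereo MP3 at 48 kbps and a 24000 Hz sample rate, is what Alexa’s `<audio>` tag expects:

```python
import pathlib
import subprocess

SRC = pathlib.Path("bounces")     # GarageBand output (illustrative path)
DST = pathlib.Path("alexa-mp3")   # where the Alexa-ready files go
DST.mkdir(exist_ok=True)

for src in sorted(SRC.glob("*.mp3")):
    # Re-encode to the constrained format Alexa's <audio> tag accepts
    # (check the Alexa docs for the current limits).
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src),
         "-ac", "2", "-codec:a", "libmp3lame",
         "-b:a", "48k", "-ar", "24000",
         str(DST / src.name)],
        check=True,
    )
```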
Bringing it all together
So you can ask Alexa, “How do you play C major?” and she’ll very helpfully reply:
```xml
<speak>
  <say-as interpret-as="spell-out">C</say-as> major <audio src="...c-pluck.mp3" />
  <s>Start by placing your first finger</s><s>on the second string</s>
  <s>at the first fret...</s>
  ...
</speak>
```
And she’ll even place a diagram with the instructions in your Alexa app, for easy reference:
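Pulling the pieces together in the skill looks roughly like this; this is a sketch, not ChordBank’s actual handler, and the helper name and URL are made up:

```python
def chord_ssml(letter: str, audio_url: str, steps: list[str]) -> str:
    """Assemble the spoken response for one chord (hypothetical helper)."""
    sentences = "".join(f"<s>{step}</s>" for step in steps)
    return (
        "<speak>"
        f'<say-as interpret-as="spell-out">{letter}</say-as> major '
        f'<audio src="{audio_url}" />'
        f"{sentences}"
        "</speak>"
    )

ssml = chord_ssml(
    "C",
    "https://example.com/audio/c-pluck.mp3",  # illustrative URL
    ["Start by placing your first finger",
     "on the second string",
     "at the first fret"],
)
```

The diagram in the Alexa app rides along in the same response, sent as a card alongside the speech.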
What’s next
What’s next for ChordBank for Alexa?
That really depends on the players who use it.
It’s been exciting to explore this new medium of human-computer interaction, and I think that this app brings the best of ChordBank to your living room.
The real test will be: how useful is the skill to players and students in their everyday lives?
And how can ChordBank continue to grow with them to be even more essential than it is today?
It’s still early days, but I can’t wait to find out.