Making Music in the Browser

Austin Wulf is a wonderful human and developer. We’re lucky to learn from their efforts to merge music and tech.

One of my life-long hobbies is playing music. I play a variety of instruments, including guitar, bass, and piano. Recently, I've added JavaScript to my musical repertoire! You can too, even if you aren't musically inclined, thanks to an experimental browser API called the Web Audio API. Let's check it out!

The API is supported by most modern browsers (Chrome 34+, Firefox 25+, Safari 6+, Edge 12+). You'll find a global constructor, AudioContext (prefixed as webkitAudioContext in older versions of Chrome and Safari), that unlocks this functionality.

Oscillators

One of the main components of this API is the oscillator. If you've used a synthesizer before, you're likely already familiar with this concept. An oscillator produces a sound wave at a set frequency and pattern. This creates notes with different qualities and textures. Many oscillators, including the ones in the Web Audio API, support 4 different waveforms out of the box: sine, triangle, square, and sawtooth. Each has a unique quality that gives synthesizers their distinctive sound.

Let's set up an oscillator. We'll set the frequency to 440 and use a sine wave:

    const context = new AudioContext()
    const oscillator = context.createOscillator()

    oscillator.frequency.value = 440
    oscillator.type = 'sine'

This won't do much without somewhere to send the sound, of course. We can connect the oscillator to our computer's default audio output pretty simply:

    oscillator.connect(context.destination)

We can start and stop our oscillator using oscillator.start() and oscillator.stop(). Not so fast, though! Browsers won't let the audio actually play unless it's triggered by some user interaction first. This is meant to prevent any random website from making noise without the user's permission. We'll add a button to the page and trigger our oscillator to play for a second.

    document.querySelector('button').addEventListener('click', () => {
      oscillator.start(context.currentTime)
      setTimeout(() => oscillator.stop(context.currentTime), 1000)
    })

Clicking the button should produce a tone from your speakers for about a second. If you click the button a second time, though, you won't hear anything. Why not? You should see an error in your console saying that start can't be called more than once on an oscillator.

Ok, so now we have to deal with the fact that we can't start an oscillator multiple times. One way to handle this is to create a new oscillator every time we click the button. So, let's move that code into our event handler.

    const context = new AudioContext()
    
    document.querySelector('button').addEventListener('click', () => {
      const oscillator = context.createOscillator()
    
      oscillator.frequency.value = 440
      oscillator.type = 'sine'
      oscillator.connect(context.destination)
    
      oscillator.start(context.currentTime)
      setTimeout(() => oscillator.stop(context.currentTime), 500)
    })
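
One more note on autoplay policies: in some browsers, the AudioContext is created in a "suspended" state and only starts producing sound once it has been resumed from a user gesture. If you ever find your handler running but producing silence, a minimal fix (using the standard state and resume() members of AudioContext) is to resume the context at the top of the click handler:

    // at the top of the click handler
    if (context.state === 'suspended') {
      context.resume() // returns a Promise; the context starts running shortly after
    }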

Envelopes (Gain)

Great! Now our button plays a tone every time we click it. But it sounds a bit... robotic. Real instruments don't cut off their sound immediately; instead, they have a natural decrease in volume over time. This is called decay. So, how can we implement decay here? We need to control the volume! In audio synthesis, this is referred to as an envelope. Let's create a volume envelope for our oscillator.

    const context = new AudioContext()
    
    document.querySelector('button').addEventListener('click', () => {
      const oscillator = context.createOscillator()
      const envelope = context.createGain()
      const decayRate = 1.5 // seconds
    
      oscillator.frequency.value = 440
      oscillator.type = 'sine'
      envelope.gain.setValueAtTime(1, context.currentTime) // pin the starting volume so the ramp has a defined start point
    
      oscillator.connect(envelope)
      envelope.connect(context.destination)
    
      oscillator.start(context.currentTime)
      envelope.gain.exponentialRampToValueAtTime(0.001, context.currentTime + decayRate)
      setTimeout(() => oscillator.stop(context.currentTime), decayRate * 1000)
    })

You'll notice that we've made a change to some of our existing code here. Before, we connected the oscillator directly to the audio output. Now, we first connect it to the envelope, and then connect the envelope to the output. We've also added a call to exponentialRampToValueAtTime on the envelope's gain parameter. This method progressively adjusts the gain toward a target value over time. The time it expects is an absolute moment, in seconds, which we calculate from the currentTime of our AudioContext instance. We ramp to 0.001 rather than 0 because an exponential curve can never reach zero; the API rejects a target of exactly 0.
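
The same scheduling methods can shape the start of the note, too. Here's a minimal sketch of an attack-then-decay envelope; linearRampToValueAtTime is part of the standard AudioParam API, and attackTime is just an illustrative value:

    const attackTime = 0.05 // seconds; illustrative
    const now = context.currentTime
    
    // start nearly silent, fade in quickly, then let the note decay away
    envelope.gain.setValueAtTime(0.001, now)
    envelope.gain.linearRampToValueAtTime(1, now + attackTime)
    envelope.gain.exponentialRampToValueAtTime(0.001, now + attackTime + decayRate)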

Musicality

Now we've got something that sounds a bit more like an instrument than just a beeping noise. So, how do we get from here to something we can play like an instrument? First, we'll need a broader range of notes. Our button is currently playing at 440 Hz, which corresponds to the note A in the fourth octave (A4). If you're familiar with piano music, this is the A above middle C.
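
Where do these frequencies come from? In equal temperament, each semitone up multiplies the frequency by the twelfth root of two, so any note can be computed from its distance in semitones from A4. A quick sketch (frequencyOf and semitonesFromA4 are just illustrative names):

    // 440 Hz is A4; each semitone multiplies the frequency by 2^(1/12)
    const frequencyOf = semitonesFromA4 => 440 * Math.pow(2, semitonesFromA4 / 12)
    
    console.log(frequencyOf(0))  // 440     (A4)
    console.log(frequencyOf(3))  // ~523.25 (C5)
    console.log(frequencyOf(-9)) // ~261.63 (C4, middle C)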

Mapping note names to frequencies by hand is a bit tedious, so let's pull in a library to solve this problem for us. I recommend octavian. This library has a few different handy features, but importantly for us, it exposes a Note class that can translate notes across a range of octaves to their corresponding frequencies. You can use it like this:

    import {Note} from 'octavian'
    
    const {frequency} = new Note('A4')
    console.log(frequency) // => 440

Making It Playable

The next step is to bind a few keyboard keys to play different notes. We'll pull the contents of our previous click handler function into a new function, playNote, that takes a frequency argument. We'll use that argument to set the oscillator's frequency instead of hard-coding it to 440:

    function playNote(frequency) {
      /* ... */
      oscillator.frequency.value = frequency
      /* ... */
    }
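
Assembled from the pieces of our earlier click handler, the full function looks like this:

    function playNote(frequency) {
      const oscillator = context.createOscillator()
      const envelope = context.createGain()
      const decayRate = 1.5 // seconds
    
      oscillator.frequency.value = frequency
      oscillator.type = 'sine'
      envelope.gain.setValueAtTime(1, context.currentTime)
    
      oscillator.connect(envelope)
      envelope.connect(context.destination)
    
      oscillator.start(context.currentTime)
      envelope.gain.exponentialRampToValueAtTime(0.001, context.currentTime + decayRate)
      setTimeout(() => oscillator.stop(context.currentTime), decayRate * 1000)
    }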

Next, we'll set up a keypress event on the document to play a range of notes with this function depending on which key is pressed. For simplicity, let's map the home row to the octave of middle C (C4):

    const notes = {
      a: 'C4',
      s: 'D4',
      d: 'E4',
      f: 'F4',
      g: 'G4',
      h: 'A4',
      j: 'B4',
      k: 'C5'
    }
    
    document.addEventListener('keypress', ({key}) => {
      const note = notes[key]
      if (!note) return // ignore keys we haven't mapped
    
      const {frequency} = new Note(note)
      playNote(frequency)
    })
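
Two small caveats: keypress is considered deprecated these days, and holding a key down will auto-repeat and retrigger the note. If either bothers you, a sketch using keydown and the KeyboardEvent repeat flag handles both:

    document.addEventListener('keydown', event => {
      if (event.repeat) return // ignore auto-repeat while the key is held
    
      const note = notes[event.key]
      if (!note) return
    
      const {frequency} = new Note(note)
      playNote(frequency)
    })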

Et voilà: a playable toy keyboard, right here in the browser!

Conclusion

This is a simplified example of what's possible with the Web Audio API, and there's a lot I didn't cover here. This API gives us all the tools we need to build a fully functional synthesizer in JavaScript. It supports filters, noise generation, and even custom waveform creation. I highly recommend MDN's article on using the Web Audio API if you want to dig deeper.

Because it's all just JavaScript, this API opens up a lot of potential for creativity. What would it be like to combine Web Audio with WebSockets? What about adding a little randomization to your sounds? The possibilities are endless. I look forward to seeing what you create!

The contributors to JavaScript January are passionate engineers, designers and teachers. Emily Freeman is a developer advocate at Kickbox and curates the articles for JavaScript January.