A music synthesizer should produce nice periodic waveforms when a note is played. We should be able to see that regularity when we visualize the sound pressure with an oscilloscope, here demonstrated with the Yamaha DX7 emulator:
The Problem
We can see the regularity, but hold on. The waveform is jumping around, flickering left and right. It doesn't appear fixed in one spot. That makes it awfully hard to see how the waveform evolves as we hold a note down.
The basic problem is that the visualization update is not synchronized to the wave period. The waveform is drawn by taking a snapshot of audio data – say, 1024 samples – at successive instants. The snapshots are taken in this case by a Web Audio API analyser node, ideally 60 times per second. The position (phase) of the periodic wave will not appear aligned in successive snapshots (unless we happen to be playing a note like E5, which at 659.25 Hz is a near multiple of 60 Hz). Hmm!
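To see the drift concretely: at 60 snapshots per second, a wave at frequency f advances f / 60 cycles between frames, and any fractional part of that shows up as an apparent shift. The numbers below are just illustrative:

```js
var frameRate = 60;                 // ideal snapshot rate, frames per second
var f = 261.63;                     // middle C, in Hz
var cyclesPerFrame = f / frameRate; // ≈ 4.36 cycles elapse between snapshots
// The fractional 0.36 of a cycle means the wave appears to jump
// about 36% of a period on every frame.
```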
The Solution
We need two ingredients to really do this right.
- When we get some audio data to draw, we need to know the exact moment in time the data corresponds to. The Web Audio API provides this in the form of AudioContext.currentTime.
- We need to know the frequency of the note we're interested in drawing. Let's say whatever note was pressed last (see the sketch after this list).
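A sketch of both ingredients, assuming the synth reports the last note as a MIDI note number; the `frequencyFromNoteNumber` helper used later in the post presumably implements the standard equal-temperament mapping:

```js
// Standard MIDI-note-to-frequency mapping: A4 (note 69) = 440 Hz.
function frequencyFromNoteNumber(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Ingredient 1: the exact time the audio data corresponds to,
// from the Web Audio clock (seconds since the context started).
var audioCtx = new AudioContext();
var now = audioCtx.currentTime;

// Ingredient 2: the frequency of the note we want to lock onto.
var noteFrequency = frequencyFromNoteNumber(69); // A4, 440 Hz
```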
Every time we want to draw a frame of audio data, we divide the `sampleTime` by the wave period and call the remainder `sampleOffset`. The units are audio samples, running at 44100 samples per second.
Let's say we're drawing two successive frames of audio data. For these two frames, `sampleTime` might be 10000 and 10705. The note pressed down is the A above middle C at 440 Hz, generating a waveform that repeats every 44100 / 440 ≈ 100.2 samples. So we get a `sampleOffset` of 10000 % 100.2 = 80.2 and 10705 % 100.2 = 83.8. We need to draw the first frame shifted 80.2 samples to the left, and the second frame shifted 83.8 samples to the left. And so on.
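A quick sanity check of that arithmetic, using the rounded period from the example:

```js
var period = 100.2;          // ≈ 44100 / 440 samples per cycle

console.log(10000 % period); // ≈ 80.2
console.log(10705 % period); // ≈ 83.8

// Shifting each frame left by its own offset lands the waveform
// at the same screen position every time.
```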
Ah, much better! The little wobble at the end of this animation shows a pitch vibrato.
Here are the important parts in code. When we get a new note down, we update the periodicity for the visualizer:
```js
var noteFrequency = frequencyFromNoteNumber(synth.getLatestNoteDown());
visualizer.setPeriod(sampleRate / noteFrequency);
```
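The post doesn't show the visualizer class itself; a minimal hypothetical sketch of what `setPeriod` might store:

```js
// Hypothetical minimal visualizer: remember the wave period in samples.
function Visualizer() {
  this.period = 1; // safe default so `sampleTime % this.period` is defined
}

Visualizer.prototype.setPeriod = function (period) {
  this.period = period;
};
```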
and then in our draw loop, subtract the `sampleOffset` from the x-position:
```js
analyzerNode.getFloatTimeDomainData(data);
// Current time, converted from seconds to units of audio samples.
var sampleTime = sampleRate * analyzerNode.context.currentTime;
// Phase of the wave within its period, in samples.
var sampleOffset = sampleTime % this.period;
...
for (var i = 0, l = data.length; i < l; i++) {
  // Shift each point left by the phase offset so successive frames line up.
  var x = (i - sampleOffset) * WAVE_PIXELS_PER_SAMPLE;
  var y = data[i];
  graphics.lineTo(x, y);
  ...
}
```
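For a self-contained picture, here is a sketch wiring the pieces together. The canvas handling, pixel scaling, and default period are assumptions, not the post's actual code, and the analyser would still need a source connected to it (e.g. `synthOutput.connect(analyserNode)`); note that `getFloatTimeDomainData` wants a `Float32Array` sized to the analyser's `fftSize`:

```js
var audioCtx = new AudioContext();
var analyserNode = audioCtx.createAnalyser();
analyserNode.fftSize = 2048;                  // time-domain samples per snapshot
var data = new Float32Array(analyserNode.fftSize);
var sampleRate = audioCtx.sampleRate;         // typically 44100

var period = sampleRate / 440;                // assume A4 until a real note arrives
var WAVE_PIXELS_PER_SAMPLE = 1;

function draw(ctx2d) {                        // ctx2d: a CanvasRenderingContext2D
  analyserNode.getFloatTimeDomainData(data);
  var sampleTime = sampleRate * audioCtx.currentTime;
  var sampleOffset = sampleTime % period;

  ctx2d.clearRect(0, 0, ctx2d.canvas.width, ctx2d.canvas.height);
  ctx2d.beginPath();
  for (var i = 0; i < data.length; i++) {
    var x = (i - sampleOffset) * WAVE_PIXELS_PER_SAMPLE;
    var y = 100 - data[i] * 80;               // map [-1, 1] roughly onto canvas pixels
    ctx2d.lineTo(x, y);                       // first lineTo on an empty path acts as moveTo
  }
  ctx2d.stroke();
  requestAnimationFrame(function () { draw(ctx2d); });
}
```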
This doesn't work so well for polyphonic synthesis, since multiple notes with different wave periods are running all at once. It works nicely if you hold a high note and then play an octave or a fifth lower: you can see the consonance (and dissonance) in the waveform as you play various intervals.
Christopher Allen
/ May 5, 2020
Why not do it the way oscilloscopes do, and figure out where to start plotting by setting up a trigger on a rising or falling edge? Then you wouldn't need to know the frequency/wavelength.
Of course with more complicated (and ever-changing) waveforms it can be hard to choose a trigger condition which is reliably met only once per period; in that case you could use the wavelength to choose amongst possible triggers, which should provide rock-solid stability.
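For illustration, a sketch of the commenter's trigger idea (not the post's method): scan the frame for a rising zero-crossing and start drawing from that index.

```js
// Rising-edge trigger: find the first sample where the signal crosses
// zero going upward, so every frame starts at the same phase point
// without knowing the frequency.
function findTriggerIndex(data) {
  for (var i = 1; i < data.length; i++) {
    if (data[i - 1] < 0 && data[i] >= 0) {
      return i;
    }
  }
  return 0; // no crossing found: fall back to the start of the frame
}
```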
Matt
/ May 5, 2020
Good point! I implemented this without investigating the way scopes do it. But I have since discovered cool projects like https://github.com/maxim-zhao/SidWizPlus. People use it for old game music on YouTube: https://www.youtube.com/watch?v=7tUbo1j6qVU.
Mike Horn
/ October 5, 2021
Thanks! This worked wonderfully for me. I had to change this line

```js
var x = (i - sampleOffset) * WAVE_PIXELS_PER_SAMPLE;
```

to be

```js
var x = (i + sampleOffset) * WAVE_PIXELS_PER_SAMPLE;
```

but otherwise it was perfect.