P5.js post: Wrap all the code blocks in figure shortcodes and fix the railroad diagrams

Eryn Wells 2022-10-21 11:07:45 -07:00
parent 23fc76b1f5
commit ff14784468


@@ -23,11 +23,13 @@ visualizations. By the end, we'll have something like this:
HTML has the ability to [embed audio][mdn-audio-tag] in a page with the
`<audio>` tag. This one declares a single MP3 file as a source.

+{{< figures/code >}}
```html
<audio id="amen">
    <source src="amen.mp3" type="audio/mpeg">
</audio>
```
+{{< /figures/code >}}

In this form, the `<audio>` element doesn't do anything except declare some
audio that can be played. It's invisible and the user can't interact with it or
@@ -47,6 +49,7 @@ destinations could be your computer's speakers or a file.
Here's the entire code snippet that sets up the audio processing I need for the
sketch:

+{{< figures/code >}}
```js {linenostart=2}
let analyzerNode = null;
let samples = null;
@@ -67,6 +70,7 @@ let audioContext = (() => {
    return audioContext;
})();
```
+{{< /figures/code >}}

The [`AudioContext`][mdn-audio-context] is the object that encapsulates the
entire node graph. On line 10, I create a new `AudioContext`.
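
For readers who haven't used the Web Audio API before, a minimal sketch of this
kind of node-graph setup looks roughly like the following. The element lookup
and the `sourceNode` name are assumptions for illustration, not necessarily the
exact code from the post.

```js
// Rough sketch: route an <audio> element through an analyser node and on to
// the speakers. Assumes the <audio id="amen"> element from earlier.
const audioElement = document.querySelector('#amen');
const audioContext = new AudioContext();

// Wrap the <audio> element in a source node so it can join the graph.
const sourceNode = audioContext.createMediaElementSource(audioElement);

// The analyser taps the signal so we can read samples for drawing later.
const analyzerNode = audioContext.createAnalyser();

// Connect source -> analyser -> destination (the computer's speakers).
sourceNode.connect(analyzerNode);
analyzerNode.connect(audioContext.destination);
```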
@@ -85,19 +89,20 @@ the audio context's `destination` node that routes to the computer's speakers.
Our audio processing graph looks like this:

{{< figures/railroad id="audioContextDiagram" >}}
-return rr.Diagram(
-    rr.Sequence(
-        rr.Terminal("<audio>"),
-        rr.Terminal("Analyzer"),
-        rr.Terminal("destination")));
-{{< /figures/railroad >}}
-
-{{< figures/railroad id="audioContextDiagram" class="narrow-only" >}}
+{{< scripts/railroad >}}
+return rr.Diagram(
+    rr.Sequence(
+        rr.Terminal("<audio>"),
+        rr.Terminal("Analyzer"),
+        rr.Terminal("destination")));
+{{< /scripts/railroad >}}
+{{< scripts/railroad narrow=1 >}}
return rr.Diagram(
    rr.Stack(
        rr.Terminal("<audio>"),
        rr.Terminal("Analyzer"),
        rr.Terminal("destination")));
+{{< /scripts/railroad >}}
{{< /figures/railroad >}}

By itself the AudioContext doesn't actually play any audio. I'll tackle that
@@ -109,6 +114,7 @@ Next up is starting playback. The following snippet creates a Play button using
P5.js's DOM manipulation API, and hooks up the button's `click` event to start
and stop playback.

+{{< figures/code >}}
```js {linenostart=29}
const playPauseButton = p.createButton('Play');
playPauseButton.position(10, 10);
@@ -131,6 +137,7 @@ playPauseButtonElement.addEventListener('click', function() {
    }
});
```
+{{< /figures/code >}}
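
As a rough sketch of what a toggle like that can look like, assuming the
`audioElement`, `audioContext`, and `playPauseButtonElement` names used
elsewhere in the post (this is an illustration, not the post's exact handler):

```js
// Rough sketch of a play/pause toggle keyed off a data attribute.
playPauseButtonElement.addEventListener('click', function() {
    // Browsers keep an AudioContext suspended until a user gesture occurs.
    if (audioContext.state === 'suspended') {
        audioContext.resume();
    }

    if (playPauseButtonElement.dataset.playing === 'true') {
        audioElement.pause();
        playPauseButtonElement.dataset.playing = 'false';
        playPauseButtonElement.innerHTML = '<span>Play</span>';
    } else {
        audioElement.play();
        playPauseButtonElement.dataset.playing = 'true';
        playPauseButtonElement.innerHTML = '<span>Pause</span>';
    }
});
```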

Something I found odd while working with these audio components is that there
isn't a way to ask any of them if audio is playing back at any given moment.
Instead it
@@ -147,12 +154,14 @@ The last bit of playback state tracking to do is to listen for when playback
ends because it reached the end of the audio file. I did that with the `ended`
event:

+{{< figures/code >}}
```js {linenostart=53}
audioElement.addEventListener('ended', function() {
    playPauseButtonElement.dataset.playing = 'false';
    playPauseButtonElement.innerHTML = '<span>Play</span>';
}, false);
```
+{{< /figures/code >}}

This handler resets the `playing` flag and the label of the button.
@@ -160,6 +169,7 @@ This handler resets the `playing` flag and the label of the button.
Now it's time to draw some waveforms! The main part of a P5 sketch is the `draw` method. Here's mine:

+{{< figures/code >}}
```js {linenostart=57}
const amplitude = p.height / 2;
const axis = p.height / 2;
@@ -184,12 +194,15 @@ for (let i = 0; i < samples.length; i++) {
    p.point(i, axis + amplitude * sampleValue);
}
```
+{{< /figures/code >}}

The most interesting part of this function starts at line 66 where we get an array of samples from the analyzer node. The `samples` variable is a JavaScript `Float32Array`, with one element for each pixel of width.

+{{< figures/code >}}
```js {linenostart=30}
samples = new Float32Array(p.width);
```
+{{< /figures/code >}}

Once the samples are populated from the analyzer, we can render them by
plotting them along the X axis, scaling them to the height of the sketch.
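
As a rough sketch of how that read-and-plot step fits together, assuming the
`analyzerNode` and `samples` variables from earlier and a P5 instance `p` (an
illustration, not the post's exact `draw` body):

```js
// Rough sketch of the per-frame waveform drawing step.
function drawWaveform() {
    const amplitude = p.height / 2;
    const axis = p.height / 2;

    // Copy the current time-domain signal into `samples`; each value is in
    // the range [-1, 1].
    analyzerNode.getFloatTimeDomainData(samples);

    // Plot one point per horizontal pixel, scaled to the sketch's height.
    for (let i = 0; i < samples.length; i++) {
        p.point(i, axis + amplitude * samples[i]);
    }
}
```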