├── 01 Introducing the Web Audio API ├── .vscode │ └── launch.json ├── HelloWorld │ ├── HelloWorld.html │ ├── HelloWorld2.html │ └── HelloWorld3.html ├── README.md ├── UserInteraction │ ├── UserInteraction.html │ └── UserInteraction.js └── index.html ├── 02 Oscillators ├── Detuning │ ├── detune.html │ └── detune.js ├── OscillatorNode │ ├── Oscillator.html │ └── Oscillator.js ├── PeriodicWave │ ├── CustomSquareWave.html │ └── PulseWave.html ├── README.md └── index.html ├── 03 Audio Buffer Sources ├── Backwards.html ├── BufferedNoise.html ├── BufferedSquareWave.html ├── Pause.html ├── Playback │ ├── Playback.html │ └── Playback.js ├── README.md └── index.html ├── 04 Constant Sources ├── ConstantSourceSquareWave │ ├── ConstantSourceSquareWave.html │ └── NoConstantSourceSquareWave.html ├── DCoffset.html ├── Grouping tracks │ ├── 01_Congas.wav │ ├── 02_Bass.wav │ ├── 03_UkeleleMic.wav │ ├── 04_UkeleleDI.wav │ ├── 05_Viola.wav │ ├── 06_Cello.wav │ ├── 07_LeadVox.wav │ ├── 08_BackingVox.wav │ ├── Grouping.html │ ├── Grouping.js │ └── Readme.txt ├── README.md └── index.html ├── 05 Parameter automation ├── Beep │ ├── Beep.js │ └── beep.html ├── Bells │ ├── Bells.html │ └── Bells.js ├── Crossfade │ ├── Crossfade.html │ └── Crossfade.js ├── README.md ├── RepeatBeep │ └── RepeatBeep.html ├── SetValueCurve.html └── index.html ├── 06 Connecting audio parameters and modulation ├── AM Synthesis │ ├── AMSynthesis.html │ └── AMSynthesis.js ├── FM Synthesis │ ├── FMSynthesis.html │ └── FMSynthesis.js ├── README.md └── index.html ├── 07 Analysis and visualization ├── Analyser.html ├── Analyser.js ├── README.md └── index.html ├── 08 Loading and recording ├── LevelMeter.html ├── MediaElement │ ├── MediaElement.html │ ├── MediaElement2.html │ ├── SimpleMediaElement.html │ └── beat1.mp3 ├── MediaStreamToAudioBuffer.html ├── README.md ├── Record an Audio Node │ ├── MediaRecorderExample.html │ ├── RecorderExample.html │ └── recorder.js ├── decodeAudioData │ ├── beat1.mp3 │ ├── decodeWithFetch.html │ ├── decodeWithRequest.html │ └── playbackMultipleInSequence.html └── index.html ├── 09 OfflineAudioContext ├── OfflineContext.html ├── OfflineContext2.html ├── OfflineContext3.html ├── README.md ├── bufferToWave.js └── index.html ├── 10 Delay ├── KarplusStrong │ ├── KarplusStrong.html │ └── KarplusStrong.js ├── README.md ├── Vibrato │ ├── trumpet.wav │ ├── vibrato.html │ └── vibrato.js ├── combFilter.html ├── doppler.html ├── feedbackDelay.html └── index.html ├── 11 Filters ├── BiquadFilterNode │ ├── biquad.html │ └── biquad.js ├── IIRFilterNode │ ├── IIRFilter.html │ ├── audio.js │ ├── flange love.mp3 │ └── graphics.js ├── Instability │ ├── BiquadInstability.html │ ├── IIRInstability.html │ └── flange love.mp3 ├── README.md └── index.html ├── 12 Waveshaper ├── BitCrusher.html ├── Clipper.html ├── README.md └── index.html ├── 13 Dynamic Range Compression ├── Attack and release │ ├── AttackRelease.html │ └── AttackRelease.js ├── Compressor │ ├── compressor.html │ └── compressor.js ├── README.md └── index.html ├── 14 Reverb ├── ConvolutionReverb.html ├── NormalizeIR │ ├── NormalizeIR.html │ ├── applause.mp3 │ ├── unit_long.wav │ └── unit_short.wav ├── README.md └── index.html ├── 15 Mixing audio ├── ChannelFlip.html ├── PingPongDelay.html ├── README.md └── index.html ├── 16 Stereo panning ├── README.md ├── Stereo Enhancer │ ├── Stereo speech.wav │ ├── stereoEnhancer.html │ └── symphonic_warmup.wav ├── index.html └── stereoPanning.html ├── 17 Spatial audio ├── README.md ├── index.html ├── listener │ ├── listener.html │ └── 
listener.js ├── notes.TXT └── pannerNode │ ├── drone.wav │ ├── panner.html │ └── panner.js ├── 18 Working with AudioWorklets ├── 1Noise worklet │ ├── basicNoise.html │ └── basicNoise.js ├── 2async await format │ ├── asyncNoise.html │ └── basicnoise.js ├── 3Multiple Inputs │ ├── multipleInputs.html │ └── multipleInputs.js ├── 4PanX │ ├── panX.html │ └── panX.js ├── 5Gain worklet │ ├── gain.html │ └── worklets.js ├── 6Smoothing Filter worklet │ ├── noiseWorklet.js │ ├── smoothing.html │ └── smoothingWorklet.js ├── 7First order filter options │ ├── filterOptions.html │ ├── filterOptionsWorklet.js │ └── noiseWorklet.js ├── 8First order filter messaging │ ├── filterMessaging.html │ ├── filterMessagingWorklet.js │ └── noiseWorklet.js ├── README.md └── index.html ├── 19 Wonders of audio worklets ├── Bit crusher worklet │ ├── bitCrush.html │ └── bitCrushWorklet.js ├── CircularBufferWorklets.js ├── CompressorWorklet │ ├── compressor.html │ ├── compressor.js │ └── compressorWorklet.js ├── KarplusStrong │ ├── KSWorklets.js │ ├── KarplusStrongV2.html │ └── KarplusStrongV2.js ├── PulseWave │ ├── Pulse.html │ └── PulseWorklet.js ├── README.md ├── StereoWidenerWorklet │ ├── Stereo speech.wav │ ├── stereoWidenerNode.html │ └── stereoWidenerWorklet.js └── index.html ├── README.md └── index.html /01 Introducing the Web Audio API/.vscode/launch.json: -------------------------------------------------------------------------------- 1 | { 2 | // Use IntelliSense to learn about possible attributes. 3 | // Hover to view descriptions of existing attributes. 4 | // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387 5 | "version": "0.2.0", 6 | "configurations": [ 7 | { 8 | "type": "pwa-chrome", 9 | "request": "launch", 10 | "name": "Launch Chrome against localhost", 11 | "url": "http://localhost:8080", 12 | "webRoot": "${workspaceFolder}" 13 | } 14 | ] 15 | } -------------------------------------------------------------------------------- /01 Introducing the Web Audio API/HelloWorld/HelloWorld.html: -------------------------------------------------------------------------------- 1 | 2 | 8 | -------------------------------------------------------------------------------- /01 Introducing the Web Audio API/HelloWorld/HelloWorld2.html: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /01 Introducing the Web Audio API/HelloWorld/HelloWorld3.html: -------------------------------------------------------------------------------- 1 | 2 | 9 | -------------------------------------------------------------------------------- /01 Introducing the Web Audio API/README.md: -------------------------------------------------------------------------------- 1 | # Introducing the Web Audio API 2 | 3 | 'hello world' application presented as a code example, showing perhaps the simplest use of the Web Audio API to produce and manipulate sound. 4 | 5 | We then extend this application to show alternative approaches to its implementation, coding practices, and how sound is manipulated in an audio graph. 6 | -------------------------------------------------------------------------------- /01 Introducing the Web Audio API/UserInteraction/UserInteraction.html: -------------------------------------------------------------------------------- 1 | 2 |

Gain

3 | 4 | 5 | 6 | -------------------------------------------------------------------------------- /01 Introducing the Web Audio API/UserInteraction/UserInteraction.js: -------------------------------------------------------------------------------- 1 | var context = new AudioContext() 2 | var Tone = context.createOscillator() 3 | var Volume = new GainNode(context,{gain:0.1}) 4 | Tone.start() 5 | var Connected = false 6 | Volume.connect(context.destination) 7 | function StartStop() { 8 | if (Connected == false) { 9 | context.resume() 10 | Tone.connect(Volume) 11 | Connected = true 12 | } else { 13 | Connected = false 14 | Tone.disconnect(Volume) 15 | } 16 | } 17 | VolumeSlider.oninput = function() { 18 | VolumeLabel.innerHTML = this.value 19 | Volume.gain.value = this.value 20 | } 21 | -------------------------------------------------------------------------------- /01 Introducing the Web Audio API/index.html: -------------------------------------------------------------------------------- 1 |

Introducing the Web Audio API

2 | 3 | A 'hello world' application is presented as a code example, showing perhaps the simplest use of the Web Audio API to produce and manipulate sound. 4 | 5 | We then extend this application to show alternative approaches to its implementation and coding practices, and how sound is manipulated in an audio graph. 6 | 7 |
    8 |
  1. Hello World: possibly the simplest example of using the Web Audio API to make a sound 9 |
  2. UserInteraction: extending the Hello World with a gain control 10 |
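A minimal sketch of the idea behind these two examples, assuming a button and slider with illustrative ids (not the repository's exact listings): an oscillator feeds a gain node, the gain node feeds the destination, and sound only starts once a user gesture resumes the context.

const context = new AudioContext()
const tone = new OscillatorNode(context, { frequency: 440 })
const volume = new GainNode(context, { gain: 0.1 })
tone.connect(volume).connect(context.destination) // the audio graph: source -> gain -> speakers
tone.start()
// browsers keep the context suspended until a user gesture, so resume it from a click
startButton.onclick = () => context.resume()
// a range input with id gainSlider could then drive the gain parameter
gainSlider.oninput = function() { volume.gain.value = this.value }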
11 | 12 | Code examples correspond to examples from Chapter 1 in the book 'Working with the Web Audio API'. 13 | 14 | Feel free to contact me at joshua.reiss@qmul.ac.uk 15 | -------------------------------------------------------------------------------- /02 Oscillators/Detuning/detune.html: -------------------------------------------------------------------------------- 1 | 2 |
3 | 18 |

19 | Frequency (Hz)
20 | 21 |

22 | Detune (cents)
23 | 24 |
25 | 26 | -------------------------------------------------------------------------------- /02 Oscillators/Detuning/detune.js: -------------------------------------------------------------------------------- 1 | var context = new AudioContext() 2 | var osc = new OscillatorNode(context,{frequency:261.63}) 3 | osc.connect(context.destination) 4 | osc.start(0) 5 | freqNumber.oninput = function () { 6 | osc.frequency.value = freqNumber.value 7 | freqSlider.value = this.value 8 | } 9 | freqSlider.oninput = function () { 10 | osc.frequency.value = freqNumber.value 11 | freqNumber.value = this.value 12 | } 13 | detuneNumber.oninput = function () { 14 | osc.detune.value = detuneNumber.value 15 | detuneSlider.value = this.value 16 | } 17 | detuneSlider.oninput = function () { 18 | osc.detune.value = detuneNumber.value 19 | detuneNumber.value = this.value 20 | } 21 | interval.oninput = function () { 22 | osc.detune.value = this.value 23 | detuneSlider.value = this.value 24 | detuneNumber.value = this.value 25 | } 26 | -------------------------------------------------------------------------------- /02 Oscillators/OscillatorNode/Oscillator.html: -------------------------------------------------------------------------------- 1 | 2 |
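The detune parameter used in detune.js above is specified in cents, where 100 cents is one equal-tempered semitone and 1200 cents is an octave. A small illustrative helper (not part of the repository's files) for relating cents to musical intervals:

// cents between two frequencies: 1200 * log2(f2 / f1)
function centsBetween(f1, f2) { return 1200 * Math.log2(f2 / f1) }
centsBetween(261.63, 392.00) // C4 up to G4, a perfect fifth: roughly 700 cents
// so setting osc.detune.value = 700 raises middle C to approximately G4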

Frequency

3 | 4 |

Gain

5 | 6 |

Type

7 | 13 | 14 | -------------------------------------------------------------------------------- /02 Oscillators/OscillatorNode/Oscillator.js: -------------------------------------------------------------------------------- 1 | //Define variables 2 | var context = new AudioContext() 3 | var Tone = new OscillatorNode(context) 4 | var Amplitude = new GainNode(context,{gain: 0.2 }) 5 | 6 | //Set up audio graph 7 | Tone.connect(Amplitude) 8 | Amplitude.connect(context.destination) 9 | Tone.start() 10 | 11 | //User interface callbacks 12 | Frequency.oninput = function() { Tone.frequency.value = this.value } 13 | Volume.oninput = function() { Amplitude.gain.value = this.value } 14 | Type.onchange = function() { Tone.type = this.value } 15 | -------------------------------------------------------------------------------- /02 Oscillators/PeriodicWave/CustomSquareWave.html: -------------------------------------------------------------------------------- 1 | 2 |

Frequency

3 | 4 | 16 | -------------------------------------------------------------------------------- /02 Oscillators/PeriodicWave/PulseWave.html: -------------------------------------------------------------------------------- 1 | 2 |

Frequency

3 | 4 |

Duty Cycle

5 | 6 | 26 | -------------------------------------------------------------------------------- /02 Oscillators/README.md: -------------------------------------------------------------------------------- 1 | # oscillators 2 | 3 | This Chapter focuses on OscillatorNodes. 4 | 5 | - OscillatorNode demonstrates use of the built-in oscillator types and varying the oscillator's frequency 6 | - Detuning shows use of detuning the pitch and how it relates to musical notation. 7 | - There are two examples of creating custom oscillators using the PeriodicWave. 8 | - creating a square wave 9 | - creating a pulse wave, which is like a square wave but with a duty cycle parameter which controls how much of its period it spends near its peak value. 10 | -------------------------------------------------------------------------------- /02 Oscillators/index.html: -------------------------------------------------------------------------------- 1 |

Oscillators

2 | 3 | This Chapter focuses on OscillatorNodes. 4 | 5 |
    6 |
  1. OscillatorNode demonstrates use of the built-in oscillator types and varying the oscillator's frequency 7 |
  2. Detuning shows use of detuning the pitch and how it relates to musical notation. 8 |
  3. There are two examples of creating custom oscillators using the PeriodicWave. 9 |
  4. 10 | 14 |
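Both custom waveforms can be sketched with createPeriodicWave by supplying Fourier coefficients. The values below are illustrative and this is only one possible construction, not necessarily the exact code in CustomSquareWave.html or PulseWave.html: a square wave keeps only odd harmonics with amplitude 4/(n*pi), and a pulse wave with duty cycle d has harmonic amplitudes proportional to sin(n*pi*d)/n.

const context = new AudioContext()
const N = 32 // number of harmonics, illustrative
function squareWave() {
  const real = new Float32Array(N + 1), imag = new Float32Array(N + 1)
  for (let n = 1; n <= N; n += 2) imag[n] = 4 / (n * Math.PI) // odd harmonics only
  return context.createPeriodicWave(real, imag)
}
function pulseWave(dutyCycle) {
  const real = new Float32Array(N + 1), imag = new Float32Array(N + 1)
  for (let n = 1; n <= N; n++) real[n] = (2 / (n * Math.PI)) * Math.sin(n * Math.PI * dutyCycle)
  return context.createPeriodicWave(real, imag)
}
const osc = new OscillatorNode(context, { frequency: 110 })
osc.setPeriodicWave(pulseWave(0.25)) // or squareWave()
osc.connect(context.destination)
// call osc.start() after the context has been resumed by a user gesture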
15 | 16 | Code examples correspond to examples from Chapter 02 the book 'Working with the Web Audio API'. 17 | 18 | Feel free to contact me at joshua.reiss@qmul.ac.uk 19 | -------------------------------------------------------------------------------- /03 Audio Buffer Sources/Backwards.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 43 | -------------------------------------------------------------------------------- /03 Audio Buffer Sources/BufferedNoise.html: -------------------------------------------------------------------------------- 1 | 2 | 12 | -------------------------------------------------------------------------------- /03 Audio Buffer Sources/BufferedSquareWave.html: -------------------------------------------------------------------------------- 1 | 2 | Frequency: 3 | 4 | 31 | -------------------------------------------------------------------------------- /03 Audio Buffer Sources/Pause.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 30 | -------------------------------------------------------------------------------- /03 Audio Buffer Sources/Playback/Playback.html: -------------------------------------------------------------------------------- 1 |

2 | 3 | Playback rate
4 | Offset
5 | Duration

6 | 7 | Loop
8 | Loop start
9 | Loop end
10 | 11 | 12 | -------------------------------------------------------------------------------- /03 Audio Buffer Sources/Playback/Playback.js: -------------------------------------------------------------------------------- 1 | let context= new AudioContext() 2 | var source = new AudioBufferSourceNode(context) 3 | let bufferDuration = 3*context.sampleRate 4 | let array = new Float32Array(bufferDuration) 5 | let instantFreq 6 | for (var i=0;i source.playbackRate.value = rate.value 23 | loop.oninput = () => source.loop = loop.checked 24 | loopstart.oninput = () => source.loopStart = loopstart.value 25 | loopend.oninput = () => source.loopEnd = loopend.value 26 | -------------------------------------------------------------------------------- /03 Audio Buffer Sources/README.md: -------------------------------------------------------------------------------- 1 | # Audio Buffer Sources 2 | 3 | Sound samples, or audio assets, are represented in the Web Audio as an array of floats in AudioBuffers. These buffers may be created by copying an array of data, setting each value in the buffer directly, or loading data from an audio file. The Web Audio API makes a clear distinction between such assets and their playback state at a particular point in the audio graph. This separation is needed since applications could involve multiple versions of the same buffer playing simultaneously. Hence, the audio buffer is not a source node. For that we have the AudioBufferSourceNode. 4 | This Chapter looks at how to work with audio buffers; creating them, loading audio into them, using them in source nodes, and changing the playback state of the node. It starts by introducing the essential Web Audio API functionality, with examples of creating buffer sources and varying the properties of buffer source nodes. Careful attention is paid to the precise timing that can be achieved with their start, duration and looping times. Further examples are then given for constructing a square wave using looped buffer sources, for pausing playback, and for playing an audio asset backwards. 5 | 6 | The source code examples here are; 7 | 8 | - Playback: showing the full functionality of the BufferSourceNode using a Chirp signal 9 | - BufferedNoise: creating noise using a buffer source, and playing it back using a BufferSourceNode 10 | - BufferedSquareWave: creating a square wave using many BufferSourceNodes, each one playing the same buffer source at different playback rates 11 | - Pause: pausing playback of a BufferSourceNode by setting playbackRate to zero 12 | - Backwards: playing audio forwards and backwards by using two BufferSourceNodes and keeping track of the playback position 13 | -------------------------------------------------------------------------------- /03 Audio Buffer Sources/index.html: -------------------------------------------------------------------------------- 1 |

Audio Buffer Sources

2 | 3 | Sound samples, or audio assets, are represented in the Web Audio as an array of floats in AudioBuffers. These buffers may be created by copying an array of data, setting each value in the buffer directly, or loading data from an audio file. The Web Audio API makes a clear distinction between such assets and their playback state at a particular point in the audio graph. This separation is needed since applications could involve multiple versions of the same buffer playing simultaneously. Hence, the audio buffer is not a source node. For that we have the AudioBufferSourceNode. 4 | This Chapter looks at how to work with audio buffers; creating them, loading audio into them, using them in source nodes, and changing the playback state of the node. It starts by introducing the essential Web Audio API functionality, with examples of creating buffer sources and varying the properties of buffer source nodes. Careful attention is paid to the precise timing that can be achieved with their start, duration and looping times. Further examples are then given for constructing a square wave using looped buffer sources, for pausing playback, and for playing an audio asset backwards. 5 | 6 | The source code examples here are; 7 | 8 |
    9 |
  1. Playback: showing the full functionality of the BufferSourceNode using a Chirp signal 10 |
  2. BufferedNoise: creating noise using a buffer source, and playing it back using a BufferSourceNode 11 |
  3. BufferedSquareWave: creating a square wave using many BufferSourceNodes, each one playing the same buffer source at different playback rates 12 |
  4. Pause: pausing playback of a BufferSourceNode by setting playbackRate to zero 13 |
  5. Backwards: playing audio forwards and backwards by using two BufferSourceNodes and keeping track of the playback position 14 |
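As a flavour of how these buffer examples work, a hedged sketch of the BufferedNoise idea (buffer length and gain are illustrative, not the book's exact values): fill an AudioBuffer with random samples, hand it to an AudioBufferSourceNode and loop it.

const context = new AudioContext()
const buffer = context.createBuffer(1, context.sampleRate, context.sampleRate) // one second, mono
const data = buffer.getChannelData(0)
for (let i = 0; i < data.length; i++) data[i] = 2 * Math.random() - 1 // white noise in [-1, 1]
const source = new AudioBufferSourceNode(context, { buffer: buffer, loop: true })
const level = new GainNode(context, { gain: 0.1 })
source.connect(level).connect(context.destination)
// source.start() once a user gesture has resumed the context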
15 | 16 | Code examples correspond to examples from Chapter 03 of the book 'Working with the Web Audio API'. 17 | 18 | Feel free to contact me at joshua.reiss@qmul.ac.uk 19 | -------------------------------------------------------------------------------- /04 Constant Sources/ConstantSourceSquareWave/ConstantSourceSquareWave.html: -------------------------------------------------------------------------------- 1 | 2 | Fundamental frequency: 3 | 4 | 23 | -------------------------------------------------------------------------------- /04 Constant Sources/ConstantSourceSquareWave/NoConstantSourceSquareWave.html: -------------------------------------------------------------------------------- 1 | 2 | Frequency: 3 | 4 | 20 | -------------------------------------------------------------------------------- /04 Constant Sources/DCoffset.html: -------------------------------------------------------------------------------- 1 |
2 | Offset 3 | 15 | -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/01_Congas.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/04 Constant Sources/Grouping tracks/01_Congas.wav -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/02_Bass.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/04 Constant Sources/Grouping tracks/02_Bass.wav -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/03_UkeleleMic.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/04 Constant Sources/Grouping tracks/03_UkeleleMic.wav -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/04_UkeleleDI.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/04 Constant Sources/Grouping tracks/04_UkeleleDI.wav -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/05_Viola.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/04 Constant Sources/Grouping tracks/05_Viola.wav -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/06_Cello.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/04 Constant Sources/Grouping tracks/06_Cello.wav -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/07_LeadVox.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/04 Constant Sources/Grouping tracks/07_LeadVox.wav -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/08_BackingVox.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/04 Constant Sources/Grouping tracks/08_BackingVox.wav -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/Grouping.html: -------------------------------------------------------------------------------- 1 | Congas
2 | Bass
3 | Ukelele Mic
4 | Ukelele DI
5 | Viola
6 | Volume: 7 | 8 | 9 | 10 | -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/Grouping.js: -------------------------------------------------------------------------------- 1 | var file=[],tracks=[],gains=[] 2 | var fileName=['01_Congas','02_Bass','03_UkeleleMic','04_UkeleleDI','05_Viola'] 3 | context = new AudioContext() 4 | for (i=0;i<5;i++) { 5 | file[i] = new Audio(fileName[i]+'.wav') 6 | tracks[i] = context.createMediaElementSource(file[i]) 7 | gains[i] = new GainNode(context,{gain:0}) 8 | tracks[i].connect(gains[i]) 9 | gains[i].connect(context.destination) 10 | } 11 | let constantNode = context.createConstantSource() 12 | constantNode.start() 13 | volume.oninput = function() {constantNode.offset.value = volume.value} 14 | playButton.onclick = function() { 15 | context.resume() 16 | for (i=0;i<5;i++) file[i].play() 17 | } 18 | function Checks(element) { 19 | var checkNumber=parseInt(element.id.substring(5)) 20 | if (element.checked) constantNode.connect(gains[checkNumber].gain) 21 | else constantNode.disconnect(gains[checkNumber].gain) 22 | } 23 | -------------------------------------------------------------------------------- /04 Constant Sources/Grouping tracks/Readme.txt: -------------------------------------------------------------------------------- 1 | Raw multitrack excerpt from Anna Blanton's 'Rachel'. This file is provided for educational purposes only, and the material contained in it should not be used for any commercial purpose without the express permission of the copyright holders. Please refer to https://creativecommons.org/licenses/by-sa/4.0/ and www.cambridge-mt.com for further licensing details. 2 | 3 | Comprises 8 WAV files at 24-bit/44.1kHz resolution. 4 | 5 | Tempo: approx 63.5bpm. -------------------------------------------------------------------------------- /04 Constant Sources/README.md: -------------------------------------------------------------------------------- 1 | # Constant sources 2 | 3 | The ConstantSourceNode represents an audio source whose output is a constant value. Examples are provided showing; 4 | 5 | - DCoffset: applying a DC offset to an oscillator, which also demonstrates what happens when rendering audio outside the range -1 to +1 6 | - ConstantSourceSquareWave: use of a constant source node to construct a square wave 7 | - Grouping tracks: grouping tracks in a multitrack audio, and applying the same processing on all tracks in a groupwhen working with multitrack audio. 8 | -------------------------------------------------------------------------------- /04 Constant Sources/index.html: -------------------------------------------------------------------------------- 1 |

Constant sources

2 | 3 | The ConstantSourceNode represents an audio source whose output is a constant value. Examples are provided showing: 4 | 5 |
    6 |
  1. DCoffset: applying a DC offset to an oscillator, which also demonstrates what happens when rendering audio outside the range -1 to +1 7 |
  2. ConstantSourceSquareWave: use of a constant source node to construct a square wave 8 |
  3. Grouping tracks: grouping tracks when working with multitrack audio, and applying the same processing to all tracks in a group. 9 |
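The grouping idea can be sketched as follows (the number of tracks and the level value are illustrative): a single ConstantSourceNode is connected to the gain AudioParam of every track in the group, so one control moves them all together, much as Grouping.js does for the checked tracks.

const context = new AudioContext()
const groupLevel = new ConstantSourceNode(context, { offset: 0.8 }) // shared group volume
groupLevel.start()
const trackGains = []
for (let i = 0; i < 4; i++) {
  const g = new GainNode(context, { gain: 0 }) // base gain 0; the constant source supplies the level
  groupLevel.connect(g.gain)                   // its output sums onto the AudioParam's intrinsic value
  g.connect(context.destination)
  trackGains.push(g)                           // each track's source node would connect into g
}
// changing groupLevel.offset.value now changes every grouped track at once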
10 | 11 | Code examples correspond to examples from Chapter 04 of the book 'Working with the Web Audio API'. 12 | 13 | Feel free to contact me at joshua.reiss@qmul.ac.uk 14 | -------------------------------------------------------------------------------- /05 Parameter automation/Beep/Beep.js: -------------------------------------------------------------------------------- 1 | var context = new AudioContext() 2 | var Volume = new GainNode(context,{gain:0}) 3 | var Tone = new OscillatorNode(context,{type:'sawtooth'}) 4 | Tone.connect(Volume) 5 | Volume.connect(context.destination) 6 | Tone.start() 7 | triggerBeep.onclick = function() { 8 | context.resume() 9 | let now = context.currentTime 10 | Tone.frequency.value = Frequency.value 11 | Volume.gain.setValueAtTime(0.0, now) 12 | Volume.gain.linearRampToValueAtTime(1,now + Attack.value/1000) 13 | Volume.gain.linearRampToValueAtTime(0,now + Attack.value/1000 + Decay.value/1000) 14 | } 15 | -------------------------------------------------------------------------------- /05 Parameter automation/Beep/beep.html: -------------------------------------------------------------------------------- 1 | 2 |

Frequency

3 | 4 |

Attack

5 | 6 |

Decay

7 | 8 | 9 | -------------------------------------------------------------------------------- /05 Parameter automation/Bells/Bells.html: -------------------------------------------------------------------------------- 1 | 2 |

Duration

3 | 4 |

Pitch

5 | 6 | 7 | -------------------------------------------------------------------------------- /05 Parameter automation/Bells/Bells.js: -------------------------------------------------------------------------------- 1 | var context = new AudioContext() 2 | var DBToAmp = function(db) { return Math.pow(10, db / 20) } 3 | Strike.onclick = function() { 4 | context.resume() 5 | var gNode=[],osc=[],FinalAmp 6 | for (i=0;i 2 | 3 | 4 |

Duration

5 | 6 | 7 | -------------------------------------------------------------------------------- /05 Parameter automation/Crossfade/Crossfade.js: -------------------------------------------------------------------------------- 1 | var context = new AudioContext() 2 | var Tone1 = new OscillatorNode(context,{type:'sine',frequency:500}) 3 | var Tone2 = new OscillatorNode(context,{type:'sine',frequency:300}) 4 | var Gain1 = new GainNode(context,{gain:1}) 5 | var Gain2 = new GainNode(context,{gain:0}) 6 | Tone1.start() 7 | Tone2.start() 8 | Tone1.connect(Gain1).connect(context.destination) 9 | Tone2.connect(Gain2).connect(context.destination) 10 | var N=100 11 | var curveUp = new Float32Array(N), curveDown = new Float32Array(N) 12 | for (i=0;i 2 | -------------------------------------------------------------------------------- /05 Parameter automation/SetValueCurve.html: -------------------------------------------------------------------------------- 1 | 2 |

Duration

3 | 4 | 17 | -------------------------------------------------------------------------------- /05 Parameter automation/index.html: -------------------------------------------------------------------------------- 1 |

Parameter automation

2 | 3 | Here, we dive into how audio parameters are set and scheduled in the Web Audio API. 4 | There are 5 source code examples: 5 | 6 |
    7 |
  1. Beep: Making a beep sound, a basic demonstration of audio parameter automation 8 |
  2. RepeatBeep: shows how parameter automation can be looped using JavaScript's setInterval method. 9 |
  3. SetValueCurve: creates a beep sound using a custom automation curve 10 |
  4. Crossfade: Uses setValueCurveAtTime to show the difference between a linear crossfade and an equal power crossfade of two signals. 11 |
  5. Bells: shows how bell sounds can be created using additive synthesis, by applying envelopes to a set of oscillators. 12 |
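A rough sketch of the envelope-plus-additive-synthesis idea behind the Bells example (the partial ratios, levels and decay time are illustrative, not the book's values): each partial is an oscillator with its own decaying gain envelope.

const context = new AudioContext()
function strikeBell(baseFrequency, duration) {
  const now = context.currentTime
  const partials = [1, 2.0, 2.4, 3.7] // inharmonic ratios typical of bells, illustrative only
  partials.forEach((ratio, i) => {
    const osc = new OscillatorNode(context, { frequency: baseFrequency * ratio })
    const env = new GainNode(context, { gain: 0 })
    osc.connect(env).connect(context.destination)
    env.gain.setValueAtTime(0.3 / (i + 1), now)                   // higher partials start quieter
    env.gain.exponentialRampToValueAtTime(0.0001, now + duration) // decay towards silence
    osc.start(now)
    osc.stop(now + duration)
  })
}
// strikeBell(440, 2) once the context has been resumed by a user gesture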
13 | 14 | Code examples correspond to examples from Chapter 05 of the book 'Working with the Web Audio API'. 15 | 16 | Feel free to contact me at joshua.reiss@qmul.ac.uk 17 | -------------------------------------------------------------------------------- /06 Connecting audio parameters and modulation/AM Synthesis/AMSynthesis.html: -------------------------------------------------------------------------------- 1 | 2 |

Carrier Frequency

3 | 4 | 5 |

Modulation Depth

6 | 7 | 8 |

Modulation Frequency

9 | 10 | 11 | 29 | 30 | -------------------------------------------------------------------------------- /06 Connecting audio parameters and modulation/AM Synthesis/AMSynthesis.js: -------------------------------------------------------------------------------- 1 | context = new AudioContext() 2 | var Carrier = new OscillatorNode(context) 3 | var ModulatorOsc = new OscillatorNode(context,{frequency:100,type:'square'}) 4 | var Modulator = context.createGain() 5 | var AM = context.createGain() 6 | Carrier.start() 7 | ModulatorOsc.start() 8 | ModulatorOsc.connect(Modulator) 9 | Modulator.connect(AM.gain) 10 | Carrier.connect(AM) 11 | AM.connect(context.destination) 12 | -------------------------------------------------------------------------------- /06 Connecting audio parameters and modulation/FM Synthesis/FMSynthesis.html: -------------------------------------------------------------------------------- 1 | 2 |

Carrier Frequency

3 | 4 | 5 |

Modulation Depth

6 | 7 | 8 |

Modulation Frequency

9 | 10 | 11 | 30 | 31 | -------------------------------------------------------------------------------- /06 Connecting audio parameters and modulation/FM Synthesis/FMSynthesis.js: -------------------------------------------------------------------------------- 1 | //Define context and nodes 2 | context = new AudioContext() 3 | var Carrier = new OscillatorNode(context,{frequency:CarrierFrequency.value}) 4 | var ModulatorOsc = new OscillatorNode(context,{frequency:ModulationFrequency.value}) 5 | var Modulator = new GainNode(context,{gain:ModulationDepth.value}) 6 | //start any scheduled sources 7 | Carrier.start() 8 | ModulatorOsc.start() 9 | //Modulator is a gain node, so this multiples ModulatorOsc with the modulation depth 10 | ModulatorOsc.connect(Modulator) 11 | //Carrier frequency is the sum of the intrinsic frequency offset and the modulator 12 | Modulator.connect(Carrier.frequency) 13 | Carrier.connect(context.destination) 14 | -------------------------------------------------------------------------------- /06 Connecting audio parameters and modulation/README.md: -------------------------------------------------------------------------------- 1 | # Connecting audio parameters and modulation 2 | 3 | Here, we look at how an audio node’s output can be connected to a parameter of another node. There are two examples. 4 | 5 | - FMSynthesis: shows how FM synthesis is performed by connecting a Low Frequency Oscillator to the frequency parameter of another oscillator. 6 | - AMSynthesis: for implementing AM synthesis by connecting a Low Frequency Oscillator to the gain parameter of a gain node. 7 | -------------------------------------------------------------------------------- /06 Connecting audio parameters and modulation/index.html: -------------------------------------------------------------------------------- 1 |

Connecting audio parameters and modulation

2 | 3 | Here, we look at how an audio node’s output can be connected to a parameter of another node. There are two examples. 4 | 5 | 9 | 10 | Code examples correspond to examples from Chapter 06 of the book 'Working with the Web Audio API'. 11 | 12 | Feel free to contact me at joshua.reiss@qmul.ac.uk 13 | -------------------------------------------------------------------------------- /07 Analysis and visualization/Analyser.html: -------------------------------------------------------------------------------- 1 |
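The parameter-connection idea of the previous chapter in its smallest form, before moving on to analysis (all values illustrative): a slow oscillator, scaled by a gain node, is connected to another oscillator's frequency AudioParam to produce vibrato.

const context = new AudioContext()
const carrier = new OscillatorNode(context, { frequency: 440 })
const lfo = new OscillatorNode(context, { frequency: 5 })   // 5 Hz wobble
const depth = new GainNode(context, { gain: 10 })           // +/- 10 Hz of vibrato
lfo.connect(depth)
depth.connect(carrier.frequency) // the LFO output is summed onto the frequency parameter
carrier.connect(context.destination)
carrier.start()
lfo.start()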
2 | 3 | Spectrum
4 | Frequency 5 | 6 | -------------------------------------------------------------------------------- /07 Analysis and visualization/Analyser.js: -------------------------------------------------------------------------------- 1 | let canvasContext= canvas.getContext('2d') 2 | let audioContext= new AudioContext() 3 | let source= new OscillatorNode(audioContext,{type:'square'}) 4 | let analyser= audioContext.createAnalyser() 5 | source.connect(analyser) 6 | source.start() 7 | var data = new Uint8Array(analyser.frequencyBinCount) 8 | Frequency.oninput = function() { source.frequency.value = this.value} 9 | draw() 10 | function draw() { 11 | let height=canvas.height, width=canvas.width 12 | if (Spectrum.checked) analyser.getByteFrequencyData(data) 13 | else analyser.getByteTimeDomainData(data) 14 | canvasContext.clearRect(0,0,canvas.width,canvas.height) 15 | for (let i = 0; i < data.length; i++) { 16 | if (!Spectrum.checked) canvasContext.fillRect(i,height*(1-data[i]/256)-1,1,1) 17 | else canvasContext.fillRect(i,height*(1-data[i]/256),1,height*data[i]/256) 18 | } 19 | requestAnimationFrame(draw) 20 | } 21 | -------------------------------------------------------------------------------- /07 Analysis and visualization/README.md: -------------------------------------------------------------------------------- 1 | # Analysis and Visualization 2 | 3 | This introduces the Web Audio API's AnalyserNode. 4 | 5 | An example is given for visualizing time or frequency data returned by the AnalyserNode. 6 | -------------------------------------------------------------------------------- /07 Analysis and visualization/index.html: -------------------------------------------------------------------------------- 1 |

Analysis and visualisation

2 | 3 | This introduces the Web Audio API's AnalyserNode. 4 | 5 | An example is given for visualizing time or frequency data returned by the AnalyserNode. 6 | 7 | The code example corresponds to the example from Chapter 07 of the book 'Working with the Web Audio API'. 8 | 9 | Feel free to contact me at joshua.reiss@qmul.ac.uk 10 | -------------------------------------------------------------------------------- /08 Loading and recording/LevelMeter.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 25 | -------------------------------------------------------------------------------- /08 Loading and recording/MediaElement/MediaElement.html: -------------------------------------------------------------------------------- 1 | 4 | 10 | -------------------------------------------------------------------------------- /08 Loading and recording/MediaElement/MediaElement2.html: -------------------------------------------------------------------------------- 1 | 2 | 12 | -------------------------------------------------------------------------------- /08 Loading and recording/MediaElement/SimpleMediaElement.html: -------------------------------------------------------------------------------- 1 |
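A hedged sketch of the basic media-element wiring that SimpleMediaElement illustrates (the element ids are placeholders; beat1.mp3 is the clip shipped in this folder): an audio element is wrapped by a MediaElementAudioSourceNode so its output enters the Web Audio graph, while volume, rate, looping and pitch preservation stay on the element itself.

const audioElement = new Audio('beat1.mp3')
const context = new AudioContext()
const source = context.createMediaElementSource(audioElement)
source.connect(context.destination)
VolumeControl.oninput = function() { audioElement.volume = this.value }     // 0 to 1
RateControl.oninput = function() { audioElement.playbackRate = this.value } // e.g. 0.5 to 2
LoopControl.oninput = function() { audioElement.loop = this.checked }
PitchControl.oninput = function() { audioElement.preservesPitch = this.checked }
PlayButton.onclick = function() { context.resume(); audioElement.play() }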
2 | Volume
3 | Rate
4 | Loop 5 | Preserve pitch
6 | 7 |
8 | 15 | -------------------------------------------------------------------------------- /08 Loading and recording/MediaElement/beat1.mp3: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/joshreiss/Working-with-the-Web-Audio-API/25a2e41c01e11d553cb36bcaba09321e34425e7c/08 Loading and recording/MediaElement/beat1.mp3 -------------------------------------------------------------------------------- /08 Loading and recording/MediaStreamToAudioBuffer.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | -------------------------------------------------------------------------------- /08 Loading and recording/README.md: -------------------------------------------------------------------------------- 1 | # Loading and Recording 2 | 3 | Essential functionality in the Web Audio API is the ability to play back audio from a stream or a file, and to record or stream the output of a node. Here, we introduce audio nodes that enable getting audio into an audio context from different sources, like media streams and media elements, as well as into buffers using decodeAudioData, and to different destinations, like media streams and audio files. Some of the functionality also relies on methods from outside the Web Audio API, like getUserMedia from the MediaStream API. 4 | 5 | - MediaElement 6 | - SimpleMediaElement: basic use of an