AudioWorkletGlobalScope

Baseline Widely available *

This feature is well established and works across many devices and browser versions. It's been available across browsers since April 2021.

* Some parts of this feature may have varying levels of support.

The AudioWorkletGlobalScope interface of the Web Audio API represents a global execution context for user-supplied code, which defines custom AudioWorkletProcessor-derived classes.

Each BaseAudioContext has a single AudioWorklet available under the audioWorklet property, which runs its code in a single AudioWorkletGlobalScope.

As the global execution context is shared across the current BaseAudioContext, it's possible to define any other variables and perform any actions allowed in worklets, in addition to defining AudioWorkletProcessor-derived classes.


Instance properties

This interface also inherits properties defined on its parent interface, WorkletGlobalScope.

currentFrame Read only

Returns an integer that represents the ever-increasing current sample-frame of the audio block being processed. It is incremented by 128 (the size of a render quantum) after the processing of each audio block.

currentTime Read only

Returns a double that represents the ever-increasing context time of the audio block being processed. It is equal to the currentTime property of the BaseAudioContext the worklet belongs to.

sampleRate Read only

Returns a float that represents the sample rate of the associated BaseAudioContext.
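The relationship between these three properties can be illustrated with a little arithmetic: each render quantum is 128 sample-frames, so the context time advances by 128 / sampleRate seconds per processed block. A minimal sketch of that conversion (the 48 kHz rate is just an assumed example; a real worklet reads sampleRate and currentFrame from the AudioWorkletGlobalScope):

```js
// Sketch: how currentFrame relates to currentTime.
// Assumes an example sample rate of 48000 Hz; inside a real worklet,
// sampleRate and currentFrame are provided by the global scope.
const sampleRate = 48000;
const RENDER_QUANTUM = 128; // sample-frames per audio block

// Time (in seconds) corresponding to a given sample-frame counter.
function frameToTime(currentFrame) {
  return currentFrame / sampleRate;
}

// After 375 blocks, the frame counter is 375 * 128 = 48000 frames,
// i.e. exactly one second of audio has been processed.
const currentFrame = 375 * RENDER_QUANTUM;
console.log(currentFrame); // 48000
console.log(frameToTime(currentFrame)); // 1
```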

port Read only Experimental

Returns a MessagePort for custom, asynchronous communication between code in the main thread and the global scope of an audio worklet. This allows for custom messages, such as sending and receiving control data or global settings.
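Because port is a standard MessagePort, the messaging pattern itself can be sketched outside of an audio context with a plain MessageChannel (available in browsers and recent Node.js). In a real worklet one end lives in the AudioWorkletGlobalScope and the other on the main thread; the { type: … } message shape below is just an illustrative convention, not part of the API:

```js
// Sketch of port-style messaging using a hand-made MessageChannel.
// In a real audio worklet, the two ports are created for you.
const { port1: mainSide, port2: workletSide } = new MessageChannel();

// "Worklet" side: receive a global setting and acknowledge it.
workletSide.onmessage = (event) => {
  workletSide.postMessage({ ack: event.data.type });
};

// "Main thread" side: send control data and log the acknowledgement.
mainSide.onmessage = (event) => {
  console.log(event.data.ack); // logs the acknowledged type
  mainSide.close(); // closing one port closes the whole channel
};
mainSide.postMessage({ type: "set-gain" });
```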

Instance methods

This interface also inherits methods defined on its parent interface, WorkletGlobalScope.

registerProcessor()

Registers a class derived from the AudioWorkletProcessor interface. The class can then be used by creating an AudioWorkletNode, providing its registered name.

Examples

In this example we output all global properties into the console in the constructor of a custom AudioWorkletProcessor.

First we need to define the processor, and register it. Note that this should be done in a separate file.

js
// AudioWorkletProcessor defined in: test-processor.js
class TestProcessor extends AudioWorkletProcessor {
  constructor() {
    super();
    // Logs the current sample-frame and time at the moment of instantiation.
    // They are accessible from the AudioWorkletGlobalScope.
    console.log(currentFrame);
    console.log(currentTime);
  }

  // The process method is required - output silence,
  // which the outputs are already filled with.
  process(inputs, outputs, parameters) {
    return true;
  }
}

// Logs the sample rate, which is never going to change,
// because it's a read-only property of a BaseAudioContext
// and is set only during its instantiation.
console.log(sampleRate);

// You can declare any variables and use them in your processors,
// for example an ArrayBuffer with a wavetable.
const usefulVariable = 42;
console.log(usefulVariable);

registerProcessor("test-processor", TestProcessor);

Next, in our main scripts file we'll load the processor, create an instance of AudioWorkletNode — passing the name of the processor to it — and connect the node to an audio graph. We should see the output of console.log() calls in the console:

js
const audioContext = new AudioContext();

await audioContext.audioWorklet.addModule("test-processor.js");

const testNode = new AudioWorkletNode(audioContext, "test-processor");
testNode.connect(audioContext.destination);

Specifications

Specification: Web Audio API, # AudioWorkletGlobalScope

Browser compatibility

See also


This page was last modified by MDN contributors.

