
AudioContext

Baseline: Widely available *

This feature is well established and works across many devices and browser versions. It has been available across browsers since April 2021.

* Some parts of this feature may have varying levels of support.

The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode.

An audio context controls both the creation of the nodes it contains and the execution of the audio processing, or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context. It's recommended to create one AudioContext and reuse it instead of initializing a new one each time, and it's fine to use a single AudioContext for several different audio sources and pipelines concurrently.

Inheritance: EventTarget → BaseAudioContext → AudioContext

Constructor

AudioContext()

Creates and returns a new AudioContext object.

Instance properties

Also inherits properties from its parent interface, BaseAudioContext.

AudioContext.baseLatency Read only

Returns the number of seconds of processing latency incurred by the AudioContext passing the audio from the AudioDestinationNode to the audio subsystem.

AudioContext.outputLatency Read only

Returns an estimation of the output latency of the current audio context.
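Taken together, baseLatency and outputLatency give a rough figure for how long a sample takes to travel from the graph to the speakers. A minimal sketch, assuming `ctx` is an AudioContext (the helper itself is plain arithmetic over the two read-only properties):

```js
// Estimate the total output delay of a context, in seconds.
// `ctx` is assumed to expose baseLatency and outputLatency,
// as an AudioContext does.
function totalLatencySeconds(ctx) {
  // baseLatency: processing delay inside the graph;
  // outputLatency: estimated delay between the graph and the audio hardware.
  return ctx.baseLatency + ctx.outputLatency;
}

// Typical browser usage:
//   const audioCtx = new AudioContext();
//   console.log(`${totalLatencySeconds(audioCtx) * 1000} ms`);
```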

AudioContext.sinkId Read only Experimental Secure context

Returns the sink ID of the current output audio device.

Instance methods

Also inherits methods from its parent interface, BaseAudioContext.

AudioContext.close()

Closes the audio context, releasing any system audio resources that it uses.

AudioContext.createMediaElementSource()

Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement. This can be used to play and manipulate audio from <video> or <audio> elements.
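A minimal sketch of the usual wiring: the element's audio is routed through a gain node to the destination. The helper name `wireElement` and its parameters are hypothetical; the node-creation calls are the standard AudioContext methods described here.

```js
// Route an <audio>/<video> element through a gain node to the speakers.
// `ctx` is assumed to be an AudioContext, `element` an HTMLMediaElement.
function wireElement(ctx, element, volume = 0.5) {
  const source = ctx.createMediaElementSource(element);
  const gain = ctx.createGain();
  gain.gain.value = volume;      // scale the element's audio
  source.connect(gain);          // element -> gain
  gain.connect(ctx.destination); // gain -> speakers
  return { source, gain };
}
```

Note that once an element is connected this way, its audio plays only through the graph, so it must reach ctx.destination (or another output) to be heard.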

AudioContext.createMediaStreamSource()

Creates a MediaStreamAudioSourceNode associated with a MediaStream representing an audio stream which may come from the local computer microphone or other sources.
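A minimal sketch, assuming the caller has already obtained an audio MediaStream (for example from navigator.mediaDevices.getUserMedia({ audio: true })); the helper name `connectStream` is hypothetical:

```js
// Wire an existing audio MediaStream into the context and monitor it.
// `ctx` is assumed to be an AudioContext, `stream` a MediaStream.
function connectStream(ctx, stream) {
  const source = ctx.createMediaStreamSource(stream);
  source.connect(ctx.destination); // route the stream to the speakers
  return source;
}

// Typical browser usage (microphone capture):
//   const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
//   connectStream(audioCtx, stream);
```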

AudioContext.createMediaStreamDestination()

Creates a MediaStreamAudioDestinationNode associated with a MediaStream representing an audio stream which may be stored in a local file or sent to another computer.

AudioContext.createMediaStreamTrackSource()

Creates a MediaStreamTrackAudioSourceNode associated with a MediaStream representing a media stream track.

AudioContext.getOutputTimestamp()

Returns a new AudioTimestamp object containing two audio timestamp values relating to the current audio context.
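The returned object's contextTime field reports the context time of the sample currently being rendered by the output device, so comparing it with currentTime gives a rough measure of how far the context's clock has run ahead of what is actually audible. A minimal sketch (the helper name is hypothetical; the subtraction is plain arithmetic over the two clock readings):

```js
// Estimate how far the context clock is ahead of the audio output, in seconds.
// `ctx` is assumed to be an AudioContext.
function outputLagSeconds(ctx) {
  const { contextTime } = ctx.getOutputTimestamp();
  return ctx.currentTime - contextTime;
}
```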

AudioContext.resume()

Resumes the progression of time in an audio context that has previously been suspended/paused.

AudioContext.setSinkId() Experimental Secure context

Sets the output audio device for the AudioContext.
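Because the method is experimental, it is worth feature-detecting before calling it. A minimal sketch, assuming `deviceId` was obtained elsewhere (e.g. from navigator.mediaDevices.enumerateDevices()); the helper name `selectOutput` is hypothetical:

```js
// Switch the context's output device when setSinkId() is available.
// `ctx` is assumed to be an AudioContext; returns false if unsupported.
async function selectOutput(ctx, deviceId) {
  if (typeof ctx.setSinkId !== "function") {
    return false;                // setSinkId() not supported in this browser
  }
  await ctx.setSinkId(deviceId); // resolves once the new device is active
  return true;
}
```

A successful change also fires the sinkchange event described below, so UI that displays the current device can listen for that instead of tracking calls itself.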

AudioContext.suspend()

Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
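suspend() and resume() both return promises that resolve once the state change has taken effect, which makes them a natural fit for a play/pause toggle. A minimal sketch (the helper name `toggleContext` is hypothetical):

```js
// Toggle an AudioContext between "running" and "suspended".
// `ctx` is assumed to be an AudioContext exposing state/suspend()/resume().
async function toggleContext(ctx) {
  if (ctx.state === "running") {
    await ctx.suspend(); // halt the clock, release the audio hardware
  } else if (ctx.state === "suspended") {
    await ctx.resume();  // restart the clock
  }
  return ctx.state;      // "closed" contexts are left untouched
}
```

In a page this would typically be wired to a button's click handler, which also satisfies the user-gesture requirement some browsers impose before audio may start.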

Events

sinkchange Experimental

Fired when the output audio device (and therefore the AudioContext.sinkId) has changed.

Examples

Basic audio context declaration:

js
const audioCtx = new AudioContext();
const oscillatorNode = audioCtx.createOscillator();
const gainNode = audioCtx.createGain();
const finish = audioCtx.destination;
// etc.

Specifications

Specification: Web Audio API, # AudioContext

Browser compatibility

See also
