RTCEncodedAudioFrame
Baseline 2023 * (Newly available)
Since August 2023, this feature works across the latest devices and browser versions. This feature might not work in older devices or browsers.
* Some parts of this feature may have varying levels of support.
Note: This feature is available in Dedicated Web Workers.
The RTCEncodedAudioFrame interface of the WebRTC API represents an encoded audio frame in the WebRTC receiver or sender pipeline, which may be modified using a WebRTC Encoded Transform.
The interface provides methods and properties to get metadata about the frame, allowing its format and order in the sequence of frames to be determined. The data property gives access to the encoded frame data as a buffer, which might be encrypted, or otherwise modified by a transform.
Instance properties
RTCEncodedAudioFrame.timestamp
Read only. Deprecated. Non-standard.
Returns the timestamp at which sampling of the frame started.
RTCEncodedAudioFrame.data
Returns a buffer containing the encoded frame data.
Instance methods
RTCEncodedAudioFrame.getMetadata()
Returns the metadata associated with the frame.
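As a rough sketch of the object this returns, the snippet below uses a hand-built stand-in frame (a real RTCEncodedAudioFrame only exists inside the WebRTC pipeline); the field names and values shown are illustrative assumptions, covering common audio-frame metadata such as the synchronization source, payload type, and contributing sources:

```js
// Hypothetical stand-in for a real RTCEncodedAudioFrame, used only to
// illustrate the rough shape of the object returned by getMetadata().
// Real frames are only available inside the rtctransform pipeline.
const frameStandIn = {
  getMetadata() {
    return {
      synchronizationSource: 2960988297, // SSRC of the audio stream
      payloadType: 111, // codec payload type from the RTP packet
      contributingSources: [], // CSRCs, if any
    };
  },
};

const metadata = frameStandIn.getMetadata();
console.log(`SSRC: ${metadata.synchronizationSource}`);
console.log(`Payload type: ${metadata.payloadType}`);
```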
Examples
This code snippet shows a handler for the rtctransform event in a Worker that implements a TransformStream, and pipes encoded frames through it from the event.transformer.readable to event.transformer.writable (event.transformer is an RTCRtpScriptTransformer, the worker-side counterpart of RTCRtpScriptTransform).
If the transformer is inserted into an audio stream, the transform() method is called with an RTCEncodedAudioFrame whenever a new frame is enqueued on event.transformer.readable. The transform() method shows how this might be read, modified using a fictional encryption function, and then enqueued on the controller (this ultimately pipes it through to the event.transformer.writable, and then back into the WebRTC pipeline).
```js
addEventListener("rtctransform", (event) => {
  const transform = new TransformStream({
    async transform(encodedFrame, controller) {
      // Reconstruct the original frame.
      const view = new DataView(encodedFrame.data);

      // Construct a new buffer
      const newData = new ArrayBuffer(encodedFrame.data.byteLength);
      const newView = new DataView(newData);

      // Encrypt frame bytes using the encryptFunction() method (not shown)
      for (let i = 0; i < encodedFrame.data.byteLength; ++i) {
        const encryptedByte = encryptFunction(~view.getInt8(i));
        newView.setInt8(i, encryptedByte);
      }
      encodedFrame.data = newData;
      controller.enqueue(encodedFrame);
    },
  });
  event.transformer.readable
    .pipeThrough(transform)
    .pipeTo(event.transformer.writable);
});
```
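The byte-rewriting loop in the handler above can be exercised outside the WebRTC pipeline. The following standalone sketch substitutes a plain ArrayBuffer for the frame's data and a trivial XOR for the fictional encryptFunction(); transformBytes() is a hypothetical helper, not part of the API. Because XOR is its own inverse, applying the transform twice restores the original bytes:

```js
// Standalone sketch of the buffer-copying loop from the example above,
// with a plain ArrayBuffer standing in for encodedFrame.data and a
// trivial XOR "cipher" standing in for encryptFunction() (assumptions).
function transformBytes(buffer, key = 0x55) {
  const view = new DataView(buffer);
  const out = new ArrayBuffer(buffer.byteLength);
  const outView = new DataView(out);
  for (let i = 0; i < buffer.byteLength; ++i) {
    // Rewrite each byte into the new buffer, as the transform() does.
    outView.setUint8(i, view.getUint8(i) ^ key);
  }
  return out;
}

const data = new Uint8Array([1, 2, 3]).buffer;
const scrambled = transformBytes(data);
const restored = transformBytes(scrambled);
// restored now holds the original bytes [1, 2, 3] again
console.log(new Uint8Array(restored));
```

A real transform would replace the XOR with actual encryption, but the buffer-handling pattern (read via one DataView, write into a freshly allocated ArrayBuffer, then reassign it) is the same.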
Note that more complete examples are provided in Using WebRTC Encoded Transforms.
Specifications
Specification: WebRTC Encoded Transform (#rtcencodedaudioframe)