Audio Data API

131 bytes added, 03:00, 26 May 2010
== API Tutorial ==
We have developed a proof of concept, experimental build of Firefox ([[#Obtaining_Code_and_Builds|builds provided below]]) which extends the HTMLMediaElement (e.g., affecting <video> and <audio>) and HTMLAudioElement, and implements the following basic API for reading and writing raw audio data:
===== Reading Audio =====
The '''LoadedMetadata''' event is a standard part of HTML5, and has been extended to provide more detailed information about the audio stream. Specifically, developers can obtain the number of channels, the sample rate, and the size of the framebuffer that will be used in AudioWritten events. This event is fired once as the media resource is first loaded, and is useful for interpreting or writing the audio data.
The '''AudioWritten''' event provides two pieces of data. The first is a framebuffer (i.e., an array) containing sample data for the current frame. The second is the time (e.g., milliseconds) for the start of this frame.
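A minimal sketch of wiring up both events is shown below. The <code>moz</code>-prefixed attribute names on the events, and the <code>rms</code> helper, are illustrative assumptions rather than a definitive listing of the experimental API:

```javascript
// Sketch: wire up an audio element for reading raw sample data.
// The moz*-prefixed event attributes below are assumptions based on
// the experimental builds described in this tutorial.
function setupAudioReader(audio) {
  var channels, sampleRate, frameBufferLength;

  // Fired once when the media loads; carries the stream's characteristics.
  audio.addEventListener("loadedmetadata", function (event) {
    channels = event.mozChannels;                   // e.g., 2 for stereo
    sampleRate = event.mozSampleRate;               // e.g., 44100
    frameBufferLength = event.mozFrameBufferLength; // samples per frame
  }, false);

  // Fired repeatedly during playback with each decoded framebuffer.
  audio.addEventListener("audiowritten", function (event) {
    var samples = event.mozFrameBuffer; // floats in [-1.0, 1.0]
    var time = event.mozTime;           // start of this frame (ms)
    console.log("RMS at " + time + " ms: " + rms(samples));
  }, false);
}

// Pure helper: root-mean-square level of one frame of samples.
function rms(samples) {
  var sum = 0;
  for (var i = 0; i < samples.length; i++) {
    sum += samples[i] * samples[i];
  }
  return Math.sqrt(sum / samples.length);
}
```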
===== Writing Audio =====
It is also possible to set up an audio element for raw writing from script (i.e., without a ''src'' attribute). Content scripts can specify the audio stream's characteristics, then write audio samples using the following methods:
<code>mozSetup(channels, sampleRate, volume)</code>
<pre>
// Write samples using a JS Array
var samples = [0.242, 0.127, 0.0, -0.058, -0.242, ...];
var numberSamplesWritten = audioOutput.mozWriteAudio(samples);
// Write samples using a Typed Array
var samples = new Float32Array([0.242, 0.127, 0.0, -0.058, -0.242, ...]);
var numberSamplesWritten = audioOutput.mozWriteAudio(samples);
</pre>
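Putting the two calls together, the following sketch generates one second of a 440 Hz sine tone and writes it to an audio element. <code>makeSineSamples</code> and <code>playTone</code> are hypothetical helpers introduced here for illustration; note that <code>mozWriteAudio</code> may accept fewer samples than offered, so a real script would buffer and retry the remainder:

```javascript
// Hypothetical helper: build `seconds` worth of a sine tone as an
// array of floats in [-1.0, 1.0].
function makeSineSamples(frequency, sampleRate, seconds) {
  var total = Math.floor(sampleRate * seconds);
  var samples = new Array(total);
  for (var i = 0; i < total; i++) {
    samples[i] = Math.sin(2 * Math.PI * frequency * (i / sampleRate));
  }
  return samples;
}

// Sketch of raw writing; assumes an audio element with no src attribute.
function playTone(audioOutput) {
  audioOutput.mozSetup(1, 44100, 1); // mono, 44100 Hz, full volume
  var samples = makeSineSamples(440, 44100, 1.0);
  // mozWriteAudio returns how many samples were actually accepted; a
  // production script would keep the rest and write it later.
  var written = audioOutput.mozWriteAudio(samples);
  return samples.length - written; // samples still pending
}
```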