The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode.
Documentation AudioContext by Mozilla Contributors, licensed under CC-BY-SA 2.5.
Constructor
Methods
Closes the audio context, releasing any system audio resources that it uses.
Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement. This can be used to play and manipulate audio from video or audio elements.
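A minimal TypeScript sketch of this call; the element id "player" and the helper name routeAudioElement are illustrative assumptions, not part of the API:

```typescript
// Sketch: route an existing <audio> element through the Web Audio graph.
// Assumes an element like <audio id="player" src="..."> exists in the page.
function routeAudioElement(ctx: AudioContext): MediaElementAudioSourceNode {
  const el = document.getElementById("player") as HTMLAudioElement;
  const source = ctx.createMediaElementSource(el);
  source.connect(ctx.destination); // without a connection, the element plays silently
  return source;
}
```

Once the source node exists, any further processing nodes can be inserted between it and the destination.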
Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
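The suspend/resume/close lifecycle can be sketched as follows; the helper names are assumptions, while suspend(), resume(), and close() are the standard promise-returning methods:

```typescript
// Sketch of the context lifecycle: suspend to save power, resume on user
// interaction, close when the graph is no longer needed.
async function pausePlayback(ctx: AudioContext): Promise<void> {
  if (ctx.state === "running") await ctx.suspend(); // halts audio hardware access
}

async function resumePlayback(ctx: AudioContext): Promise<void> {
  if (ctx.state === "suspended") await ctx.resume();
}

async function teardown(ctx: AudioContext): Promise<void> {
  await ctx.close(); // releases system audio resources; the context is unusable afterwards
}
```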
Inherited Variables
Returns a double representing an ever-increasing hardware time in seconds used for scheduling. It starts at 0.
Returns an AudioDestinationNode representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.
Returns a float representing the sample rate (in samples per second) used by all nodes in this context. The sample rate of an AudioContext cannot be changed.
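These three members work together when scheduling: events are timed in seconds against currentTime, and output flows to destination. A sketch, where beatTimes is a hypothetical pure helper (not part of the API):

```typescript
// beatTimes is pure arithmetic and can be tested without a browser.
function beatTimes(start: number, bpm: number, beats: number): number[] {
  const interval = 60 / bpm; // seconds per beat
  return Array.from({ length: beats }, (_, i) => start + i * interval);
}

// Sketch: schedule four short clicks against the context's hardware clock.
function scheduleClicks(ctx: AudioContext, bpm: number): void {
  for (const t of beatTimes(ctx.currentTime, bpm, 4)) {
    const osc = ctx.createOscillator();
    osc.connect(ctx.destination);
    osc.start(t);        // times are seconds on the currentTime clock
    osc.stop(t + 0.05);
  }
}
```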
Inherited Methods
Creates an AnalyserNode, which can be used to expose audio time and frequency data and, for example, to create data visualisations.
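A sketch of pulling time-domain samples for a visualisation; the helper name attachAnalyser is an assumption:

```typescript
// Sketch: sample the waveform from an AnalyserNode on each animation frame.
function attachAnalyser(ctx: AudioContext, source: AudioNode): AnalyserNode {
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;                 // number of time-domain samples per read
  source.connect(analyser);
  const data = new Uint8Array(analyser.fftSize);
  function draw(): void {
    analyser.getByteTimeDomainData(data);  // 128 = silence, 0/255 = extremes
    // ...render `data` to a canvas here...
    requestAnimationFrame(draw);
  }
  draw();
  return analyser;
}
```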
Creates a BiquadFilterNode, which represents a second-order filter configurable as several different common filter types: high-pass, low-pass, band-pass, etc.
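For example, a low-pass configuration might be inserted between a source and the destination; lowPass is an illustrative helper name:

```typescript
// Sketch: a low-pass biquad between a source and the destination.
function lowPass(ctx: AudioContext, source: AudioNode, cutoffHz: number): BiquadFilterNode {
  const filter = ctx.createBiquadFilter();
  filter.type = "lowpass";
  filter.frequency.value = cutoffHz; // frequencies above this are attenuated
  source.connect(filter);
  filter.connect(ctx.destination);
  return filter;
}
```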
Creates a ChannelMergerNode, which is used to combine channels from multiple audio streams into a single audio stream.
Creates a ChannelSplitterNode, which is used to access the individual channels of an audio stream and process them separately.
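A splitter and a merger are often used together. This sketch (helper name assumed) swaps the left and right channels of a stereo stream:

```typescript
// Sketch: split a stereo stream, cross the channels, and merge them back.
function swapStereoChannels(ctx: AudioContext, source: AudioNode): ChannelMergerNode {
  const splitter = ctx.createChannelSplitter(2);
  const merger = ctx.createChannelMerger(2);
  source.connect(splitter);
  splitter.connect(merger, 0, 1); // left output -> right input
  splitter.connect(merger, 1, 0); // right output -> left input
  merger.connect(ctx.destination);
  return merger;
}
```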
Creates a ConstantSourceNode object, which is an audio source that continuously outputs a monaural (one-channel) sound signal whose samples all have the same value.
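One common use is as a shared control signal driving several parameters at once; sharedLevel is an illustrative helper name:

```typescript
// Sketch: a ConstantSourceNode driving several gain parameters together.
function sharedLevel(ctx: AudioContext, gains: GainNode[], level: number): ConstantSourceNode {
  const control = ctx.createConstantSource();
  control.offset.value = level;  // every output sample has this value
  for (const g of gains) {
    g.gain.value = 0;            // the control signal supplies the level instead
    control.connect(g.gain);     // connect to the AudioParam, not the node
  }
  control.start();
  return control;
}
```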
Creates a ConvolverNode, which can be used to apply convolution effects to your audio graph, for example a reverberation effect.
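A reverb sketch; the helper name is an assumption, and the impulse-response buffer must be obtained separately (e.g. by decoding a recorded impulse):

```typescript
// Sketch: reverb via convolution with an impulse-response AudioBuffer.
function addReverb(ctx: AudioContext, source: AudioNode, impulse: AudioBuffer): ConvolverNode {
  const convolver = ctx.createConvolver();
  convolver.buffer = impulse; // the impulse response characterises the space
  source.connect(convolver);
  convolver.connect(ctx.destination);
  return convolver;
}
```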
@:value({ maxDelayTime : 1.0 }) createDelay(maxDelayTime:Float = 1.0):DelayNode
Creates a DelayNode, which is used to delay the incoming audio signal by a certain amount. This node is also useful to create feedback loops in a Web Audio API graph.
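The feedback-loop use mentioned above can be sketched as a simple echo; helper name and the 0.4 feedback gain are illustrative choices:

```typescript
// Sketch: a feedback echo built from a DelayNode and a GainNode.
// Feedback gain must stay below 1.0 or the loop grows without bound.
function feedbackEcho(ctx: AudioContext, source: AudioNode, delaySeconds = 0.3): DelayNode {
  const delay = ctx.createDelay(1.0);  // maxDelayTime of 1 second
  delay.delayTime.value = delaySeconds;
  const feedback = ctx.createGain();
  feedback.gain.value = 0.4;           // each repeat is 40% as loud as the last
  source.connect(delay);
  delay.connect(feedback);
  feedback.connect(delay);             // the loop that creates the repeats
  delay.connect(ctx.destination);
  return delay;
}
```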
Creates a GainNode, which can be used to control the overall volume of the audio graph.
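A master-volume sketch; dbToGain is a hypothetical pure helper converting decibels to the linear gain value the node expects:

```typescript
// dbToGain is pure and testable: -6 dB is roughly half amplitude, 0 dB = 1.0.
function dbToGain(db: number): number {
  return Math.pow(10, db / 20);
}

// Sketch: a master volume control at the end of the graph.
function masterVolume(ctx: AudioContext, source: AudioNode, db: number): GainNode {
  const gain = ctx.createGain();
  gain.gain.value = dbToGain(db);
  source.connect(gain);
  gain.connect(ctx.destination);
  return gain;
}
```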
Creates an IIRFilterNode, which represents a general infinite impulse response (IIR) filter that can be configured to model various common filter types.
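The filter is defined by feedforward and feedback coefficient arrays. A sketch of a one-pole low-pass; the specific coefficients are illustrative assumptions:

```typescript
// Sketch: a one-pole low-pass as an IIR filter.
// Difference equation: y[n] = 0.2 * x[n] + 0.8 * y[n-1]
function onePoleLowPass(ctx: AudioContext, source: AudioNode): IIRFilterNode {
  const filter = ctx.createIIRFilter([0.2], [1, -0.8]); // (feedforward, feedback)
  source.connect(filter);
  filter.connect(ctx.destination);
  return filter;
}
```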
Creates an OscillatorNode, a source representing a periodic waveform. It basically generates a tone.
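A tone-playing sketch; midiToFrequency is a hypothetical pure helper using the standard equal-temperament formula:

```typescript
// midiToFrequency is pure: A4 (MIDI note 69) maps to 440 Hz.
function midiToFrequency(note: number): number {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Sketch: play a short sine tone for a given MIDI note number.
function playNote(ctx: AudioContext, note: number, seconds = 0.5): void {
  const osc = ctx.createOscillator();
  osc.type = "sine";
  osc.frequency.value = midiToFrequency(note);
  osc.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + seconds);
}
```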
Creates a PannerNode, which is used to spatialise an incoming audio stream in 3D space.
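A positioning sketch; the helper name and the choice of the HRTF panning model are assumptions:

```typescript
// Sketch: position a source in 3D space relative to the listener.
function placeSource(ctx: AudioContext, source: AudioNode, x: number, y: number, z: number): PannerNode {
  const panner = ctx.createPanner();
  panner.panningModel = "HRTF"; // head-related transfer function model
  panner.positionX.value = x;
  panner.positionY.value = y;
  panner.positionZ.value = z;
  source.connect(panner);
  panner.connect(ctx.destination);
  return panner;
}
```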
Creates a StereoPannerNode, which can be used to apply stereo panning to an audio source.
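A panning sketch; clampPan is a hypothetical pure helper keeping the value within the node's valid -1 to +1 range:

```typescript
// clampPan is pure: the pan parameter ranges from -1 (left) to +1 (right).
function clampPan(p: number): number {
  return Math.max(-1, Math.min(1, p));
}

// Sketch: pan a source between the left and right speakers.
function panTo(ctx: AudioContext, source: AudioNode, pan: number): StereoPannerNode {
  const panner = ctx.createStereoPanner();
  panner.pan.value = clampPan(pan);
  source.connect(panner);
  panner.connect(ctx.destination);
  return panner;
}
```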
Creates a WaveShaperNode, which is used to implement non-linear distortion effects.
Asynchronously decodes audio file data contained in an ArrayBuffer. In this case, the ArrayBuffer is usually loaded from an XMLHttpRequest's response attribute after setting the responseType to arraybuffer. This method only works on complete files, not fragments of audio files.
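A sketch using the fetch API (an equally valid way to obtain the ArrayBuffer); the helper name loadSample is an assumption:

```typescript
// Sketch: fetch a complete audio file and decode it into an AudioBuffer.
async function loadSample(ctx: AudioContext, url: string): Promise<AudioBuffer> {
  const response = await fetch(url);
  const encoded = await response.arrayBuffer(); // the whole file, not a fragment
  return ctx.decodeAudioData(encoded);
}
```

The resulting AudioBuffer can then be played through an AudioBufferSourceNode.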
Resumes the progression of time in an audio context that has previously been suspended/paused.
Registers an event handler of a specific event type on the EventTarget.
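For example, the context fires a statechange event whenever it is suspended, resumed, or closed; the helper name watchState is an assumption:

```typescript
// Sketch: observe lifecycle transitions on the context (an EventTarget).
function watchState(ctx: AudioContext): void {
  ctx.addEventListener("statechange", () => {
    console.log(`AudioContext state is now: ${ctx.state}`);
  });
}
```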