package js.html.audio

AnalyserNode

The AnalyserNode interface represents a node able to provide real-time frequency and time-domain analysis information. It is an AudioNode that passes the audio stream unchanged from the input to the output, but allows you to take the generated data, process it, and create audio visualizations.
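
For illustration, a minimal Haxe sketch of the usual pattern, assuming the standard js.html.audio externs bundled with Haxe 4 (class and variable names are placeholders): route a source through the analyser and read back one frame of frequency data.

import js.html.audio.AudioContext;
import js.lib.Uint8Array;

class AnalyserExample {
    static function main() {
        var ctx = new AudioContext();
        var analyser = ctx.createAnalyser();
        analyser.fftSize = 256; // 128 frequency bins

        // The analyser passes audio through unchanged, so it can sit
        // between any source and the destination.
        var osc = ctx.createOscillator();
        osc.connect(analyser);
        analyser.connect(ctx.destination);
        osc.start();

        // Copy the current frequency data, e.g. once per animation frame.
        var bins = new Uint8Array(analyser.frequencyBinCount);
        analyser.getByteFrequencyData(bins);
    }
}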

AnalyserOptions

AudioBuffer

Objects of these types are designed to hold small audio snippets, typically less than 45 s. For longer sounds, objects implementing MediaElementAudioSourceNode are more suitable. The buffer contains data in the following format: non-interleaved IEEE 754 32-bit linear PCM with a nominal range between -1 and +1, that is, a 32-bit floating-point buffer with each sample between -1.0 and 1.0. If the AudioBuffer has multiple channels, they are stored in separate buffers.
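
As a sketch (again assuming the standard Haxe 4 externs), a one-second mono buffer can be created and filled sample by sample; the values written must stay in the nominal -1.0 to 1.0 range described above.

import js.html.audio.AudioContext;

class BufferExample {
    static function main() {
        var ctx = new AudioContext();
        // 1 channel, 1 second of frames, at the context's sample rate.
        var frames = Std.int(ctx.sampleRate);
        var buffer = ctx.createBuffer(1, frames, ctx.sampleRate);

        // Each channel is a separate, non-interleaved Float32 buffer.
        var data = buffer.getChannelData(0);
        for (i in 0...frames) {
            data[i] = Math.sin(2 * Math.PI * 440 * i / ctx.sampleRate);
        }
    }
}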

AudioBufferOptions

AudioBufferSourceNode

The AudioBufferSourceNode interface is an AudioScheduledSourceNode which represents an audio source consisting of in-memory audio data, stored in an AudioBuffer. It's especially useful for playing back audio which has particularly stringent timing accuracy requirements, such as for sounds that must match a specific rhythm and can be kept in memory rather than being played from disk or the network.
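
A hedged Haxe sketch of playback, assuming a buffer prepared as in the AudioBuffer example above: a buffer source is one-shot, so a fresh node is created for each playback.

import js.html.audio.AudioBuffer;
import js.html.audio.AudioContext;

class PlayBuffer {
    // ctx and buffer are assumed to exist already (see AudioBuffer above).
    static function play(ctx:AudioContext, buffer:AudioBuffer):Void {
        var source = ctx.createBufferSource();
        source.buffer = buffer;
        source.connect(ctx.destination);
        source.start(); // or source.start(ctx.currentTime + someDelay)
    }
}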

AudioBufferSourceOptions

AudioContext

The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode.
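
As an illustrative Haxe sketch (standard externs assumed), a minimal graph connects a source node through a processing node to the context's destination:

import js.html.audio.AudioContext;

class GraphExample {
    static function main() {
        var ctx = new AudioContext();

        // source -> gain -> destination
        var osc = ctx.createOscillator();
        var gain = ctx.createGain();
        gain.gain.value = 0.25;

        osc.connect(gain);
        gain.connect(ctx.destination);
        osc.start();
    }
}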

AudioContextOptions

The AudioContextOptions dictionary is used to specify configuration options when constructing a new AudioContext object to represent a graph of web audio nodes.

AudioContextState

AudioDestinationNode

AudioDestinationNode has no output (as it is the output, no more AudioNode can be linked after it in the audio graph) and one input. The number of channels in the input must be between 0 and the maxChannelCount value or an exception is raised.

AudioListener

It is important to note that there is only one listener per context and that it isn't an AudioNode.

AudioNode

The AudioNode interface is a generic interface for representing an audio processing module. Examples include an audio source (such as an OscillatorNode or an HTML audio element), the audio destination, an intermediate processing module (such as a BiquadFilterNode or ConvolverNode), or a volume control (such as a GainNode).

AudioNodeOptions

The AudioNodeOptions dictionary of the Web Audio API specifies options that can be used when creating new AudioNode objects.

AudioParam

The Web Audio API's AudioParam interface represents an audio-related parameter, usually a parameter of an AudioNode (such as GainNode.gain).

AudioProcessingEvent

The Web Audio API AudioProcessingEvent represents events that occur when a ScriptProcessorNode input buffer is ready to be processed.

AudioScheduledSourceNode

The AudioScheduledSourceNode interface—part of the Web Audio API—is a parent interface for several types of audio source node interfaces which share the ability to be started and stopped, optionally at specified times. Specifically, this interface defines the start() and stop() methods, as well as the onended event handler.
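
For example (a sketch against the standard Haxe externs), any scheduled source can be told to begin and end relative to the context's currentTime:

import js.html.audio.AudioContext;

class ScheduleExample {
    static function main() {
        var ctx = new AudioContext();
        var osc = ctx.createOscillator(); // an AudioScheduledSourceNode
        osc.connect(ctx.destination);

        var now = ctx.currentTime;
        osc.start(now + 0.5); // begin in half a second
        osc.stop(now + 1.5);  // play for one second, then the ended event fires
    }
}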

AudioWorkletGlobalScope

AudioWorkletNodeOptions

AudioWorkletProcessor

BaseAudioContext

The BaseAudioContext interface acts as a base definition for online and offline audio-processing graphs, as represented by AudioContext and OfflineAudioContext respectively.

BiquadFilterNode

The BiquadFilterNode interface represents a simple low-order filter, and is created using the AudioContext.createBiquadFilter() method. It is an AudioNode that can represent different kinds of filters, tone control devices, and graphic equalizers.
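
A hedged Haxe sketch of one common configuration, a low-pass filter inserted between a source and the destination (the parameter values are arbitrary examples):

import js.html.audio.AudioContext;
import js.html.audio.BiquadFilterType;

class FilterExample {
    static function main() {
        var ctx = new AudioContext();
        var osc = ctx.createOscillator();
        var filter = ctx.createBiquadFilter();

        filter.type = BiquadFilterType.LOWPASS;
        filter.frequency.value = 800; // cutoff in Hz
        filter.Q.value = 1.0;

        osc.connect(filter);
        filter.connect(ctx.destination);
        osc.start();
    }
}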

BiquadFilterOptions

BiquadFilterType

ChannelCountMode

ChannelInterpretation

ChannelMergerNode

ChannelMergerOptions

ChannelSplitterNode

ChannelSplitterOptions

ConstantSourceNode

The ConstantSourceNode interface—part of the Web Audio API—represents an audio source (based upon AudioScheduledSourceNode) whose output is a single, unchanging value. This makes it useful for cases in which you need a constant value coming in from an audio source. In addition, it can be used like a constructible AudioParam by automating the value of its offset or by connecting another node to it; see Controlling multiple parameters with ConstantSourceNode.
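
A sketch of that pattern in Haxe (assuming the standard externs, and that AudioNode.connect accepts an AudioParam destination as in the Web Audio spec): one ConstantSourceNode drives the gain of two branches so both can be changed through a single offset value.

import js.html.audio.AudioContext;

class ConstantExample {
    static function main() {
        var ctx = new AudioContext();
        var gainA = ctx.createGain();
        var gainB = ctx.createGain();

        // The connected control signal sums with each param's own value,
        // so zero the intrinsic values first.
        gainA.gain.value = 0;
        gainB.gain.value = 0;

        var control = ctx.createConstantSource();
        control.offset.value = 0.5;
        control.connect(gainA.gain); // drive both AudioParams...
        control.connect(gainB.gain);
        control.start();

        // ...so one assignment now updates both branches.
        control.offset.value = 0.8;
    }
}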

ConstantSourceOptions

ConvolverNode

The ConvolverNode interface is an AudioNode that performs a linear convolution on a given AudioBuffer, often used to achieve a reverb effect. A ConvolverNode always has exactly one input and one output.

ConvolverOptions

DelayNode

The DelayNode interface represents a delay-line; an AudioNode audio-processing module that causes a delay between the arrival of the input data and its propagation to the output.
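
For instance, a hedged Haxe sketch of a fixed half-second delay line placed between a source and the destination:

import js.html.audio.AudioContext;

class DelayExample {
    static function main() {
        var ctx = new AudioContext();
        var osc = ctx.createOscillator();
        var delay = ctx.createDelay(1.0); // maximum delay time in seconds

        delay.delayTime.value = 0.5;

        osc.connect(delay);
        delay.connect(ctx.destination);
        osc.start();
    }
}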

DelayOptions

DistanceModelType

DynamicsCompressorNode

The DynamicsCompressorNode interface provides a compression effect, which lowers the volume of the loudest parts of the signal in order to help prevent clipping and distortion when multiple sounds are played and mixed together. It inherits properties from its parent, AudioNode.

DynamicsCompressorOptions

GainNode

The GainNode interface represents a change in volume. It is an AudioNode audio-processing module that causes a given gain to be applied to the input data before its propagation to the output. A GainNode always has exactly one input and one output, both with the same number of channels.
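
A small Haxe sketch (standard externs assumed) of the usual fade pattern, scheduling the gain AudioParam rather than setting it abruptly:

import js.html.audio.AudioContext;

class FadeExample {
    static function main() {
        var ctx = new AudioContext();
        var osc = ctx.createOscillator();
        var gain = ctx.createGain();

        osc.connect(gain);
        gain.connect(ctx.destination);
        osc.start();

        // Fade in from silence to full volume over two seconds.
        var now = ctx.currentTime;
        gain.gain.setValueAtTime(0, now);
        gain.gain.linearRampToValueAtTime(1.0, now + 2.0);
    }
}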

GainOptions

IIRFilterNode

The IIRFilterNode interface of the Web Audio API is an AudioNode processor which implements a general infinite impulse response (IIR) filter; this type of filter can be used to implement tone control devices and graphic equalizers as well. It lets the parameters of the filter response be specified, so that it can be tuned as needed.

IIRFilterOptions

MediaElementAudioSourceNode

A MediaElementAudioSourceNode has no inputs and exactly one output, and is created using the AudioContext.createMediaElementSource method. The number of channels in the output equals the number of channels of the audio referenced by the HTMLMediaElement used in the creation of the node, or is 1 if the HTMLMediaElement has no audio.

MediaElementAudioSourceOptions

MediaStreamAudioDestinationNode

The MediaStreamAudioDestinationNode interface represents an audio destination consisting of a WebRTC MediaStream with a single AudioMediaStreamTrack, which can be used in a similar way to a MediaStream obtained from getUserMedia(). It inherits properties from its parent, AudioNode.

MediaStreamAudioSourceNode

A MediaStreamAudioSourceNode has no inputs and exactly one output, and is created using the AudioContext.createMediaStreamSource method. The number of channels in the output equals the number of channels in the AudioMediaStreamTrack. If there is no valid media stream, the output is a single channel of silence.

MediaStreamAudioSourceOptions

OfflineAudioCompletionEvent

The Web Audio API OfflineAudioCompletionEvent interface represents events that occur when the processing of an OfflineAudioContext is terminated. The complete event implements this interface.

OfflineAudioContext

The OfflineAudioContext interface is an AudioContext interface representing an audio-processing graph built from AudioNodes linked together. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates the audio, as fast as it can, and outputs the result to an AudioBuffer.
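
As a sketch (assuming the standard Haxe 4 externs, where startRendering returns a js.lib.Promise), a short graph can be rendered off-line into an AudioBuffer:

import js.html.audio.OfflineAudioContext;

class OfflineExample {
    static function main() {
        // 2 channels, 1 second of frames, at 44.1 kHz.
        var offline = new OfflineAudioContext(2, 44100, 44100);

        var osc = offline.createOscillator();
        osc.connect(offline.destination);
        osc.start();

        offline.startRendering().then(function(rendered) {
            // rendered is an AudioBuffer holding the whole result.
            trace(rendered.duration);
        });
    }
}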

OfflineAudioContextOptions

OscillatorNode

The OscillatorNode interface represents a periodic waveform, such as a sine wave. It is an AudioScheduledSourceNode audio-processing module that causes a specified frequency of a given wave to be created—in effect, a constant tone.

OscillatorOptions

OscillatorType

OverSampleType

PannerNode

A PannerNode always has exactly one input and one output: the input can be mono or stereo but the output is always stereo (2 channels); you can't have panning effects without at least two audio channels!

PannerOptions

PanningModelType

PeriodicWave

PeriodicWave has no inputs or outputs; it is used to define custom oscillators when calling OscillatorNode.setPeriodicWave(). The PeriodicWave itself is created/returned by AudioContext.createPeriodicWave().

PeriodicWaveConstraints

PeriodicWaveOptions

ScriptProcessorNode

The ScriptProcessorNode interface allows the generation, processing, or analyzing of audio using JavaScript.

StereoPannerNode

The pan property takes a unitless value between -1 (full left pan) and 1 (full right pan). This interface was introduced as a much simpler way to apply a simple panning effect than having to use a full PannerNode.
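
A brief Haxe sketch (standard externs assumed) panning a source halfway to the right:

import js.html.audio.AudioContext;

class PanExample {
    static function main() {
        var ctx = new AudioContext();
        var osc = ctx.createOscillator();
        var panner = ctx.createStereoPanner();

        panner.pan.value = 0.5; // -1 = full left, 0 = centre, 1 = full right

        osc.connect(panner);
        panner.connect(ctx.destination);
        osc.start();
    }
}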

StereoPannerOptions

WaveShaperNode

A WaveShaperNode always has exactly one input and one output.

WaveShaperOptions