---
title: "AudioProcessingEvent: inputBuffer property"
slug: Web/API/AudioProcessingEvent/inputBuffer
page-type: web-api-instance-property
status:
  - deprecated
browser-compat: api.AudioProcessingEvent.inputBuffer
---
{{APIRef("Web Audio API")}}{{Deprecated_header}}
The **`inputBuffer`** read-only property of the {{domxref("AudioProcessingEvent")}} interface represents the input buffer of an audio processing event.
The input buffer is represented by an {{domxref("AudioBuffer")}} object, which contains a collection of audio channels. Each channel is an array of floating-point values representing the audio signal waveform, encoded as a series of amplitudes. The number of channels and the length of each channel are determined by the `numberOfChannels` and `length` properties of the {{domxref("AudioBuffer")}}.
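As a rough illustration of this encoding (plain JavaScript only, with no Web Audio API objects involved), a single channel's worth of samples can be modeled as a `Float32Array` of amplitudes in the range [-1.0, 1.0]. The sketch below fills one with a single sine cycle, the way a channel of an {{domxref("AudioBuffer")}} might store a pure tone; the buffer length of 8 is an arbitrary value chosen for illustration.

```js
// A very short "channel" of 8 samples, purely for illustration.
const length = 8;
const channelData = new Float32Array(length);

for (let i = 0; i < length; i++) {
  // One full sine cycle across the buffer, encoded as amplitudes
  // between -1.0 and 1.0.
  channelData[i] = Math.sin((2 * Math.PI * i) / length);
}
```

In a real `audioprocess` handler you would not build this array yourself; you would obtain it from the buffer with `getChannelData(channel)`, as the example below shows.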
## Value

An {{domxref("AudioBuffer")}} object.
## Examples

In this example, a {{domxref("ScriptProcessorNode")}} is created with a buffer size of 256 samples, 2 input channels, and 2 output channels. When an {{domxref("ScriptProcessorNode/audioprocess_event", "audioprocess")}} event is fired, the input and output buffers are retrieved from the event object. The audio data in the input buffer is processed, and the result is written to the output buffer. In this case, the audio data is scaled down by a factor of 0.5.
```js
const audioContext = new AudioContext();
const processor = audioContext.createScriptProcessor(256, 2, 2);

processor.addEventListener("audioprocess", (event) => {
  const inputBuffer = event.inputBuffer;
  const outputBuffer = event.outputBuffer;

  for (let channel = 0; channel < outputBuffer.numberOfChannels; channel++) {
    const inputData = inputBuffer.getChannelData(channel);
    const outputData = outputBuffer.getChannelData(channel);

    // Process the audio data: scale each input sample down by 0.5
    for (let i = 0; i < outputBuffer.length; i++) {
      outputData[i] = inputData[i] * 0.5;
    }
  }
});

processor.connect(audioContext.destination);
```
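The inner loop of the handler above is an element-wise gain applied sample by sample. Factored into a standalone function (a sketch using plain `Float32Array`s and a hypothetical helper name, `applyGain`, not part of any API), the same operation looks like this:

```js
// Hypothetical standalone version of the inner loop: scale every
// sample of an input channel by a constant gain factor, writing the
// result into an output channel of the same length.
function applyGain(inputData, outputData, gain) {
  for (let i = 0; i < inputData.length; i++) {
    outputData[i] = inputData[i] * gain;
  }
}

const input = Float32Array.from([0.5, -1, 0.25, 0]);
const output = new Float32Array(input.length);

applyGain(input, output, 0.5);
// output now holds 0.25, -0.5, 0.125, 0
```

Inside the real handler, `inputData` and `outputData` are the `Float32Array`s returned by `getChannelData()`, so writing into `outputData` fills the node's output buffer directly.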
## Specifications

{{Specifications}}
## Browser compatibility

{{Compat}}