{{ APIRef("Web Audio API") }}
The `createBuffer()` method of the {{ domxref("BaseAudioContext") }}
interface is used to create a new, empty {{ domxref("AudioBuffer") }} object, which
can then be populated with data and played via an {{ domxref("AudioBufferSourceNode") }}.

For more details about audio buffers, check out the {{ domxref("AudioBuffer") }} reference page.
> [!NOTE]
> `createBuffer()` used to be able to take compressed data and give back decoded samples, but this ability was removed from the specification, because all the decoding was done on the main thread, so `createBuffer()` was blocking other code execution. The asynchronous method `decodeAudioData()` does the same thing: it takes compressed audio, such as an MP3 file, and directly gives you back an {{ domxref("AudioBuffer") }} that you can then play via an {{ domxref("AudioBufferSourceNode") }}. For simple use cases like playing an MP3, `decodeAudioData()` is what you should be using.
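As a sketch of that recommended `decodeAudioData()` path (the helper name and the URL below are illustrative, not part of the API):

```js
// Hypothetical helper: fetch a compressed file, decode it with
// decodeAudioData(), and play it. The name and URL are illustrative.
async function playClip(audioCtx, url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  // Unlike the old createBuffer() path, decoding here does not block the main thread
  const audioBuffer = await audioCtx.decodeAudioData(arrayBuffer);
  const source = audioCtx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioCtx.destination);
  source.start();
  return audioBuffer;
}

// In a browser: playClip(new AudioContext(), "clip.mp3");
```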
For an in-depth explanation of how audio buffers work, including what the parameters do, read Audio buffers: frames, samples and channels from our Basic concepts guide.
## Syntax

```js-nolint
createBuffer(numOfChannels, length, sampleRate)
```

### Parameters

- `numOfChannels`
  - : An integer representing the number of channels this buffer should have.
- `length`
  - : An integer representing the size of the buffer in sample-frames (where each
    sample-frame holds one sample per channel, i.e. is the size of
    `numOfChannels`). To determine the `length` to use for a
    specific number of seconds of audio, use `numSeconds * sampleRate`.
- `sampleRate`
  - : The sample rate of the linear audio data in sample-frames per second.

### Return value

An {{domxref("AudioBuffer")}} configured based on the specified options.

### Exceptions

- `NotSupportedError` {{domxref("DOMException")}}
  - : Thrown if one or more of the options are negative or otherwise have an invalid
    value (such as `numberOfChannels` being higher than supported, or a
    `sampleRate` outside the nominal range).

## Examples

First, a couple of simple trivial examples, to help explain how the parameters are used:
```js
const audioCtx = new AudioContext();
const buffer = audioCtx.createBuffer(2, 22050, 44100);
```
If you use this call, you will get a stereo buffer (two channels) that, when played back on an `AudioContext` running at 44100 Hz (very common; most normal sound cards run at this rate), will last for 0.5 seconds: 22050 frames / 44100 Hz = 0.5 seconds.
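The duration arithmetic above can be checked in plain JavaScript, without any Web Audio objects:

```js
// A buffer's duration in seconds is its length in sample-frames
// divided by the sample rate it is played back at.
const length = 22050; // sample-frames
const sampleRate = 44100; // Hz
const durationSeconds = length / sampleRate;
console.log(durationSeconds); // 0.5
```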
```js
const audioCtx = new AudioContext();
const buffer = audioCtx.createBuffer(1, 22050, 22050);
```
If you use this call, you will get a mono buffer (one channel) that, when played back
on an `AudioContext` running at 44100 Hz, will be automatically resampled to
44100 Hz (and therefore yield 44100 frames), and will last for 1.0 second: 44100 frames /
44100 Hz = 1 second.
> [!NOTE]
> Audio resampling is very similar to image resizing: say you've got a 16 x 16 image, but you want it to fill a 32 x 32 area: you resize (resample) it. The result has less quality (it can be blurry or edgy, depending on the resizing algorithm), but it works, and the resized image takes up less space. Resampled audio is exactly the same: you save space, but in practice you will be unable to properly reproduce high-frequency content (treble sound).
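To make the resampling idea concrete, here is a deliberately naive 2x upsampler using linear interpolation. Real resamplers apply proper low-pass filtering, so treat this only as an illustration of what "yielding more frames" means:

```js
// Naive linear-interpolation upsampler: doubles the number of frames in
// `input` by inserting the midpoint between each pair of neighbouring samples.
// Illustration only; real resamplers use low-pass filtering.
function upsample2x(input) {
  const output = new Float32Array(input.length * 2);
  for (let i = 0; i < input.length; i++) {
    const next = i + 1 < input.length ? input[i + 1] : input[i];
    output[2 * i] = input[i];
    output[2 * i + 1] = (input[i] + next) / 2; // midpoint between neighbours
  }
  return output;
}

console.log(upsample2x(new Float32Array([0, 1, 0])));
// → [0, 0.5, 1, 0.5, 0, 0]
```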
Now let's look at a more complex `createBuffer()` example, in which we
create a three-second buffer, fill it with white noise, and then play it via an {{domxref("AudioBufferSourceNode")}}. The comments should clearly explain what is going on.
You can also run the code live, or view the source.
```js
const audioCtx = new AudioContext();

// Create an empty three-second stereo buffer at the sample rate of the AudioContext
const myArrayBuffer = audioCtx.createBuffer(
  2,
  audioCtx.sampleRate * 3,
  audioCtx.sampleRate,
);

// Fill the buffer with white noise;
// just random values between -1.0 and 1.0
for (let channel = 0; channel < myArrayBuffer.numberOfChannels; channel++) {
  // This gives us the actual Float32Array that contains the data
  const nowBuffering = myArrayBuffer.getChannelData(channel);
  for (let i = 0; i < myArrayBuffer.length; i++) {
    // Math.random() is in [0; 1.0]
    // audio needs to be in [-1.0; 1.0]
    nowBuffering[i] = Math.random() * 2 - 1;
  }
}

// Get an AudioBufferSourceNode.
// This is the AudioNode to use when we want to play an AudioBuffer
const source = audioCtx.createBufferSource();

// Set the buffer in the AudioBufferSourceNode
source.buffer = myArrayBuffer;

// Connect the AudioBufferSourceNode to the
// destination so we can hear the sound
source.connect(audioCtx.destination);

// Start the source playing
source.start();
```
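The same channel-filling pattern works for other waveforms too. For example, a hypothetical helper (not part of the Web Audio API) that fills a channel's `Float32Array` with a sine tone instead of noise:

```js
// Hypothetical helper: fill a Float32Array with a sine wave.
// In real use, channelData would come from audioBuffer.getChannelData(n).
function fillWithSine(channelData, frequency, sampleRate) {
  for (let i = 0; i < channelData.length; i++) {
    channelData[i] = Math.sin((2 * Math.PI * frequency * i) / sampleRate);
  }
  return channelData;
}

// An 11025 Hz tone sampled at 44100 Hz: one sample per quarter period
const samples = fillWithSine(new Float32Array(4), 11025, 44100);
console.log(samples); // ≈ [0, 1, 0, -1]
```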
## Specifications

{{Specifications}}

## Browser compatibility

{{Compat}}