BaseAudioContext: createBufferSource() method

The createBufferSource() method of the BaseAudioContext interface creates a new AudioBufferSourceNode, which can be used to play audio data contained in an AudioBuffer object. An AudioBuffer is created with BaseAudioContext.createBuffer, or is returned by BaseAudioContext.decodeAudioData when it successfully decodes an audio track.
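The decodeAudioData path mentioned above can be sketched as a fetch-decode-play flow. The function name and the url parameter are illustrative, not part of any API; the code is wrapped in a function because it only runs in a browser:

```js
// Hypothetical sketch: fetch an audio file, decode it into an AudioBuffer,
// and play it through an AudioBufferSourceNode.
async function playTrack(audioCtx, url) {
  const response = await fetch(url);
  const encodedData = await response.arrayBuffer();
  // decodeAudioData resolves with a decoded AudioBuffer
  const audioBuffer = await audioCtx.decodeAudioData(encodedData);
  const source = audioCtx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioCtx.destination);
  source.start();
  return source;
}
```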
Note: The AudioBufferSourceNode() constructor is the recommended way to create an AudioBufferSourceNode; see Creating an AudioNode.
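The constructor form the note recommends can be sketched as follows. The playBuffer helper is our own illustrative name, and buffer stands for any AudioBuffer you have already created or decoded:

```js
// Hypothetical helper: play an existing AudioBuffer using the
// AudioBufferSourceNode() constructor instead of createBufferSource().
function playBuffer(audioCtx, buffer) {
  // The buffer is passed in the options dictionary, replacing the two-step
  // createBufferSource() call plus source.buffer assignment.
  const source = new AudioBufferSourceNode(audioCtx, { buffer });
  source.connect(audioCtx.destination);
  source.start();
  return source;
}
```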
Syntax

```js
createBufferSource()
```
Parameters

None.

Return value

An AudioBufferSourceNode.
Examples

In this example, we create a two-second buffer, fill it with white noise, and then play it via an AudioBufferSourceNode. The comments should clearly explain what is going on.
```js
const audioCtx = new AudioContext();
const button = document.querySelector("button");
const pre = document.querySelector("pre");
const myScript = document.querySelector("script");

pre.textContent = myScript.textContent;

// Stereo
const channels = 2;

// Create an empty two second stereo buffer at the
// sample rate of the AudioContext
const frameCount = audioCtx.sampleRate * 2.0;
const myArrayBuffer = audioCtx.createBuffer(
  channels,
  frameCount,
  audioCtx.sampleRate,
);

button.onclick = () => {
  // Fill the buffer with white noise;
  // just random values between -1.0 and 1.0
  for (let channel = 0; channel < channels; channel++) {
    // This gives us the actual Float32Array that contains the data
    const nowBuffering = myArrayBuffer.getChannelData(channel);
    for (let i = 0; i < frameCount; i++) {
      // Math.random() is in [0; 1.0]
      // audio needs to be in [-1.0; 1.0]
      nowBuffering[i] = Math.random() * 2 - 1;
    }
  }

  // Get an AudioBufferSourceNode.
  // This is the AudioNode to use when we want to play an AudioBuffer
  const source = audioCtx.createBufferSource();

  // Set the buffer in the AudioBufferSourceNode
  source.buffer = myArrayBuffer;

  // Connect the AudioBufferSourceNode to the
  // destination so we can hear the sound
  source.connect(audioCtx.destination);

  // Start the source playing
  source.start();
};
```
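The noise-filling loop in the example is plain arithmetic, so it can be factored into a standalone helper (the name fillWithNoise is ours, not part of the example) and sanity-checked even outside a browser:

```js
// Hypothetical helper: fill a channel's Float32Array with white noise,
// using the same math as the inner loop of the example above.
function fillWithNoise(channelData) {
  for (let i = 0; i < channelData.length; i++) {
    // Map Math.random()'s [0, 1) range onto the audio range [-1, 1)
    channelData[i] = Math.random() * 2 - 1;
  }
  return channelData;
}
```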
Specifications

| Specification |
| --- |
| Web Audio API # dom-baseaudiocontext-createbuffersource |
Browser compatibility