Sound buffering using the Web Audio API

I have a live stream coming from a desktop application over a WebSocket connection on my web page. I convert the PCM stream to a Float32Array. When I start playback immediately, I hear glitches, so I accumulate part of the stream in an array before playing it. The problem is that once the buffer has been created, I cannot add any more blocks to it.

Here I receive the data from the socket and store it in an array, waiting for 1000 chunks before starting playback.

    var stream = [];    // accumulated PCM samples
    var j = 0;          // number of chunks received so far

    sock.onmessage = function(e) {
        var obj = JSON.parse(e.data);

        stream = stream.concat(obj);

        // Start playback only once 1000 chunks have been buffered.
        if (j == 1000) {
            console.log(stream);
            playPcm(stream);
        }

        console.log(j);
        j++;
    };
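
As written, playPcm fires only once, when j is exactly 1000; chunks that arrive afterwards are concatenated into the array but never played. For illustration, a minimal sketch of one alternative (the primed flag is my own addition, not part of the original code): buffer until 1000 chunks have arrived, then hand every later chunk straight to playPcm (shown below) so it queues behind what is already playing.

    var primed = false;  // hypothetical flag: true once initial buffering is done

    sock.onmessage = function(e) {
        var obj = JSON.parse(e.data);

        if (!primed) {
            stream = stream.concat(obj);
            if (++j >= 1000) {   // enough chunks buffered, start playback
                playPcm(stream);
                primed = true;
            }
        } else {
            playPcm(obj);        // queue each new chunk behind the last one
        }
    };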

This function plays back the buffered PCM data:

    var context = new AudioContext();
    var AudioStart = 0;  // time (in seconds) at which the next chunk is scheduled

    function playPcm(data) {
        var audio = new Float32Array(data);
        var source = context.createBufferSource();
        var audioBuffer = context.createBuffer(1, audio.length, 44100);

        // Copy the samples into the buffer's single channel.
        audioBuffer.getChannelData(0).set(audio);

        source.buffer = audioBuffer;
        source.connect(context.destination);

        // Schedule this chunk to start right after the previous one.
        source.start(AudioStart);
        AudioStart += audioBuffer.duration;
    }
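
One caveat with scheduling this way: source.start(t) with t earlier than context.currentTime plays the chunk immediately, so after any underrun the AudioStart schedule drifts and chunks overlap. A minimal sketch of a drift-safe variant (scheduleBuffer is a hypothetical name, not from the original code), reusing the context and AudioStart globals above:

    // Schedule an AudioBuffer back to back with the previous one, never in the past.
    function scheduleBuffer(audioBuffer) {
        var source = context.createBufferSource();
        source.buffer = audioBuffer;
        source.connect(context.destination);

        // If we fell behind the clock (underrun), realign to "now".
        if (AudioStart < context.currentTime) {
            AudioStart = context.currentTime;
        }
        source.start(AudioStart);
        AudioStart += audioBuffer.duration;
    }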

I read about ScriptProcessorNode, but I could not figure out what to do with it. Right now I am mostly stuck, since I am not very familiar with the Web Audio API.
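
For reference, the usual ScriptProcessorNode pattern is to decouple buffering from playback: the socket handler pushes samples into a FIFO, and the node's onaudioprocess callback pulls them out at the audio rate. A minimal sketch, assuming a sampleQueue array (my name, not from the original code) that sock.onmessage fills:

    var sampleQueue = [];   // hypothetical FIFO; sock.onmessage would push samples here

    var processor = context.createScriptProcessor(4096, 1, 1);
    processor.onaudioprocess = function(e) {
        var out = e.outputBuffer.getChannelData(0);
        for (var i = 0; i < out.length; i++) {
            // Play silence on underrun instead of glitching.
            out[i] = sampleQueue.length > 0 ? sampleQueue.shift() : 0;
        }
    };
    processor.connect(context.destination);

Note that shift() on a plain array is O(n), so a ring buffer is the better fit in practice, and ScriptProcessorNode itself is deprecated in favour of AudioWorklet.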


Source: https://habr.com/ru/post/1544583/

