Xuggler audio playback not continuous

I have a job where I have to take continuous screenshots and capture sound from the desktop, then publish them as a live video stream. I am using Wowza Media Server 3.0.3 to publish the streams, and Xuggler to generate the image frames and wrap them, together with the sound buffers, into packets. I have the following problem:

When I run my program, image frames and sound packets are published; the Wowza console confirms that packets are arriving. When I open the stream in a media player (VLC in this case), the video part works like a charm: I see the frames captured from my desktop continuously. The audio part, however, is very bad. When I start playing the live stream, VLC buffers roughly 3 seconds of the audio recorded from my desktop and plays it back at a faster speed; after a longer pause it buffers again and plays the next part. In my code I continuously hand over IBuffers of audio to be encoded as MP3 and published as packets, so I can't understand why the sound does not play continuously the way the image frames do.

Does anyone have an answer, or experience with this kind of problem?

I made a stripped-down copy of my code that transmits only the desktop sound, not the images. This is the snippet where I capture the sound and hand it off for encoding and publishing:

 while (true) {
     byte[] buffer = new byte[line.available()];
     int count = line.read(buffer, 0, buffer.length);
     IBuffer iBuf = IBuffer.make(null, buffer, 0, count);
     // Write the audio frame into the stream
     _AudioWriter.encodeFrameToStream(iBuf, buffer, firstTimeStamp);
     try {
         Thread.sleep(100);
     } catch (InterruptedException e) {
         e.printStackTrace();
     }
 }
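As an aside on this loop: line.available() can return anything from 0 to a large backlog, so each chunk covers a different amount of audio time. One way to keep chunk durations uniform is to read a fixed byte count computed from the audio format. A minimal sketch of that arithmetic (plain Java; the 44,100 Hz mono 16-bit format is assumed from the code below, and the method name is illustrative):

```java
public class ChunkSize {
    // Bytes needed to hold chunkMs milliseconds of PCM audio
    // at the given sample rate, channel count, and bytes per sample.
    static int bytesForMillis(int sampleRate, int channels, int bytesPerSample, int chunkMs) {
        return sampleRate * channels * bytesPerSample * chunkMs / 1000;
    }

    public static void main(String[] args) {
        // 100 ms of 44,100 Hz mono 16-bit audio:
        System.out.println(bytesForMillis(44100, 1, 2, 100)); // 8820
    }
}
```

Reading exactly that many bytes per iteration (blocking until full) would make every chunk worth 100 ms of audio, matching the Thread.sleep(100) cadence.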

This is the part where I take the IBuffer, encode it to MP3, and then publish it as a packet:

 public void encodeFrameToStream(IBuffer ibuffer, byte[] buffer, long firstTimeStamp) {
     long now = System.currentTimeMillis();
     long timeStamp = now - firstTimeStamp;
     IAudioSamples outChunk = IAudioSamples.make(ibuffer, 1, IAudioSamples.Format.FMT_S16);
     if (outChunk == null) {
         return;
     }
     long numSample = buffer.length / outChunk.getSampleSize();
     outChunk.setComplete(true, numSample, 44100, 1, Format.FMT_S16, timeStamp);

     IPacket packet2 = IPacket.make();
     packet2.setStreamIndex(0);
     getCoder2().encodeAudio(packet2, outChunk, 0);
     outChunk.delete();
     if (packet2.isComplete()) {
         getContainer().writePacket(packet2);
     }
 }
1 answer

You will have to debug a little more to pin down all the factors.

  • Usually, when an audio stream plays back at the wrong speed, it means the sample rates of the input and the output do not match. You are currently hard-coding the sample format to FMT_S16 and the sample rate to 44,100 Hz. That works fine only if the input is already in exactly that format.

    You may want to ensure the packets have the correct number of channels, sample format, and sample rate by putting an IAudioResampler between input and output. Use the getChannels() and getSampleRate() methods of your IMediaWriter / IStreamCoder to configure the IAudioResampler's input side.
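To see why a rate mismatch produces exactly the "plays at a faster speed" symptom: if samples captured at one rate are stamped with a higher rate, the player consumes them proportionally faster. A minimal sketch of the arithmetic (plain Java; the 22,050 Hz capture rate is a hypothetical figure, not taken from the question):

```java
public class RateMismatch {
    // Playback-speed factor a player perceives when samples captured at
    // actualRate are labeled as declaredRate in the stream.
    static double playbackSpeedFactor(int declaredRate, int actualRate) {
        return (double) declaredRate / actualRate;
    }

    public static void main(String[] args) {
        // Hypothetical: the capture line delivers 22,050 Hz audio,
        // but setComplete() stamps it as 44,100 Hz.
        System.out.println(playbackSpeedFactor(44100, 22050)); // 2.0 -> plays twice too fast
    }
}
```

A factor of 2.0 also means 3 seconds of buffered audio is spent in 1.5 seconds, which would explain the play-fast-then-rebuffer pattern described in the question.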

  • I am not familiar with Wowza Media Server, but it seems to be doing some kind of transcoding. I can't tell from your code, but it looks like you are writing directly to Wowza rather than to a file container. Try writing the output to a file and see whether you can play it. That way you can verify that the audio/video data is encoded correctly.

    If the file plays fine, the problem probably lies with Wowza. Check whether it imposes any specific restrictions on codec, sample format, sample size, channels, sample rate, or bit rate.

    If the output file does not play correctly, try recording only the audio stream and leaving the video out. If the audio-only file is fine, then the problem is in how the packets are assembled from the audio and video data.

  • Finally, could you try printing the timestamp of each video frame and audio sample as you write them? That way you can verify that all packets are in chronological order. If at some point the packets in the file are out of order, the file cannot be streamed and played back correctly.

For example, this is a stream with distorted packet ordering:

 0ms  video frame 1
 0ms  audio sample 1
 10ms video frame 2
 8ms  audio sample 2
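On the timestamp point: in the question's code, timeStamp comes from System.currentTimeMillis(), so it reflects wall-clock scheduling jitter rather than how much audio has actually been captured, and can drift out of step with the sample data. A minimal sketch (plain Java, illustrative class and method names; not part of the Xuggler API) of deriving each chunk's timestamp from the running sample count instead:

```java
public class SampleClock {
    private long totalSamples = 0;
    private final int sampleRate;

    SampleClock(int sampleRate) {
        this.sampleRate = sampleRate;
    }

    // Timestamp (ms) derived from the amount of audio already written,
    // not from the wall clock -- monotonic and free of scheduling drift.
    long nextTimestampMs(int samplesInChunk) {
        long ts = totalSamples * 1000 / sampleRate;
        totalSamples += samplesInChunk;
        return ts;
    }

    public static void main(String[] args) {
        SampleClock clock = new SampleClock(44100);
        // Three chunks of 4410 samples each = 100 ms of audio apiece.
        System.out.println(clock.nextTimestampMs(4410)); // 0
        System.out.println(clock.nextTimestampMs(4410)); // 100
        System.out.println(clock.nextTimestampMs(4410)); // 200
    }
}
```

With sample-derived timestamps, consecutive audio packets are guaranteed to be evenly spaced and in order, which is exactly what the logging check above is meant to verify.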

Source: https://habr.com/ru/post/1443702/

