iOS - How to read audio from a stream and play audio

Thanks to everyone who takes the time to read the question!

So, I created a stream using MultipeerConnectivity. I can record audio into a CMSampleBuffer and convert that buffer to UInt8 data, then send the data to the peer using:

outputStream!.write(u8ptr, maxLength: Int(buffer.mDataByteSize)) 
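For reference, a minimal, untested sketch of what my send side looks like; the function name and the assumption of a single (mono) audio buffer in the sample buffer are simplifications, not exactly my production code:

 import AVFoundation

 // Send-side sketch: copy the captured CMSampleBuffer's audio data into
 // an AudioBufferList, then write the raw bytes to the peer's stream.
 func send(sampleBuffer: CMSampleBuffer, over outputStream: OutputStream) {
     var audioBufferList = AudioBufferList()
     var blockBuffer: CMBlockBuffer?

     CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
         sampleBuffer,
         bufferListSizeNeededOut: nil,
         bufferListOut: &audioBufferList,
         bufferListSize: MemoryLayout<AudioBufferList>.size,
         blockBufferAllocator: nil,
         blockBufferMemoryAllocator: nil,
         flags: 0,
         blockBufferOut: &blockBuffer)

     // Assumes mono: mBuffers exposes the first (and here, only) buffer.
     let audioBuffer = audioBufferList.mBuffers
     if let mData = audioBuffer.mData {
         let u8ptr = mData.assumingMemoryBound(to: UInt8.self)
         _ = outputStream.write(u8ptr, maxLength: Int(audioBuffer.mDataByteSize))
     }
 }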

Then, when the data appears on the inputStream, the following method is called:

 func stream(_ aStream: Stream, handle eventCode: Stream.Event) { 
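A sketch of how that delegate dispatches to my read function (the exact set of cases handled here is illustrative):

 func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
     switch eventCode {
     case .hasBytesAvailable:
         readFromStream()
     case .endEncountered:
         print("\(#file) > \(#function) > end of stream")
     case .errorOccurred:
         print("\(#file) > \(#function) > error occurred")
     default:
         break
     }
 }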

I have print statements in place, so I know this part is working. When data actually arrives, I call my function:

 func readFromStream() { 

I know that I need to call inputStream.read to actually read from the stream, but I'm not sure how to read the data and then convert it to NSData so that it can be used with AVAudioPlayer.

(Unless you know of a better, more efficient way.)

This is what I have so far, but I have not tested it, and I assume there will be problems.

 func readFromStream() {
     var buffer = [UInt8](repeating: 0, count: 1024)

     while inputStream!.hasBytesAvailable {
         let length = inputStream!.read(&buffer, maxLength: buffer.count)

         if length > 0 {
             if audioEngine!.isRunning {
                 audioEngine!.stop()
                 audioEngine!.reset()
             }

             print("\(#file) > \(#function) > \(length) bytes read")

             let audioBuffer = bytesToAudioBuffer(buffer)
             let mainMixer = audioEngine!.mainMixerNode

             audioEngine!.connect(audioPlayer!, to: mainMixer, format: audioBuffer.format)
             audioPlayer!.scheduleBuffer(audioBuffer, completionHandler: nil)

             do {
                 try audioEngine!.start()
             } catch let error as NSError {
                 print("\(#file) > \(#function) > error: \(error.localizedDescription)")
             }

             audioPlayer!.play()
         }
     }
 }
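Note that the bytesToAudioBuffer helper called above is not shown. A minimal sketch of what it might look like, assuming the bytes are raw, non-interleaved mono Float32 PCM at 44.1 kHz (the format parameters are assumptions and must match whatever the sender recorded):

 import AVFoundation

 // Hypothetical sketch of bytesToAudioBuffer: reinterpret raw bytes as
 // Float32 PCM frames and copy them into a new AVAudioPCMBuffer.
 func bytesToAudioBuffer(_ bytes: [UInt8]) -> AVAudioPCMBuffer {
     let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                sampleRate: 44100,
                                channels: 1,
                                interleaved: false)!
     let bytesPerFrame = format.streamDescription.pointee.mBytesPerFrame
     let frameCount = UInt32(bytes.count) / bytesPerFrame

     let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount)!
     buffer.frameLength = frameCount

     // Copy the raw bytes into the buffer's first (only) channel.
     bytes.withUnsafeBufferPointer { src in
         _ = memcpy(buffer.floatChannelData![0], src.baseAddress!, Int(frameCount * bytesPerFrame))
     }
     return buffer
 }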

With what I have so far, there is no sound, only silence, even though I can see that the audio data is being received by the other device.

So basically, my question is: how do I convert this buffer to the correct data type so that it can be played live?

Thanks for the help! If you need more information, please let me know.

2 answers

Instead of using CMSampleBuffers, I used AVAudioPCMBuffers. These can be created by recording with AVAudioEngine. Basically, this is how I converted an AVAudioPCMBuffer to NSData and vice versa:

 func audioBufferToNSData(PCMBuffer: AVAudioPCMBuffer) -> NSData {
     let channelCount = 1 // given PCMBuffer channel count is 1
     let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: channelCount)
     let data = NSData(bytes: channels[0],
                       length: Int(PCMBuffer.frameLength * PCMBuffer.format.streamDescription.pointee.mBytesPerFrame))
     return data
 }

 func dataToPCMBuffer(format: AVAudioFormat, data: NSData) -> AVAudioPCMBuffer {
     let audioBuffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: UInt32(data.length) / format.streamDescription.pointee.mBytesPerFrame)
     audioBuffer.frameLength = audioBuffer.frameCapacity
     let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData,
                                        count: Int(audioBuffer.format.channelCount))
     data.getBytes(UnsafeMutableRawPointer(channels[0]), length: data.length)
     return audioBuffer
 }
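A quick usage sketch of those two functions (the engine wiring and tap parameters here are illustrative, not part of the answer above): tap the input node, convert each buffer to NSData as if it had been sent over the stream, convert it back, and schedule it on a player node.

 import AVFoundation

 let engine = AVAudioEngine()
 let playerNode = AVAudioPlayerNode()
 let format = engine.inputNode.outputFormat(forBus: 0) // assumed mono, matching audioBufferToNSData

 engine.attach(playerNode)
 engine.connect(playerNode, to: engine.mainMixerNode, format: format)

 engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
     let data = audioBufferToNSData(PCMBuffer: buffer)          // this is what you would send
     let restored = dataToPCMBuffer(format: format, data: data) // this is what the receiver does
     playerNode.scheduleBuffer(restored, completionHandler: nil)
 }

 do {
     try engine.start()
     playerNode.play()
 } catch {
     print("could not start engine: \(error)")
 }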

To convert your [UInt8] bytes to NSData, see "NSData from UInt8".
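In short, something like this, where length is the byte count returned by inputStream.read:

 let data = NSData(bytes: buffer, length: length)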

Once you have done that, just use AVAudioPlayer:

 // Note: this initializer throws, and AVAudioPlayer expects complete audio
 // file data (e.g. WAV or CAF), not raw PCM samples.
 let player = try AVAudioPlayer(data: data as Data)
 player.play()

Source: https://habr.com/ru/post/1263759/
