Thanks to everyone who takes the time to read the question!
So, I created a stream using MultipeerConnectivity. I can record audio into a CMSampleBuffer and convert that buffer to UInt8 data. I then send this data to the peer using:
outputStream!.write(u8ptr, maxLength: Int(buffer.mDataByteSize))
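For context, here is roughly how I get the raw bytes out of the CMSampleBuffer before writing them to the stream. This is just a sketch of my setup; the send(_:) name and the outputStream property are my own, and I'm assuming the standard Core Media call for exposing the sample buffer as an AudioBufferList:

import AVFoundation

// A sketch of my sending side: pull the raw PCM bytes out of the
// CMSampleBuffer and write them to the peer's output stream.
// (send(_:) and outputStream are names from my own setup.)
func send(_ sampleBuffer: CMSampleBuffer) {
    var audioBufferList = AudioBufferList()
    var blockBuffer: CMBlockBuffer?

    // Expose the sample buffer's audio data as an AudioBufferList.
    let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout<AudioBufferList>.size,
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: 0,
        blockBufferOut: &blockBuffer)

    let buffer = audioBufferList.mBuffers
    guard status == noErr, let mData = buffer.mData else { return }

    // Reinterpret the PCM samples as bytes for the stream API.
    let u8ptr = mData.assumingMemoryBound(to: UInt8.self)
    outputStream!.write(u8ptr, maxLength: Int(buffer.mDataByteSize))
}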
Then, when data appears on the inputStream, the following delegate method is called:
func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
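Inside that method I just switch on the event code, which as far as I know is the standard StreamDelegate pattern, and on .hasBytesAvailable I call into my reader:

func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
    switch eventCode {
    case .hasBytesAvailable:
        print("\(#file) > \(#function) > hasBytesAvailable")
        readFromStream()
    case .endEncountered:
        print("\(#file) > \(#function) > endEncountered")
    case .errorOccurred:
        print("\(#file) > \(#function) > errorOccurred")
    default:
        break
    }
}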
I have print statements in there, so I know this part is working. When data actually arrives, I call my function:
func readFromStream() {
I know that I need to call inputStream.read to actually read from the stream, but I'm not sure how to read the data and then convert it to NSData so that it can be used with AVAudioPlayer (unless you know of a better, more efficient way).
This is what I have so far, but I have not tested it, and I assume there will be problems.
func readFromStream() {
    var buffer = [UInt8](repeating: 0, count: 1024)

    while inputStream!.hasBytesAvailable {
        let length = inputStream!.read(&buffer, maxLength: buffer.count)
        if length > 0 {
            // Restart the engine for every chunk that arrives.
            if audioEngine!.isRunning {
                audioEngine!.stop()
                audioEngine!.reset()
            }

            print("\(#file) > \(#function) > \(length) bytes read")

            // Convert the raw bytes to an AVAudioPCMBuffer (helper shown
            // below), then schedule it on the player node.
            let audioBuffer = bytesToAudioBuffer(buffer)
            let mainMixer = audioEngine!.mainMixerNode
            audioEngine!.connect(audioPlayer!, to: mainMixer, format: audioBuffer.format)
            audioPlayer!.scheduleBuffer(audioBuffer, completionHandler: nil)

            do {
                try audioEngine!.start()
            } catch let error as NSError {
                print("\(#file) > \(#function) > error: \(error.localizedDescription)")
            }

            audioPlayer!.play()
        }
    }
}
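For reference, here is roughly what my bytesToAudioBuffer helper looks like. The format (32-bit float, 44.1 kHz, mono, non-interleaved) is an assumption on my part and would have to match whatever the sending side actually captures:

import AVFoundation

// Converts raw received bytes into an AVAudioPCMBuffer.
// NOTE: the format below is an assumption; if it doesn't match the
// sender's actual format, the result will be garbage or silence.
func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                            sampleRate: 44100.0,
                            channels: 1,
                            interleaved: false)!
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)!
    audioBuffer.frameLength = frameLength

    // Copy the raw bytes straight into channel 0's float data.
    buf.withUnsafeBytes { rawBuffer in
        memcpy(audioBuffer.floatChannelData![0],
               rawBuffer.baseAddress!,
               Int(frameLength) * MemoryLayout<Float>.size)
    }
    return audioBuffer
}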
With the code above I get no sound, just silence, even though I can see that the data is being received on the other device.
So basically, my question is: how do I convert this buffer to the correct data type so that it can be played back live?
Thanks for the help! If you need more information, please let me know.