How can I play video on iOS that I download manually over a socket?

Rather difficult to do ...

I am trying to play H.264 video streamed over a network on iOS. I receive the video data into a buffer through an open socket to a remote server (using CocoaAsyncSocket), so I don't have a URL for the video that I could use to create an AVAsset or an MPMoviePlayer. The video is a real-time stream, so the data just keeps coming (i.e. there is no fixed duration), if that matters.

In my case the server is an RTSP server. I have written my own RTSP client to send commands and read responses, and now I am trying to do something useful with the video data that arrives over the connection.
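For context, the RTSP handshake itself is just plain text over the socket (RFC 2326). Here is a minimal sketch of building such requests; the URL, CSeq values, and header set are placeholders, and a real client would also need to parse the server's responses and handle errors:

```swift
import Foundation

// Build a minimal RTSP/1.0 request line plus headers (RFC 2326).
// The URL and CSeq values below are illustrative placeholders.
func rtspRequest(method: String, url: String, cseq: Int,
                 extraHeaders: [String: String] = [:]) -> String {
    var lines = ["\(method) \(url) RTSP/1.0", "CSeq: \(cseq)"]
    for (key, value) in extraHeaders {
        lines.append("\(key): \(value)")
    }
    // RTSP, like HTTP, terminates each header with CRLF and the
    // whole request with a blank line.
    return lines.joined(separator: "\r\n") + "\r\n\r\n"
}

// Typical first exchanges: OPTIONS, then DESCRIBE to get the SDP.
let options = rtspRequest(method: "OPTIONS",
                          url: "rtsp://example.com/stream", cseq: 1)
let describe = rtspRequest(method: "DESCRIBE",
                           url: "rtsp://example.com/stream", cseq: 2,
                           extraHeaders: ["Accept": "application/sdp"])
print(options)
print(describe)
```

With CocoaAsyncSocket, the resulting string would be UTF-8 encoded and written to the open socket, and the response read back the same way.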

Any ideas on how I can play this video? The only options I can think of at the moment are somehow saving the data to a file and loading it from there (but I don't see how that would work, since new data keeps arriving), or resorting to doing it manually with something like ffmpeg. And no, unfortunately, I can't get the server to do HTTP Live Streaming instead.

Any help would be greatly appreciated!

1 answer

I haven't had to dig this deep into AVFoundation yet, but you might be able to pull it off by creating an AVAsset with an AVAssetWriter. You provide the AVAssetWriter with an instance of AVAssetWriterInput, which accepts CMSampleBuffer data and packages it up for the AVAssetWriter.

Based on the docs, AVAssetWriterInput is designed to be able to receive data from a real-time source (via its expectsMediaDataInRealTime property).
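A rough sketch of that setup, assuming you can already produce CMSampleBuffers from the bytes arriving on the socket (which is a separate, non-trivial step); the output path and pass-through settings here are illustrative only:

```swift
import AVFoundation

// Hedged sketch: append incoming CMSampleBuffers to a local file via
// AVAssetWriter. The file URL and nil (pass-through) output settings
// are assumptions, not a complete solution for live playback.
final class StreamRecorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        // nil outputSettings means the input passes samples through
        // without re-encoding them.
        input = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
        input.expectsMediaDataInRealTime = true  // tuned for live sources
        writer.add(input)
    }

    func start(at time: CMTime) {
        writer.startWriting()
        writer.startSession(atSourceTime: time)
    }

    // Call this for each sample buffer parsed from the socket stream.
    func append(_ sampleBuffer: CMSampleBuffer) {
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

Note that this writes to a file rather than playing directly; you would still need something to read the growing file back, or another presentation path entirely.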

Wish I could be of more help, but hopefully this points you in the right direction.


Source: https://habr.com/ru/post/1335004/
