I would like to play a video (either from a local file or from a remote URL) together with its audio track, and get a pixel buffer for each frame of the video so I can draw it into an OpenGL texture.
Here is the code I'm using on iOS 6 (it works great):
Initialize the video
    - (void) readMovie:(NSURL *)url
    {
        NSLog(@"Playing video %@", url); // was param.url, which is not in scope here

        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];

        [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error = nil;
                AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
                if (status == AVKeyValueStatusLoaded) {
                    // Request BGRA buffers so each frame can go straight into a GL texture.
                    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
                    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];

                    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
                    [playerItem addOutput:output];

                    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
                    [self setPlayer:player];
                    [self setPlayerItem:playerItem];
                    [self setOutput:output];

                    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(bufferingVideo:) name:AVPlayerItemPlaybackStalledNotification object:nil];
                    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoEnded:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
                    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoFailed:) name:AVPlayerItemFailedToPlayToEndTimeNotification object:nil];

                    [[self player] addObserver:self forKeyPath:@"rate" options:0 context:nil];
                    [[self player] addObserver:self forKeyPath:@"status" options:0 context:NULL];

                    [player play];
                } else {
                    NSLog(@"%@ Failed to load the tracks.", self);
                }
            });
        }];
    }
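For completeness, the two addObserver: calls above assume a KVO callback along these lines (just a sketch of its shape, not my full handler):

    - (void) observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                             change:(NSDictionary *)change context:(void *)context
    {
        if (object == [self player] && [keyPath isEqualToString:@"status"]) {
            if ([self player].status == AVPlayerStatusFailed) {
                NSLog(@"Player failed: %@", [self player].error);
            }
        } else if (object == [self player] && [keyPath isEqualToString:@"rate"]) {
            // rate goes to 0.0 when playback pauses or stalls
            NSLog(@"Rate is now %f", [self player].rate);
        }
    }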
Read the video buffer (in the update function that is called on every frame)
    - (void) readNextMovieFrame
    {
        CMTime outputItemTime = [[self playerItem] currentTime];
        float interval = [self maxTimeLoaded];
        CMTime t = [[self playerItem] currentTime];
        CMTime d = [[self playerItem] duration];

        NSLog(@"Video : %f/%f (loaded : %f) - speed : %f",
              (float)t.value / (float)t.timescale,
              (float)d.value / (float)d.timescale,
              interval,
              [self player].rate);

        [videoBar updateProgress:(interval / CMTimeGetSeconds(d))];
        [videoBar updateSlider:(CMTimeGetSeconds(t) / CMTimeGetSeconds(d))];

        if ([[self output] hasNewPixelBufferForItemTime:outputItemTime]) {
            CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:nil];
            // ... upload the buffer into the OpenGL texture (sketched below), then release it:
            CVBufferRelease(buffer);
        }
    }
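For context, this is roughly how I push the copied buffer into my texture. A minimal sketch: it assumes the 32BGRA buffer has no row padding (CVPixelBufferGetBytesPerRow == width * 4) and that GL_BGRA_EXT from <OpenGLES/ES2/glext.h> is available; the uploadBuffer:toTexture: name is just for illustration:

    // Sketch: upload a 32BGRA pixel buffer into an existing GL texture.
    // Assumes no row padding (CVPixelBufferGetBytesPerRow == width * 4).
    - (void) uploadBuffer:(CVPixelBufferRef)buffer toTexture:(GLuint)texture
    {
        CVPixelBufferLockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
        GLsizei width  = (GLsizei)CVPixelBufferGetWidth(buffer);
        GLsizei height = (GLsizei)CVPixelBufferGetHeight(buffer);

        glBindTexture(GL_TEXTURE_2D, texture);
        // The Apple BGRA extension requires GL_RGBA as the internal format.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                     GL_BGRA_EXT, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(buffer));
        CVPixelBufferUnlockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
    }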
So this code works fine on iOS 6, and I would like it to work on iOS 5 as well. AVPlayerItemVideoOutput, however, is not part of iOS 5 (it was introduced in iOS 6), so I can still play the video there, but I don't know how to extract a pixel buffer for each video frame.
Do you have any idea what I can use instead of AVPlayerItemVideoOutput to get a pixel buffer for each frame of the video? (It has to work with both local and remote videos, and I also want the audio track to play.)
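For what it's worth, AVAssetReader (available since iOS 4.1) can pull frames from a local, file-based asset, but as far as I can tell it cannot read a remote URL and does not drive audio playback on its own, so it only half-solves my problem. A sketch of that fallback (the method name readLocalMovieFrames: is just for illustration):

    // Fallback sketch, LOCAL files only: reads frames as fast as possible, no timing.
    - (void) readLocalMovieFrames:(NSURL *)localURL
    {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:localURL options:nil];
        NSError *error = nil;
        AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

        AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
        AVAssetReaderTrackOutput *trackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:settings];
        [reader addOutput:trackOutput];
        [reader startReading];

        CMSampleBufferRef sample = NULL;
        while ((sample = [trackOutput copyNextSampleBuffer])) {
            CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sample);
            // ... upload pixelBuffer to the texture as above ...
            CFRelease(sample);
        }
    }

So that covers pulling video frames from a local file, but I still need something that works for remote URLs and plays the audio.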
Many thanks for your help!