
I am implementing an HTTP live streaming player on OS X using AVPlayer. I can seek correctly, get the duration, and so on. Now I want to take screenshots and process the frames with OpenCV, so I went for AVAssetImageGenerator. But the AVAsset associated with player.currentItem has no audio or video tracks.

The tracks do show up in player.currentItem.tracks, so I am not able to use AVAssetImageGenerator. Can anybody help find a way to extract screenshots and individual frames in this scenario? (A rough sketch of the direction I am currently exploring follows the code below.)

Please find below the code showing how I initiate the HTTP live stream.

Thanks in advance.

NSURL* url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"]; 
playeritem = [AVPlayerItem playerItemWithURL:url]; 

[playeritem addObserver:self forKeyPath:@"status" options:0 context:AVSPPlayerStatusContext]; 
[self setPlayer:[AVPlayer playerWithPlayerItem:playeritem]]; 
[self addObserver:self forKeyPath:@"player.rate" options:NSKeyValueObservingOptionNew context:AVSPPlayerRateContext]; 
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:AVSPPlayerItemStatusContext]; 
AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]]; 
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]]; 
[newPlayerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable]; 
[newPlayerLayer setHidden:YES]; 
[[[self playerView] layer] addSublayer:newPlayerLayer]; 
[self setPlayerLayer:newPlayerLayer]; 
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:AVSPPlayerLayerReadyForDisplay]; 
[self.player play];  

Below is how I check whether the video tracks are present on the asset:

case AVPlayerItemStatusReadyToPlay: 

       [self setTimeObserverToken:[[self player] addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) { 
        [[self timeSlider] setDoubleValue:CMTimeGetSeconds(time)]; 
        NSLog(@"%f,%f,%f",[self currentTime],[self duration],[[self player] rate]); 
        AVPlayerItem *item = playeritem; 
        if(item.status == AVPlayerItemStatusReadyToPlay) 
        { 
        AVAsset *asset = (AVAsset *)item.asset; 
        long audiotracks = [[asset tracks] count]; 
        long videotracks = [[asset availableMediaCharacteristicsWithMediaSelectionOptions]count]; 

        NSLog(@"Track info Audio = %ld,Video=%ld",audiotracks,videotracks); 
        } 
       }]]; 



       AVPlayerItem *item = self.player.currentItem; 
       if(item.status != AVPlayerItemStatusReadyToPlay) 
        return; 
       AVURLAsset *asset = (AVURLAsset *)item.asset; 
       long audiotracks = [[asset tracksWithMediaType:AVMediaTypeAudio]count]; 
       long videotracks = [[asset tracksWithMediaType:AVMediaTypeVideo]count]; 

       NSLog(@"Track info Audio = %ld,Video=%ld",audiotracks,videotracks); 
Hi, how did you manage to get the tracks with HLS? Please help –

Answer

This is an old question, but if anyone needs help, here is an answer.

AVURLAsset *asset = /* Your Asset here! */; 
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset]; 
generator.requestedTimeToleranceAfter = kCMTimeZero; 
generator.requestedTimeToleranceBefore = kCMTimeZero; 
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * /* Put the FPS of the source video here */ ; i++){ 
    @autoreleasepool { 
     CMTime time = CMTimeMake(i, /* Put the FPS of the source video here */); 

     NSError *err; 
     CMTime actualTime; 
     CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err]; 

     // Do what you want with the image, for example save it as UIImage 
     UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image]; 

     CGImageRelease(image); 
    } 
} 
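
Since the question mentions processing the frames with OpenCV, here is a hedged Objective-C++ sketch (the file needs a .mm extension, and the helper name is my own) that copies a CGImage into a BGRA cv::Mat. Note that on OS X you would use NSImage (initWithCGImage:size:) rather than UIImage in the loop above.

#import <CoreGraphics/CoreGraphics.h>
#include <opencv2/core/core.hpp>

// Sketch of a helper that draws a CGImage into a cv::Mat it owns (BGRA, 8 bits per channel).
static cv::Mat MatFromCGImage(CGImageRef image)
{
    size_t width  = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);
    cv::Mat bgra((int)height, (int)width, CV_8UC4);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Little-endian + premultiplied-first yields BGRA byte order in memory.
    CGContextRef context = CGBitmapContextCreate(bgra.data, width, height, 8, bgra.step[0],
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | (CGBitmapInfo)kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return bgra;
}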

You can easily get the FPS of the video with this code:

float fps=0.00; 
if (asset) { 
    AVAssetTrack * videoATrack = [asset tracksWithMediaType:AVMediaTypeVideo][0]; 
    if(videoATrack) 
    { 
     fps = [videoATrack nominalFrameRate]; 
    } 
} 

Hope that helps whoever asked how to get all the frames from a video, or just some specific frames (at particular CMTimes, for example). Keep in mind that saving all the frames to an array can heavily impact memory!
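
If you only need a handful of specific frames rather than all of them, here is a hedged sketch of the asynchronous variant of the same generator, which lets you process each image as it arrives instead of holding everything in memory (the times below are just example values):

NSArray *times = @[[NSValue valueWithCMTime:CMTimeMakeWithSeconds(1.0, 600)],
                   [NSValue valueWithCMTime:CMTimeMakeWithSeconds(5.0, 600)]];
[generator generateCGImagesAsynchronouslyForTimes:times
                                completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                    CMTime actualTime, AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
        // Process or save the CGImage here; retain it (CGImageRetain) if you need it
        // after this handler returns.
    }
}];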

Love it. I got here in 2016 and needed this answer. Thanks! – sudo