
Memory leak with CMSampleBufferGetImageBuffer

I'm getting a UIImage from a CMSampleBufferRef video buffer every N video frames, like this:

- (void)imageFromVideoBuffer:(void(^)(UIImage *image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGImageRef cgImage = [_context createCGImage:ciImage fromRect:
                              CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
        __block UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CFRelease(sampleBuffer);

        if (completion) completion(image);

        return;
    }
    if (completion) completion(nil);
}

Xcode and Instruments detect a memory leak, but I can't get rid of it. I release the CGImageRef and CMSampleBufferRef as usual:

CGImageRelease(cgImage); 
CFRelease(sampleBuffer); 

[UPDATE] Here is the AVCaptureOutput callback where I get the sampleBuffer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == _videoOutput) {
        _lastVideoBuffer.sampleBuffer = sampleBuffer;
        id<CIImageRenderer> imageRenderer = _CIImageRenderer;

        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                CIImage *ciImage = nil;
                ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
                if (_context == nil) {
                    _context = [CIContext contextWithOptions:nil];
                }
                CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                             fromRect:[ciImage extent]];
                //UIImage *image = [UIImage imageWithCGImage:processedCGImage];
                CGImageRelease(processedCGImage);
                NSLog(@"Captured image %@", ciImage);
            }
        });
    }
}

The leaking code is the createCGImage:fromRect: call:

CGImageRef processedCGImage = [_context createCGImage:ciImage 
                  fromRect:[ciImage extent]]; 

This leaks even with the autoreleasepool, the CGImageRelease of the CGImage reference, and the CIContext held as an instance property.

This seems to be the same problem addressed here: Can't save CIImage to file on iOS without memory leaks

[UPDATE] The leak seems to be due to a bug. The issue is well described in Memory leak on CIContext createCGImage at iOS 9?

A sample project demonstrates how to reproduce this leak: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip

The latest comments there assure:

Looks like they fixed this in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:

In test code (Objective-C in this case):

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    if (error) return;

    __block NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"ipdf_pic_%i.jpeg",(int)[NSDate date].timeIntervalSince1970]];

    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    dispatch_async(dispatch_get_main_queue(),^
    {
        @autoreleasepool
        {
            CIImage *enhancedImage = [CIImage imageWithData:imageData];

            if (!enhancedImage) return;

            static CIContext *ctx = nil;
            if (!ctx) ctx = [CIContext contextWithOptions:nil];

            CGImageRef imageRef = [ctx createCGImage:enhancedImage fromRect:enhancedImage.extent format:kCIFormatBGRA8 colorSpace:nil];

            UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];

            [[NSFileManager defaultManager] createFileAtPath:filePath contents:UIImageJPEGRepresentation(image, 0.8) attributes:nil];

            CGImageRelease(imageRef);
        }
    });
}];

And the workaround for iOS 9.0 should be:

extension CIContext { 
    func createCGImage_(image:CIImage, fromRect:CGRect) -> CGImage { 
     let width = Int(fromRect.width) 
     let height = Int(fromRect.height) 

     let rawData = UnsafeMutablePointer<UInt8>.alloc(width * height * 4) 
     render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect, format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB()) 
     let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) {info, data, size in UnsafeMutablePointer<UInt8>(data).dealloc(size)} 
     return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue), dataProvider, nil, false, .RenderingIntentDefault)! 
    } 
} 

What does Instruments say is leaking? Where are _myLastSampleBuffer and _lastAppendedVideoBuffer.sampleBuffer set? – ChrisH


@ChrisH see the code above. – loretoparisi


This is not a leak in Apple's code: simply don't call CFRetain(sampleBuffer) on the result returned by copyNextSampleBuffer and the code works fine. – MoDJ

Answers


We were experiencing a similar issue in an app we created, where we process each frame for feature keypoints with OpenCV and send off a frame every couple of seconds. After running for a while, we would end up with quite a few memory pressure messages.

We managed to correct this by running our processing code in its own autorelease pool, like so (jpegDataFromSampleBufferAndCrop does something similar to what you are doing, with added cropping):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {

        if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {

            NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];

            if (imageData) {
                [self processImageData:imageData];
            }

            self.lastFrameSentAt = [NSDate date];

            imageData = nil;
        }
    }
}

Thanks, I found the leaking code (see above). – loretoparisi


Zomg! It works!! –


So it looks like the leak is due to a bug in createCGImage on iOS. See https://forums.developer.apple.com/message/50981#50981 – loretoparisi


I can confirm that this memory leak still exists on iOS 9.2. (I have also posted on the Apple Developer Forum.)

I get the same memory leak on iOS 9.2. I've tested dropping the EAGLContext by using MetalKit and MTLDevice. I've tested the different methods of CIContext, like drawImage, createCGImage and render, but nothing seems to work.

It is clear that this is a bug introduced in iOS 9. Try it yourself: download the example app from Apple (see below), run the project on a device with iOS 8.4, then on a device with iOS 9.2, and watch the memory gauge in Xcode.

Download https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109

Add this to APLEAGLView.h:20

@property (strong, nonatomic) CIContext* ciContext;

Replace APLEAGLView.m:118 with this

[EAGLContext setCurrentContext:_context];
_ciContext = [CIContext contextWithEAGLContext:_context];

And finally replace APLEAGLView.m:341-343 with this

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

@autoreleasepool
{
    CIImage* sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIFilter* filter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, sourceImage, nil];
    CIImage* filteredImage = filter.outputImage;

    [_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];
}

glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);