Memory warning in an OpenGL iOS application

I am working on a graphics-heavy iOS application. At one point our app's memory footprint is 250 MB. I take each frame from the camera, process it with OpenGL shaders, and extract some data from it. Every time I use the camera to capture frames for processing, I see the memory rise to 280 MB. When I stop capturing frames, memory returns to normal at 250 MB. If I repeatedly start the camera and exit, say ten times, I receive a memory warning (though no memory leak is observed). I am not using ARC here, and I maintain an autorelease pool that encloses the entire frame-processing path. While profiling I do not see any leaks, and after the ten runs memory appears to hold steady at 250 MB, so I am not sure what causes the memory warning. Any insights? I am happy to provide further information. OpenGL version - ES 2.0, iOS version - 7.0.
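For reference, this is roughly the shape of the capture path (a minimal sketch, assuming the frames arrive through an AVCaptureVideoDataOutput delegate; processFrameWithShaders: is a hypothetical stand-in for the shader pass and data extraction):

- (void)captureOutput:(AVCaptureOutput *)output 
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
  fromConnection:(AVCaptureConnection *)connection 
{ 
    // Drain temporary objects every frame, not once per capture session 
    @autoreleasepool { 
     CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
     [self processFrameWithShaders:pixelBuffer]; // hypothetical shader pass + data extraction 
    } 
} 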

Answers


You should use ARC; it releases memory automatically when objects are no longer needed, which keeps your app's memory usage optimized.


Without seeing the code, who knows? Are you simply rendering the framebuffer with EAGLContext's presentRenderbuffer method? Then, what do you do with the pixelBuffer you pass to CVOpenGLESTextureCacheCreateTextureFromImage? In the typical usage scenario, the pixel buffer is the only substantial source of memory. However, if you are swapping the data in the render buffer out to another buffer (for example with glReadPixels), then you have introduced one of several memory hogs. If the buffer you swap to is a CoreGraphics buffer, created for example via a CGDataProvider, did you include a data-release callback, or did you pass NULL as that parameter when you created the provider? And did you free the swapped-out buffer afterwards?
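One frequent culprit in that scenario, for what it's worth, is never releasing the CVOpenGLESTexture created from the pixel buffer, or never flushing the texture cache. A minimal sketch of per-frame cleanup, assuming a _textureCache created earlier with CVOpenGLESTextureCacheCreate and hypothetical width/height values:

CVOpenGLESTextureRef texture = NULL; 
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
     kCFAllocatorDefault, _textureCache, pixelBuffer, NULL, 
     GL_TEXTURE_2D, GL_RGBA, width, height, 
     GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture); 
if (err == kCVReturnSuccess) { 
    glBindTexture(CVOpenGLESTextureGetTarget(texture), 
       CVOpenGLESTextureGetName(texture)); 
    // ... run the shader pass against this texture ... 
    CFRelease(texture); // release the per-frame texture once it has been used 
} 
CVOpenGLESTextureCacheFlush(_textureCache, 0); // flush the cache every frame 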

These are questions I could pin answers down to if you provided your code; if you think you can solve this without doing that, and would rather see working code that successfully manages memory in about the most demanding use-case scenario there could possibly be, see:

https://demonicactivity.blogspot.com/2016/11/tech-serious-ios-developers-use-every.html

For convenience, I have provided some code below. Place it after your call to the presentRenderbuffer method, commenting that call out if you do not want the render buffer displayed in the CAEAGLLayer (as I did in the sample below):

// [_context presentRenderbuffer:GL_RENDERBUFFER];

dispatch_async(dispatch_get_main_queue(), ^{ 
    @autoreleasepool { 
     // To capture the output to an OpenGL render buffer... 
     NSInteger myDataLength = _backingWidth * _backingHeight * 4; 
     GLubyte *buffer = (GLubyte *) malloc(myDataLength); 
     glPixelStorei(GL_PACK_ALIGNMENT, 4); // pack alignment governs glReadPixels output; RGBA rows are 4-byte aligned 
     glReadPixels(0, 0, _backingWidth, _backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer); 

     // To swap the pixel buffer to a CoreGraphics context (as a CGImage) 
     // Initialize to NULL so the @finally block can release these safely 
     // even if an exception fires before they are assigned 
     CGDataProviderRef provider = NULL; 
     CGColorSpaceRef colorSpaceRef = NULL; 
     CGImageRef imageRef = NULL; 
     CVPixelBufferRef pixelBuffer = NULL; 
     @try { 
      provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, &releaseDataCallback); 
      int bitsPerComponent = 8; 
      int bitsPerPixel = 32; 
      int bytesPerRow = 4 * _backingWidth; 
      colorSpaceRef = CGColorSpaceCreateDeviceRGB(); 
      CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault; 
      CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault; 
      imageRef = CGImageCreate(_backingWidth, _backingHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent); 
     } @catch (NSException *exception) { 
      NSLog(@"Exception: %@", [exception reason]); 
     } @finally { 
      if (imageRef) { 
       // To convert the CGImage to a pixel buffer (for writing to a file using AVAssetWriter) 
       pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:imageRef]; 
       // To verify the integrity of the pixel buffer (by converting it back to a CGImage, and then displaying it in a layer) 
       CGImageRef verifiedImageRef = [CVCGImageUtil cgImageFromPixelBuffer:pixelBuffer context:_ciContext]; 
       imageLayer.contents = (__bridge id)verifiedImageRef; 
       CGImageRelease(verifiedImageRef); // the layer retains its contents; drop our +1 reference 
       CVPixelBufferRelease(pixelBuffer); // likewise release the pixel buffer when done 
      } 
      CGDataProviderRelease(provider); 
      CGColorSpaceRelease(colorSpaceRef); 
      CGImageRelease(imageRef); 
     } 

    } 
}); 

. . .

The callback that releases the data for the CGDataProvider instance:

static void releaseDataCallback (void *info, const void *data, size_t size) { 
    free((void*)data); 
} 
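That callback matters: if you pass NULL as the release callback when creating the provider, CGDataProviderCreateWithData will never free the malloc'd bytes, so every captured frame strands a full RGBA buffer unless you free it yourself elsewhere. That kind of creep can trigger memory warnings without ever registering as a leak in Instruments.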

The interface and implementation files for the CVCGImageUtil class, respectively:

@import Foundation; 
@import CoreMedia; 
@import CoreGraphics; 
@import QuartzCore; 
@import CoreImage; 
@import UIKit; 

@interface CVCGImageUtil : NSObject 

+ (CGImageRef)cgImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer context:(CIContext *)context; 

+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image; 

+ (CMSampleBufferRef)sampleBufferFromCGImage:(CGImageRef)image; 

@end 

#import "CVCGImageUtil.h" 

@implementation CVCGImageUtil 

+ (CGImageRef)cgImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer context:(CIContext *)context 
{ 
    // CVPixelBuffer to CoreImage 
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer]; 
    image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(M_PI)]; 
    CGPoint origin = [image extent].origin; 
    image = [image imageByApplyingTransform:CGAffineTransformMakeTranslation(-origin.x, -origin.y)]; 

    // CoreImage to CGImage via the CoreImage context; the returned CGImage 
    // carries a +1 reference that the caller must release 
    CGImageRef cgImage = [context createCGImage:image fromRect:[image extent]]; 

    // CGImage to UIImage (OPTIONAL) 
    //UIImage *uiImage = [UIImage imageWithCGImage:cgImage]; 
    //return (CGImageRef)uiImage.CGImage; 

    return cgImage; 
} 

+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image 
{ 
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), 
            CGImageGetHeight(image)); 
    NSDictionary *options = 
    [NSDictionary dictionaryWithObjectsAndKeys: 
    [NSNumber numberWithBool:YES], 
    kCVPixelBufferCGImageCompatibilityKey, 
    [NSNumber numberWithBool:YES], 
    kCVPixelBufferCGBitmapContextCompatibilityKey, 
    nil]; 
    CVPixelBufferRef pxbuffer = NULL; 

    CVReturn status = 
    CVPixelBufferCreate(
         kCFAllocatorDefault, frameSize.width, frameSize.height, 
         kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options, 
         &pxbuffer); 
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

    CVPixelBufferLockBaseAddress(pxbuffer, 0); 
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer); 

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef context = CGBitmapContextCreate(
               pxdata, frameSize.width, frameSize.height, 
               8, CVPixelBufferGetBytesPerRow(pxbuffer), 
               rgbColorSpace, 
               (CGBitmapInfo)kCGBitmapByteOrder32Little | 
               kCGImageAlphaPremultipliedFirst); 

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), 
              CGImageGetHeight(image)), image); 
    CGColorSpaceRelease(rgbColorSpace); 
    CGContextRelease(context); 

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0); 

    return pxbuffer; // +1 reference; the caller is responsible for CVPixelBufferRelease 
} 

+ (CMSampleBufferRef)sampleBufferFromCGImage:(CGImageRef)image 
{ 
    CVPixelBufferRef pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:image]; 
    CMSampleBufferRef newSampleBuffer = NULL; 
    CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid; 
    CMVideoFormatDescriptionRef videoInfo = NULL; 
    CMVideoFormatDescriptionCreateForImageBuffer(
               NULL, pixelBuffer, &videoInfo); 
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, 
             pixelBuffer, 
             true, 
             NULL, 
             NULL, 
             videoInfo, 
             &timingInfo, 
             &newSampleBuffer); 
    // The sample buffer retains the pixel buffer and format description; 
    // release our own references so they do not leak 
    CFRelease(videoInfo); 
    CVPixelBufferRelease(pixelBuffer); 

    return newSampleBuffer; 
} 

@end
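A minimal usage sketch, assuming writerInput is an AVAssetWriterInput already attached to a started AVAssetWriter session (note that the class above creates the sample buffer with kCMTimingInfoInvalid, so a real writer session would also need valid presentation timestamps):

CMSampleBufferRef sampleBuffer = [CVCGImageUtil sampleBufferFromCGImage:imageRef]; 
if (sampleBuffer) { 
    if (writerInput.readyForMoreMediaData) { 
     [writerInput appendSampleBuffer:sampleBuffer]; 
    } 
    CFRelease(sampleBuffer); // release our +1 reference once appended 
} 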