
I capture the camera view using a default AVCaptureSession.
Everything works fine and I have no leaks, but when I use Allocations to look for abandoned memory after starting and then closing the AVCaptureDevice, it shows about 230 objects still alive. (Screenshot: AVCaptureSession abandoned memory - Allocations - Instruments)

Here is my code:

Controller.h:

@interface Controller : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate> {
    AVCaptureSession *captureSession;
    AVCaptureDevice *device;

    IBOutlet UIView *previewLayer;
}
@property (nonatomic, retain) AVCaptureSession *captureSession;
@property (nonatomic, retain) UIView *previewLayer;

- (void)setupCaptureSession;
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection;
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer;
@end

Controller.m:

- (void)setupCaptureSession {  
    NSError *error = nil; 

    [self setCaptureSession: [[AVCaptureSession alloc] init]]; 

    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; 

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; 

    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) { 
     [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus]; 
     [device unlockForConfiguration]; 
    } 

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                     error:&error]; 
    if (!input) { 
     // TODO: Handle the error when creating the input fails
    } 
    [[self captureSession] addInput:input]; 

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease]; 
    [[self captureSession] addOutput:output]; 

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL); 
    [output setSampleBufferDelegate:self queue:queue]; 
    dispatch_release(queue); 

    output.videoSettings = 
    [NSDictionary dictionaryWithObject: 
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
           forKey:(id)kCVPixelBufferPixelFormatTypeKey]; 


    output.minFrameDuration = CMTimeMake(1, 15); 

    [[self captureSession] startRunning]; 

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession]; 
    captureVideoPreviewLayer.frame = previewLayer.bounds; 
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0]; 
    [previewLayer setHidden:NO]; 
} 

// Delegate routine that is called when a sample buffer was written 
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
     fromConnection:(AVCaptureConnection *)connection { 
    if (mutex && ![device isAdjustingFocus] && ![device isAdjustingExposure] && ![device isAdjustingWhiteBalance]) { 
     // Create a UIImage from the sample buffer data 
     mutex = NO; 
     UIImage *image = [self imageFromSampleBuffer:sampleBuffer]; 

     image = [Tools rotateImage:image andRotateAngle:UIImageOrientationUp]; 

     CGRect rect; 
     rect.size.width = 210; 
     rect.size.height = 50; 
     rect.origin.x = 75; 
     rect.origin.y = 175; 

     UIImage *croppedImage = [image resizedImage:image.size interpolationQuality:kCGInterpolationHigh]; 
     croppedImage = [croppedImage croppedImage:rect]; 

     croppedImage = [self processImage:croppedImage]; 
     [NSThread detachNewThreadSelector:@selector(threadedReadAndProcessImage:) toTarget:self withObject:croppedImage]; 
    } 
} 

// Create a UIImage from sample buffer data 
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer { 
    // Get a CMSampleBuffer's Core Video image buffer for the media data 
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer 
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data 
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
               bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context 
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer 
    CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace); 

    UIImage *image = [UIImage imageWithCGImage:quartzImage]; 

    CGImageRelease(quartzImage); 

    return (image); 
} 

And I tear everything down with this code:

- (void)cancelTapped { 
    [[self captureSession] stopRunning];
    self.captureSession = nil;

    for (UIView *view in self.previewLayer.subviews) { 
     [view removeFromSuperview]; 
    } 

    [self dismissModalViewControllerAnimated:YES]; 
} 

- (void)dealloc { 
    [super dealloc]; 

    [captureSession release]; 
    [device release]; 
    [previewLayer release]; 
} 

Instruments shows me something like this: http://i.stack.imgur.com/NBWgZ.png

http://i.stack.imgur.com/1GB6C.png

Any idea what I'm doing wrong?


Could you take a look at the question here? http://stackoverflow.com/questions/11717962/how-to-cropresize-capture-image-from-iphone-and-selecting-from-photo-album – 2012-07-31 10:04:51

Answers

- (void)setupCaptureSession {  
    NSError *error = nil; 

    [self setCaptureSession: [[AVCaptureSession alloc] init]]; 
    ... 

This leaks the capture session, which keeps all of its inputs and outputs, and all of their little internal helpers, alive.

Two options:

AVCaptureSession *session = [[AVCaptureSession alloc] init]; 
self.captureSession = session; 
[session release], session = nil; 
// or: 
self.captureSession = [[[AVCaptureSession alloc] init] autorelease]; 
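For clarity, here is the same fix with the MRC retain counts spelled out as comments (a sketch; it assumes the `retain` property declared in Controller.h):

```objective-c
AVCaptureSession *session = [[AVCaptureSession alloc] init]; // +1: alloc/init transfers ownership to you
self.captureSession = session;                               // +1: the retain setter takes its own reference
[session release], session = nil;                            // -1: balances the alloc/init; the property now holds the only reference

// The original code skipped that release, so the session stayed at a retain
// count of 2, and `self.captureSession = nil` in cancelTapped could never
// bring it to zero. That is why the session, its inputs/outputs, and their
// helpers show up as abandoned memory in Instruments.
```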

But what's wrong with it? I release it in dealloc: `[captureSession release];` – woojtekr 2011-03-11 16:58:59


You declared `@property (nonatomic, _retain_) AVCaptureSession *captureSession;`, so the setter retains the newly created object, which (because of alloc/init) _you already own_. – danyowdee 2011-03-11 17:01:54


@woojtekr我編輯了我的答案以包含修復程序。 – danyowdee 2011-03-11 17:07:36

- (void)dealloc { 
    [super dealloc]; 

    [captureSession release]; 
    [device release]; 
    [previewLayer release]; 
} 

`[super dealloc]` should be called after the other releases; otherwise your instance's memory may no longer contain valid pointers to the objects you are releasing, so you won't actually release them, especially if the memory has been zeroed.
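A corrected version (same ivars as in the question) simply moves `[super dealloc]` to the end:

```objective-c
- (void)dealloc {
    // Release your own ivars first, while the object is still fully valid.
    [captureSession release];
    [device release];
    [previewLayer release];

    // Under MRC, [super dealloc] must always be the last statement.
    [super dealloc];
}
```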