iOS: Capturing CAEmitterLayer particles on screen

2012-08-03 54 views
6

When capturing the screen of an iOS device, is there a way to capture CAEmitterCells (generated with a CAEmitterLayer)?
UIGetScreenImage() works, but since it is a private API I am not allowed to use it.
UIGraphicsBeginImageContext does not seem to work either; the particles are simply omitted from the resulting image.

Edit: Here is the code I currently use to capture the view. I actually record a 30-second video of the screen, using the code provided here. It works by capturing 25 images per second of itself (a UIView subclass) and its subviews (in our case, including the UIView whose layer is the CAEmitterLayer) and uses an AVAssetWriter to compose the recording.

It is quite a mouthful, so I will only put the relevant code here. I converted the code with the ARC tool in Xcode, so it may differ a bit from the original in how memory is managed.

- (CGContextRef) createBitmapContextOfSize:(CGSize)size {
    CGColorSpaceRef colorSpace;
    int bitmapByteCount;
    int bitmapBytesPerRow;

    bitmapBytesPerRow = (size.width * 4);
    bitmapByteCount = (bitmapBytesPerRow * size.height);
    colorSpace = CGColorSpaceCreateDeviceRGB();
    // bitmapData is an instance variable, reused across frames
    if (bitmapData != NULL) {
        free(bitmapData);
    }
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        CGColorSpaceRelease(colorSpace);
        fprintf(stderr, "Memory not allocated!");
        return NULL;
    }

    CGContextRef context = CGBitmapContextCreate(bitmapData,
                                                 size.width,
                                                 size.height,
                                                 8,   // bits per component
                                                 bitmapBytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(colorSpace);

    if (context == NULL) {
        free(bitmapData);
        bitmapData = NULL;
        fprintf(stderr, "Context not created!");
        return NULL;
    }
    // only touch the context after the NULL check above
    CGContextSetAllowsAntialiasing(context, NO);

    return context;
}

//static int frameCount = 0;   //debugging 
- (void) drawRect:(CGRect)rect {
    NSDate* start = [NSDate date];
    CGContextRef context = [self createBitmapContextOfSize:self.frame.size];

    //CG bitmap contexts have a lower-left origin while UIKit draws from the
    //upper-left; without this flip the image renders upside-down and mirrored
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
    CGContextConcatCTM(context, flipVertical);

    [self.layer renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage* background = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    self.currentScreen = background;

    //debugging
    //if (frameCount < 40) {
    //  NSString* filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    //  NSString* pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    //  [UIImagePNGRepresentation(self.currentScreen) writeToFile:pngPath atomically:YES];
    //  frameCount++;
    //}

    //NOTE: to record a scroll view while it is scrolling you need to implement
    //your UIScrollViewDelegate such that it calls 'setNeedsDisplay' on the
    //ScreenCaptureView.
    if (_recording) {
        float millisElapsed = [[NSDate date] timeIntervalSinceDate:startedAt] * 1000.0;
        [self writeVideoFrameAtTime:CMTimeMake((int)millisElapsed, 1000)];
    }

    float processingSeconds = [[NSDate date] timeIntervalSinceDate:start];
    float delayRemaining = (1.0/self.frameRate) - processingSeconds;

    CGContextRelease(context);

    //redraw at the specified framerate
    [self performSelector:@selector(setNeedsDisplay) withObject:nil afterDelay:delayRemaining > 0.0 ? delayRemaining : 0.01];
}
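
(For completeness: the writeVideoFrameAtTime: method called above comes from the linked code and is not shown here. Roughly, it pushes self.currentScreen into the asset writer. The sketch below is my paraphrase of that step, not the linked code verbatim; the videoWriterInput and avAdaptor ivars are assumptions about how the AVAssetWriter was set up when recording started.)

- (void) writeVideoFrameAtTime:(CMTime)time {
    // Assumed ivars (names hypothetical):
    //   AVAssetWriterInput *videoWriterInput;
    //   AVAssetWriterInputPixelBufferAdaptor *avAdaptor;
    if (![videoWriterInput isReadyForMoreMediaData]) {
        return; // drop this frame rather than block the main thread
    }

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                       avAdaptor.pixelBufferPool,
                                       &pixelBuffer);
    if (pixelBuffer == NULL) {
        return;
    }

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Draw the captured frame (self.currentScreen) into the pixel buffer.
    CGImageRef cgImage = self.currentScreen.CGImage;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 CVPixelBufferGetWidth(pixelBuffer),
                                                 CVPixelBufferGetHeight(pixelBuffer),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context,
                       CGRectMake(0, 0,
                                  CVPixelBufferGetWidth(pixelBuffer),
                                  CVPixelBufferGetHeight(pixelBuffer)),
                       cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
    CVPixelBufferRelease(pixelBuffer);
}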

Really hope this helps. Thanks for your support!

+0

BTW - you can pass NULL for the _data_ parameter of 'CGBitmapContextCreate()', in which case the data will be allocated/deallocated automatically by CG. – nielsbot 2012-08-13 07:26:46

+0

Also, I don't see your rendering code in this snippet? – nielsbot 2012-08-13 07:27:39
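
(To illustrate nielsbot's suggestion: with a NULL data pointer, and 0 for bytes-per-row so CG computes it, the createBitmapContextOfSize: method above shrinks to roughly the following sketch - the bitmapData ivar and the malloc/free bookkeeping disappear entirely.)

- (CGContextRef) createBitmapContextOfSize:(CGSize)size {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // NULL data: Core Graphics allocates (and later frees) the backing store.
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 size.width,
                                                 size.height,
                                                 8,   // bits per component
                                                 0,   // 0 = CG computes bytes per row
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(colorSpace);

    if (context == NULL) {
        fprintf(stderr, "Context not created!");
        return NULL;
    }
    CGContextSetAllowsAntialiasing(context, NO);
    return context;
}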

+0

Were you able to solve this? – 2012-09-26 17:45:30

Answer

0

Have you tried using -[CALayer renderInContext:]?
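
For reference, a minimal sketch of that approach; emitterView here is just an illustrative name for the view whose layer tree hosts the CAEmitterLayer:

// Render the view's layer (and sublayers) into an image context.
UIGraphicsBeginImageContextWithOptions(emitterView.bounds.size, NO, 0.0);
[emitterView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();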

+0

I think he wants an image of the whole screen. Ah, or maybe you mean rendering the specific view he wants (which presumably contains the CAEmitterLayer). That sounds promising. – 2012-08-08 18:07:01

+0

@DavidH Exactly... – nielsbot 2012-08-08 19:47:46

+0

Thinking about it further, a 'UIWindow' is also a 'UIView' and therefore has a backing layer - you could render that. – nielsbot 2012-08-08 19:50:30
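
(A short sketch of that idea - render the key window's backing layer to get a full-screen image. Untested against a live CAEmitterLayer.)

// Snapshot the whole screen by rendering the key window's layer.
UIWindow *window = [[UIApplication sharedApplication] keyWindow];
UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO, 0.0);
[window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();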