2014-02-13

AVFoundation: adding text to CMSampleBufferRef video frames

I am building an app using AVFoundation.

Just before I call [assetWriterInput appendSampleBuffer:sampleBuffer] in the - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection delegate method, I manipulate the pixels of the sample buffer (using its pixel buffer to apply an effect).

Now the client also wants me to draw text into the frame (a timestamp & frame counter), but I haven't found a way to do this yet.

I tried converting the sample buffer to an image, drawing the text onto the image, and converting the image back to a sample buffer, but then

CMSampleBufferDataIsReady(sampleBuffer) 

fails.

Here are my UIImage category methods:

+ (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
    { 
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

    CVPixelBufferLockBaseAddress(imageBuffer,0); 

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 


    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 

    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 

    CGContextRelease(newContext); 
    CGColorSpaceRelease(colorSpace); 

    // Balance the CVPixelBufferLockBaseAddress call above. 
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0); 

    UIImage *newUIImage = [UIImage imageWithCGImage:newImage]; 

    CFRelease(newImage); 

    return newUIImage; 
    } 

And:

- (CMSampleBufferRef) cmSampleBuffer 
    { 
     CGImageRef image = self.CGImage; 

     NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: 
           [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, 
           [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, 
           nil]; 
     CVPixelBufferRef pxbuffer = NULL; 

     CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, 
               self.size.width, 
               self.size.height, 
               kCVPixelFormatType_32ARGB, 
               (__bridge CFDictionaryRef) options, 
               &pxbuffer); 
     NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

     CVPixelBufferLockBaseAddress(pxbuffer, 0); 
     void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer); 
     NSParameterAssert(pxdata != NULL); 

     CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB(); 
     CGContextRef context = CGBitmapContextCreate(pxdata, self.size.width, 
                self.size.height, 8, 4*self.size.width, rgbColorSpace, 
                kCGImageAlphaNoneSkipFirst); 
     NSParameterAssert(context); 
     CGContextConcatCTM(context, CGAffineTransformMakeRotation(0)); 
     CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), 
               CGImageGetHeight(image)), image); 
     CGColorSpaceRelease(rgbColorSpace); 
     CGContextRelease(context); 
     CVPixelBufferUnlockBaseAddress(pxbuffer, 0); 
     CMVideoFormatDescriptionRef videoInfo = NULL; 
     CMSampleBufferRef sampleBuffer = NULL; 
     CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, 
              pxbuffer, true, NULL, NULL, videoInfo, NULL, &sampleBuffer); 
     return sampleBuffer; 
    } 
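
(As far as I can tell from the headers, CMSampleBufferCreateForImageBuffer expects a real format description and timing info rather than NULL, so the tail of the method probably needs something like this — the timing values below are placeholders:)

```objc
// Sketch: create a format description for the pixel buffer and supply
// timing info instead of passing NULL (timing values are placeholders).
CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pxbuffer, &videoInfo);

CMSampleTimingInfo timing = kCMTimingInfoInvalid;
timing.duration = CMTimeMake(1, 30);              // assuming ~30 fps
timing.presentationTimeStamp = CMTimeMake(0, 30); // placeholder timestamp

CMSampleBufferRef sampleBuffer = NULL;
CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
         pxbuffer, true, NULL, NULL, videoInfo, &timing, &sampleBuffer);
CFRelease(videoInfo);
return sampleBuffer;
```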

Any ideas?

EDIT:

I changed my code following Tony's answer. (Thanks!) This code:

CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

    CVPixelBufferLockBaseAddress(pixelBuffer, 0); 

    EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]; 
    CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]} ]; 

    UIFont *font = [UIFont fontWithName:@"Helvetica" size:40]; 
    NSDictionary *attributes = @{NSFontAttributeName: font, 
           NSForegroundColorAttributeName: [UIColor lightTextColor]}; 

    UIImage *img = [UIImage imageFromText:@"01 - 13/02/2014 15:18:21:654" withAttributes:attributes]; 
    CIImage *filteredImage = [[CIImage alloc] initWithCGImage:img.CGImage]; 

    [ciContext render:filteredImage toCVPixelBuffer:pixelBuffer bounds:[filteredImage extent] colorSpace:CGColorSpaceCreateDeviceRGB()]; 

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0); 
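
imageFromText:withAttributes: is not a UIKit method; it's a small category helper of my own. A minimal sketch of it, using NSString drawing:

```objc
// Minimal sketch: render a string into a transparent UIImage.
+ (UIImage *)imageFromText:(NSString *)text withAttributes:(NSDictionary *)attributes
{
    CGSize size = [text sizeWithAttributes:attributes];

    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0); // NO = transparent background
    [text drawAtPoint:CGPointZero withAttributes:attributes];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return image;
}
```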

Nice, but could you share the source of UIImage *img = [UIImage imageFromText:@"01 - 13/02/2014 15:18:21:654" withAttributes:attributes];? – chrisallick


@chrisallick See here: https://stackoverflow.com/questions/2765537/how-do-i-use-the-nsstring-draw-functionality-to-create-a-uiimage-from-text –


Did you ever find a solution for drawing text onto a CMSampleBuffer? – user924

Answer

You should look at Apple's CIFunHouse sample; you can use this API to draw directly into the buffer:

-(void)render:(CIImage *)image toCVPixelBuffer:(CVPixelBufferRef)buffer bounds:(CGRect)r colorSpace:(CGColorSpaceRef)cs

You can download it from the WWDC2013 sample code.

Create the context:

_eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]; 
_ciContext = [CIContext contextWithEAGLContext:_eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]} ]; 

Now render the image into a pixel buffer:

CVPixelBufferRef renderedOutputPixelBuffer = NULL; 
CVReturn err = CVPixelBufferPoolCreatePixelBuffer(nil, self.pixelBufferAdaptor.pixelBufferPool, &renderedOutputPixelBuffer); 
[_ciContext render:filteredImage toCVPixelBuffer:renderedOutputPixelBuffer bounds:[filteredImage extent] colorSpace:CGColorSpaceCreateDeviceRGB()]; 
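
If you want the text drawn over the camera frame rather than on its own, one approach is to composite the text image over a CIImage made from the frame before rendering. A sketch, assuming textImage is the CIImage of the rendered text and pixelBuffer is the frame's buffer (those names are mine):

```objc
// Sketch: composite the text over the camera frame, then render the result
// into the pool-allocated buffer created above. `textImage` and `pixelBuffer`
// are assumed to come from the surrounding code.
CIImage *background = [CIImage imageWithCVPixelBuffer:pixelBuffer];

CIFilter *composite = [CIFilter filterWithName:@"CISourceOverCompositing"];
[composite setValue:textImage forKey:kCIInputImageKey];
[composite setValue:background forKey:kCIInputBackgroundImageKey];

[_ciContext render:[composite outputImage]
   toCVPixelBuffer:renderedOutputPixelBuffer
            bounds:[background extent]
        colorSpace:CGColorSpaceCreateDeviceRGB()];
```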

This works nicely, except the transparent image ends up in a black box. Any idea why? :) – JoriDor


@JoriDor Did you ever figure out why it was black? –


The link doesn't work. Any example in Swift? – user924