I'm working with Swift 3 and using the camera via AVFoundation. How can I get a light/brightness value from AVFoundation?
Does anyone know a way to measure the amount of ambient light?
One approach I know of is to use the ambient light sensor, but that is discouraged, and apps using it may ultimately not be allowed on the App Store.
I found a question that is very close to what I need:
detecting if iPhone is in a dark room
The answer there explains that I can use the ImageIO framework to read the metadata that comes in with each frame of the video feed:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,
        sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc]
        initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata
        objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata
        objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
}
But I'm new to iOS and don't know how to convert this code to Swift.
Thanks in advance!
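A possible Swift 3 translation of the Objective-C delegate method above might look like the sketch below. It is untested and makes some assumptions: `CameraController` is a placeholder for your own class that owns the capture session, and the brightness value is simply printed rather than used.

```swift
import AVFoundation
import ImageIO

// Sketch (untested): read the EXIF BrightnessValue from each video frame
// delivered by an AVCaptureVideoDataOutput. "CameraController" is a
// placeholder name for the class that owns the capture session.
extension CameraController: AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       from connection: AVCaptureConnection!) {
        // Copy the frame's attachments and drill down to the EXIF dictionary.
        guard
            let metadata = CMCopyDictionaryOfAttachments(
                nil, sampleBuffer,
                kCMAttachmentMode_ShouldPropagate) as? [String: Any],
            let exif = metadata[kCGImagePropertyExifDictionary as String]
                as? [String: Any],
            let brightness = exif[kCGImagePropertyExifBrightnessValue as String]
                as? Double
        else { return }

        print("BrightnessValue: \(brightness)")
    }
}
```

Note that this uses `AVCaptureVideoDataOutput` (per-frame sample buffers), not the deprecated `AVCaptureStillImageOutput`; the delegate must be set on the video data output for the method to be called.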
There are a couple of problems: AVCaptureStillImageOutput is deprecated, and on the other hand the method 'AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)' returns 'Data', while the method 'getEXIFFromImage(image: NSData)' expects an 'NSData'... what do you think? –
One more problem anyway... if I try to get the value for the key 'kCGImagePropertyExifBrightnessValue', I get 'nil'... do you know why? –