Subview video preview distorted

I'm trying to make a simple app that displays the raw preview from the back camera in the top half of the iPhone screen, with the same preview, but with various filters applied, in the bottom half.

I first got the raw-preview part working without too much trouble, thanks to several SO and blog posts. The UIImageView I display into took up the whole screen.

To get a half-screen view, I simply halved the image view's height and set its contentMode to show everything while keeping the same aspect ratio:
imageView = UIImageView(frame: CGRectMake(0,0, self.view.frame.size.width, self.view.frame.size.height/2))
imageView.contentMode = UIViewContentMode.ScaleAspectFit
The height reduction works, but the image is vertically compressed in the view (for example, a coin viewed head-on looks like a horizontally stretched oval). I don't think it's a coincidence that the preview looks like the default contentMode, ScaleToFill, though I haven't had any luck changing the mode.

The complete code is below. The project has a single scene with one view controller class; everything is done programmatically.

Thanks!
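As a sanity check (not part of the original code), AVFoundation can compute the letterboxed rect that ScaleAspectFit should produce for a given image and view; the sizes below are hypothetical, chosen only for illustration. If the on-screen image fills the full width instead of this rect, the view is effectively behaving like ScaleToFill:

```swift
import UIKit
import AVFoundation

// Hypothetical sizes: a 1080x1920 portrait camera frame displayed in a
// 375x333.5pt half-screen view (roughly half of an iPhone 6 screen).
let imageSize = CGSize(width: 1080, height: 1920)
let viewBounds = CGRect(x: 0, y: 0, width: 375, height: 333.5)

// AVMakeRectWithAspectRatioInsideRect returns the largest rect with the
// image's aspect ratio that fits inside the bounds -- i.e. the area an
// aspect-fit image should actually occupy, pillarboxed, never stretched.
let fitted = AVMakeRectWithAspectRatioInsideRect(imageSize, viewBounds)
// Here the height is the limiting dimension, so fitted.width ≈ 187.6pt,
// centered horizontally; a coin in the frame would stay circular.
```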
import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate
{
    var imageView : UIImageView!

    override func viewDidLoad()
    {
        super.viewDidLoad()

        imageView = UIImageView(frame: CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height/2))
        imageView.contentMode = UIViewContentMode.ScaleAspectFit
        view.addSubview(imageView)

        let captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        do
        {
            let input = try AVCaptureDeviceInput(device: backCamera)
            captureSession.addInput(input)
        }
        catch
        {
            print("Camera not available")
            return
        }

        // Unused but required for AVCaptureVideoDataOutputSampleBufferDelegate:captureOutput() events to be fired
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        view.layer.addSublayer(previewLayer)

        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("SampleBufferDelegate", DISPATCH_QUEUE_SERIAL))
        if captureSession.canAddOutput(videoOutput)
        {
            captureSession.addOutput(videoOutput)
        }
        videoOutput.connectionWithMediaType(AVMediaTypeVideo).videoOrientation = AVCaptureVideoOrientation.Portrait

        captureSession.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!)
    {
        let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

        dispatch_async(dispatch_get_main_queue())
        {
            self.imageView.image = UIImage(CIImage: cameraImage)
        }
    }
}