captureOutput function not called after setSampleBufferDelegate

I have just started developing iOS apps, and this is my first SO post. I am trying to implement a UI view that shows a live preview from the rear camera and processes the captured frames. My preview layer works perfectly and I can see the picture displayed in my UI view; however, the captureOutput function is never called.
I have been searching online for similar problems and solutions for a while, and have tried adjusting various things, including the output, connection, and dispatch-queue settings, but none of it worked. Can anyone help me fix this, or share some insight and direction? Thanks in advance!
Here is my code; I am using Xcode 11 beta with iOS 10 as the build target.
class ThreeDScanningViewController: UIViewController,
    AVCaptureVideoDataOutputSampleBufferDelegate {

    @IBOutlet weak var imageView: UIImageView!

    var session: AVCaptureSession!
    var device: AVCaptureDevice!
    var output: AVCaptureVideoDataOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        //NotificationCenter.default.addObserver(self, selector: #selector(self.startedNotif), name: NSNotification.name.CaptureSessionDidStartRunningNotification, object: nil)
    }

    func initCamera() -> Bool {
        session = AVCaptureSession()
        session.sessionPreset = AVCaptureSession.Preset.medium

        let devices = AVCaptureDevice.devices()
        for d in devices {
            if ((d as AnyObject).position == AVCaptureDevice.Position.back) {
                device = d as! AVCaptureDevice
            }
        }
        if device == nil {
            return false
        }

        do {
            // Set up the input
            let input: AVCaptureDeviceInput!
            try input = AVCaptureDeviceInput(device: device)
            if session.canAddInput(input) {
                session.addInput(input)
            } else {
                return false
            }

            // Set up the device
            try device.lockForConfiguration()
            device.activeVideoMinFrameDuration = CMTimeMake(1, 15)
            device.unlockForConfiguration()

            // Set up the preview layer
            previewLayer = AVCaptureVideoPreviewLayer(session: session)
            previewLayer.frame = imageView.bounds
            imageView.layer.addSublayer(previewLayer)

            // Set up the output
            output = AVCaptureVideoDataOutput()
            output.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString) as String: kCVPixelFormatType_32BGRA]

            let queue = DispatchQueue(label: "myqueue")
            output.setSampleBufferDelegate(self, queue: queue)
            output.alwaysDiscardsLateVideoFrames = true

            if session.canAddOutput(output) {
                session.addOutput(output)
            } else {
                return false
            }

            for connection in output.connections {
                if let conn = connection as? AVCaptureConnection {
                    if conn.isVideoOrientationSupported {
                        conn.videoOrientation = AVCaptureVideoOrientation.portrait
                    }
                }
            }

            session.startRunning()
        } catch let error as NSError {
            print(error)
            return false
        }
        return true
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        print("captureOutput!\n")
        DispatchQueue.main.async(execute: {
            // Do stuff
        })
    }
}
Here are some links I looked into; none of them resolved my problem:
AVCaptureDeviceOutput not calling delegate method captureOutput
iOS: captureOutput:didOutputSampleBuffer:fromConnection is NOT called
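For comparison, this is the delegate-method signature that AVFoundation expects in Swift 4 and later (a sketch only, not a claim about what is wrong here): a method whose selector differs from the protocol's, such as one named captureOutput(captureOutput:didOutputSampleBuffer:from:), compiles fine but is silently never invoked, which is the issue the linked questions describe.

```swift
import AVFoundation

// Swift 4+ form of the AVCaptureVideoDataOutputSampleBufferDelegate callback.
// The external labels (_ , didOutput, from) are part of the selector, so an
// older-style method name is treated as an unrelated method and never called.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    print("captureOutput called")
}
```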
Hi Canis, thanks for your reply! I have tested it and can confirm that the function does not return false before the session starts running. I will look into committing the configuration before starting the session and see how that goes. Thanks again! – CMao
I tried using the beginConfiguration and commitConfiguration functions, but that did not solve the problem... – CMao
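For reference, batching the session changes between beginConfiguration() and commitConfiguration() would look roughly like this (a minimal sketch with hypothetical parameter names; it defers the input/output changes until commit is called, but does not by itself affect whether the delegate fires):

```swift
import AVFoundation

// Sketch: apply input/output changes to a running or new session atomically.
func configure(session: AVCaptureSession,
               input: AVCaptureDeviceInput,
               output: AVCaptureVideoDataOutput) {
    session.beginConfiguration()                       // start batching changes
    if session.canAddInput(input) { session.addInput(input) }
    if session.canAddOutput(output) { session.addOutput(output) }
    session.commitConfiguration()                      // apply them together
    session.startRunning()                             // start after committing
}
```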
@CMao If the following tutorial doesn't work, then I'm at a loss as well. My development environment doesn't allow me to test your code myself at the moment... Could you try following the tutorial step by step in a separate view controller? Oh, one other thought: are you testing on a real device rather than in the simulator? – Canis