iOS 10.0–10.1: AVPlayerLayer doesn't show video after using AVVideoCompositionCoreAnimationTool, audio only

Here's a complete project in case you want to run it yourself: https://www.dropbox.com/s/5p384mogjzflvqk/AVPlayerLayerSoundOnlyBug_iOS10.zip?dl=0
This is a new issue on iOS 10, and it has been fixed as of iOS 10.2. After exporting a video using AVAssetExportSession with AVVideoCompositionCoreAnimationTool to composite a layer on top of the video during export, videos played in an AVPlayerLayer fail to show. This doesn't seem to be caused by hitting the AV encode/decode pipeline limit, because it often happens after a single export, which as far as I know only spins up 2 pipelines: one for the AVAssetExportSession and one for the AVPlayer. I'm also setting the layer's frame correctly, as you can see by running the code below, which gives the layer a blue background you can clearly see.
Waiting for some time after the export before playing the video seems to make it much more reliable, but that's not really an acceptable workaround to foist on your users.
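For reference, the delay workaround looks roughly like this. The one-second delay is an arbitrary value, not a documented threshold, and `playExportedVideo` is a hypothetical helper wrapping the AVPlayerLayer setup shown in the full listing below:

```swift
// Fragile workaround: wait briefly after the export completes before playing.
// The delay is arbitrary — it makes the bug less frequent, not impossible.
exportSession.exportAsynchronously {
    DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        self.playExportedVideo() // hypothetical helper wrapping the playback setup
    }
}
```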
Any ideas what's causing this, or how I can fix or work around it? Have I messed something up or am I missing an important step or detail? Any help or pointers to documentation are much appreciated.
import UIKit
import AVFoundation

/* After exporting an AVAsset using AVAssetExportSession with AVVideoCompositionCoreAnimationTool, we
 * will attempt to play a video using an AVPlayerLayer with a blue background.
 *
 * If you see the blue background and hear audio you're experiencing the missing-video bug. Otherwise
 * try hitting the button again.
 */
class ViewController: UIViewController {

    private var playerLayer: AVPlayerLayer?
    private let button = UIButton()
    private let indicator = UIActivityIndicatorView(activityIndicatorStyle: .gray)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = UIColor.white

        button.setTitle("Cause Trouble", for: .normal)
        button.setTitleColor(UIColor.black, for: .normal)
        button.addTarget(self, action: #selector(ViewController.buttonTapped), for: .touchUpInside)
        view.addSubview(button)
        button.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -16),
        ])

        indicator.hidesWhenStopped = true
        view.insertSubview(indicator, belowSubview: button)
        indicator.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            indicator.centerXAnchor.constraint(equalTo: button.centerXAnchor),
            indicator.centerYAnchor.constraint(equalTo: button.centerYAnchor),
        ])
    }

    func buttonTapped() {
        button.isHidden = true
        indicator.startAnimating()
        playerLayer?.removeFromSuperlayer()

        let sourcePath = Bundle.main.path(forResource: "video.mov", ofType: nil)!
        let sourceURL = URL(fileURLWithPath: sourcePath)
        let sourceAsset = AVURLAsset(url: sourceURL)

        //////////////////////////////////////////////////////////////////////
        // STEP 1: Export a video using AVVideoCompositionCoreAnimationTool //
        //////////////////////////////////////////////////////////////////////
        let exportSession = { () -> AVAssetExportSession in
            let sourceTrack = sourceAsset.tracks(withMediaType: AVMediaTypeVideo).first!

            let parentLayer = CALayer()
            parentLayer.frame = CGRect(origin: .zero, size: CGSize(width: 1280, height: 720))
            let videoLayer = CALayer()
            videoLayer.frame = parentLayer.bounds
            parentLayer.addSublayer(videoLayer)

            let composition = AVMutableVideoComposition(propertiesOf: sourceAsset)
            composition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

            let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: sourceTrack)
            layerInstruction.setTransform(sourceTrack.preferredTransform, at: kCMTimeZero)
            let instruction = AVMutableVideoCompositionInstruction()
            instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: sourceAsset.duration)
            instruction.layerInstructions = [layerInstruction]
            composition.instructions = [instruction]

            let e = AVAssetExportSession(asset: sourceAsset, presetName: AVAssetExportPreset1280x720)!
            e.videoComposition = composition
            e.outputFileType = AVFileTypeQuickTimeMovie
            e.timeRange = CMTimeRange(start: kCMTimeZero, duration: sourceAsset.duration)
            let outputURL = URL(fileURLWithPath: NSTemporaryDirectory().appending("/out2.mov"))
            _ = try? FileManager.default.removeItem(at: outputURL)
            e.outputURL = outputURL
            return e
        }()

        print("Exporting asset...")
        exportSession.exportAsynchronously {
            assert(exportSession.status == .completed)

            //////////////////////////////////////////////
            // STEP 2: Play a video in an AVPlayerLayer //
            //////////////////////////////////////////////
            DispatchQueue.main.async {
                // Reuse player layer, shouldn't be hitting the AV pipeline limit
                let playerItem = AVPlayerItem(asset: sourceAsset)
                let layer = self.playerLayer ?? AVPlayerLayer()
                if layer.player == nil {
                    layer.player = AVPlayer(playerItem: playerItem)
                } else {
                    layer.player?.replaceCurrentItem(with: playerItem)
                }
                layer.backgroundColor = UIColor.blue.cgColor
                if UIDeviceOrientationIsPortrait(UIDevice.current.orientation) {
                    layer.frame = self.view.bounds
                    layer.bounds.size.height = layer.bounds.width * 9.0 / 16.0
                } else {
                    layer.frame = self.view.bounds.insetBy(dx: 0, dy: 60)
                    layer.bounds.size.width = layer.bounds.height * 16.0 / 9.0
                }
                self.view.layer.insertSublayer(layer, at: 0)
                self.playerLayer = layer

                layer.player?.play()
                print("Playing a video in an AVPlayerLayer...")

                self.button.isHidden = false
                self.indicator.stopAnimating()
            }
        }
    }
}
AVAssetExportSession seems to be buggy on iOS 10: http://stackoverflow.com/q/39560386/22147 http://stackoverflow.com/a/39746140/22147 –
@RhythmicFistman Thanks! I hadn't come across that one yet. It looks like I can work around this by using a custom video compositor instead of AVVideoCompositionCoreAnimationTool. –
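For anyone hitting the same bug, the custom-compositor route might start from something like the sketch below. This is a minimal passthrough implementation of the AVVideoCompositing protocol under the assumption of a single video track; it only forwards source frames, so a real replacement for AVVideoCompositionCoreAnimationTool would additionally render the overlay content into each output frame (e.g. with Core Image or Core Graphics):

```swift
import AVFoundation
import CoreVideo

// Minimal passthrough compositor sketch (assumes a single video track).
class PassthroughCompositor: NSObject, AVVideoCompositing {
    let sourcePixelBufferAttributes: [String: Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    let requiredPixelBufferAttributesForRenderContext: [String: Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Forward the first source frame unchanged; overlay drawing would
        // happen here instead of via AVVideoCompositionCoreAnimationTool.
        if let trackID = request.sourceTrackIDs.first,
           let frame = request.sourceFrame(byTrackID: trackID.int32Value) {
            request.finish(withComposedVideoFrame: frame)
        } else {
            request.finish(with: NSError(domain: "Compositor", code: -1, userInfo: nil))
        }
    }
}

// Usage: set composition.customVideoCompositorClass = PassthroughCompositor.self
// on the AVMutableVideoComposition and drop the animationTool assignment.
```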