I'm interested in adding text (in a subtitle style) to an existing video on the iPhone; in other words, modifying a video frame by frame on iOS to add closed captions. Some answers to this on Stack Overflow suggest overlaying a transparent UIView on top of the video. That works for playback, but it doesn't let me save the result as a new, modified video.
The only solution I can see is to take the text, grab a frame from the video, draw the text onto the frame, and then push the modified frame back into the video, replacing the original frame.
Does anyone know how to extract a frame from a video (I think I can figure out the text drawing), and then how to push that frame back into the video? I'd appreciate any ideas or pointers to tutorials.
You don't need to do this frame by frame. AVFoundation has supported subtitles since iOS 4.0.
For example, you can create an AVMutableComposition and add a subtitle track on top of the video. AVMediaTypeSubtitle is the media type for subtitles (or AVMediaTypeClosedCaption for closed captions). You can then feed the composition to a player or to an AVAssetWriter. That saves you all the frame-by-frame trouble.
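A minimal sketch of that idea, in the same Swift style as the long answer below. It assumes two hypothetical URLs: videoURL for your movie and subtitleURL for an asset that already contains a subtitle track (composing an existing subtitle track is the easy path; authoring subtitle samples from scratch takes lower-level work):
let videoAsset = AVURLAsset(URL: videoURL, options: nil)
let subtitleAsset = AVURLAsset(URL: subtitleURL, options: nil)   // hypothetical subtitle-bearing asset
let composition = AVMutableComposition()
var error: NSError?
// Copy the source video track into the composition.
let videoTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
let sourceVideo = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: sourceVideo, atTime: kCMTimeZero, error: &error)
// Add the subtitle track alongside it.
let subtitleTrack = composition.addMutableTrackWithMediaType(AVMediaTypeSubtitle, preferredTrackID: kCMPersistentTrackID_Invalid)
let sourceSubs = subtitleAsset.tracksWithMediaType(AVMediaTypeSubtitle)[0] as! AVAssetTrack
subtitleTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: sourceSubs, atTime: kCMTimeZero, error: &error)
// The composition can now be handed to an AVPlayerItem or an export session.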
Awesome - thank you, I'll look into it. – geekyaleks 2013-03-18 16:39:07
I'm interested in your solution, but I haven't found how to set the subtitle text using AVMediaTypeSubtitle. Could you give an example? Thanks! – lansher1985 2014-05-09 09:55:56
For those who want to edit a movie frame by frame, look at AVReaderWriter. Although it is an OS X Apple sample, AVFoundation is available on both platforms with almost no changes.
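The heart of that sample is an AVAssetReader/AVAssetWriter pair. Here is a stripped-down sketch of the read-modify-write loop, with error handling omitted; asset, outputURL, and the caption drawing are placeholders you would fill in (video only; an audio track would be copied the same way):
let track = asset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
var error: NSError?
// Decode frames as BGRA pixel buffers so they can be drawn on.
var reader = AVAssetReader(asset: asset, error: &error)
let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
reader.addOutput(readerOutput)
// Re-encode the (possibly modified) frames as H.264.
var writer = AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie, error: &error)
let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: Int(track.naturalSize.width), AVVideoHeightKey: Int(track.naturalSize.height)])
writer.addInput(writerInput)
reader.startReading()
writer.startWriting()
writer.startSessionAtSourceTime(kCMTimeZero)
let queue = dispatch_queue_create("frame.copy", DISPATCH_QUEUE_SERIAL)
writerInput.requestMediaDataWhenReadyOnQueue(queue) {
    while writerInput.readyForMoreMediaData {
        if let buffer = readerOutput.copyNextSampleBuffer() {
            // Draw the caption into the sample's pixel buffer here before appending.
            writerInput.appendSampleBuffer(buffer)
        } else {
            writerInput.markAsFinished()
            writer.finishWritingWithCompletionHandler { /* new movie is at outputURL */ }
            break
        }
    }
}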
Based on what you've described, here is an example you can learn from.
As the tutorial's screenshots show, you can add whatever borders, overlays, and subtitles you want to the video.
http://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
You can edit the video frames with AVFoundation. Here is an example:
// Step 1: load the URL into an AVAsset
var assetTrack = AVAsset.assetWithURL(filePath1) as! AVURLAsset
var mutableComposition = AVMutableComposition()
// video
var compositionVideoTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
var assetVideoTrack = assetTrack.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero , assetTrack.duration), ofTrack: assetVideoTrack, atTime: kCMTimeZero, error: nil)
// audio
var compositionAudioTrack = mutableComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
var assetAudioTrack = assetTrack.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack
compositionAudioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero , assetTrack.duration), ofTrack: assetAudioTrack, atTime: kCMTimeZero, error: nil)
var isVideoAssetPortrait_ = false   // note: never updated, so the non-portrait transform below always applies
var videosize = assetVideoTrack.naturalSize
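// Step 2: build the Core Animation layer tree: the video layer plus any overlay layers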
var parentLayer = CALayer()
var videoLayer = CALayer()
var textLayer = CALayer()
parentLayer.frame = CGRectMake(0, 0, videosize.width, videosize.height)
videoLayer.frame = CGRectMake(0, 0, videosize.width, videosize.height)
textLayer.frame = CGRectMake(0, 0, videosize.width, videosize.height)
parentLayer.addSublayer(videoLayer)
// Optional overlay: a drawing captured from an image view
if drawingView.image != nil
{
    var drawingLayer = CALayer()
    drawingLayer.frame = CGRectMake(0, 0, videosize.width, videosize.height)
    drawingLayer.contents = drawingView.image!.CGImage
    parentLayer.addSublayer(drawingLayer)
}
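// Render the caption label offscreen into a UIImage so it can be used as layer contents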
var textV = UIView()
textV.backgroundColor = UIColor.clearColor()
textV.layer.backgroundColor = UIColor.clearColor().CGColor
textV.frame = CGRectMake(self.captureView.frame.size.width, 0, self.captureView.frame.size.width, self.captureView.frame.size.height)
var textL = textShowOnPreview   // the preview UILabel that holds the caption text
textV.addSubview(textL)
if textL.text != nil && textL.text! != ""
{
UIGraphicsBeginImageContext(textV.bounds.size)
textV.layer.renderInContext(UIGraphicsGetCurrentContext())
var image1: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
if (TextAnimation == "")   // no animation: static caption
{
textLayer.contents = image1.CGImage
parentLayer.addSublayer(textLayer)
}
else if (TextAnimation == "flip")
{
var overlayer1 = CALayer()
overlayer1.backgroundColor = UIColor.clearColor().CGColor
let screenSize: CGRect = UIScreen.mainScreen().bounds
overlayer1.contents = image1.CGImage
overlayer1.masksToBounds = true
overlayer1.frame = CGRectMake(videosize.width/2-300, videosize.height/2 - 400, videosize.width,videosize.width);
var animation : CABasicAnimation = CABasicAnimation(keyPath: "transform.rotation")
animation.duration=5.0;
animation.repeatCount=5;
animation.autoreverses = true;
// rotate from 0 to 360
animation.fromValue = 0
animation.toValue = (2.0 * M_PI);
animation.beginTime = AVCoreAnimationBeginTimeAtZero;
overlayer1.addAnimation(animation, forKey:"rotation")
parentLayer.addSublayer(overlayer1)
}
else if (TextAnimation == "fade")
{
// despite its name, the "fade" branch pulses the layer's scale
var overlayer1 = CALayer()
overlayer1.backgroundColor = UIColor.clearColor().CGColor
overlayer1.contents = image1.CGImage
overlayer1.masksToBounds = true
overlayer1.frame = CGRectMake(videosize.width/2 - 300, videosize.height/2 - 100 , videosize.width+20, videosize.width);
var animation : CABasicAnimation = CABasicAnimation(keyPath: "transform.scale")
animation.duration = 2.0;
animation.repeatCount = 3;
animation.autoreverses = true;
// scale between 0.5 and 1.0
animation.fromValue = 0.5;
animation.toValue = 1.0;
animation.beginTime = AVCoreAnimationBeginTimeAtZero;
overlayer1.addAnimation(animation, forKey:"scale")
parentLayer.addSublayer(overlayer1)
}
else if (TextAnimation == "bounce")
{
var overlayer1 = CALayer()
var bounce : CABasicAnimation = CABasicAnimation (keyPath:"position.y");
overlayer1.backgroundColor = UIColor.clearColor().CGColor
overlayer1.contents = image1.CGImage
overlayer1.masksToBounds = true
overlayer1.frame = CGRectMake(videosize.width/2 - 300, videosize.height/2 - 100 , videosize.width, videosize.width);
bounce.duration = 1.0;
bounce.fromValue = overlayer1.frame.origin.y
bounce.toValue = overlayer1.frame.origin.y - 100
bounce.repeatCount = 10
bounce.autoreverses = true;
overlayer1.addAnimation(bounce, forKey: "y")
var animation = CABasicAnimation(keyPath: "transform.scale")
animation.toValue = NSNumber(float: 0.9)
animation.duration = 1.0
animation.repeatCount = 10;
animation.autoreverses = true
overlayer1.addAnimation(animation, forKey: nil)
parentLayer.addSublayer(overlayer1)
}
}
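// Step 3: tie the layer tree to the rendered frames with AVVideoCompositionCoreAnimationTool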
var mutableVideoComposition = AVMutableVideoComposition()
mutableVideoComposition.frameDuration = CMTimeMake(1, 30)
mutableVideoComposition.renderSize = videosize
mutableVideoComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)
var passThroughInstruction = AVMutableVideoCompositionInstruction()
passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration)
// video
var assestVideoMutableCompositionVideo = mutableComposition.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
var passThroughLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: assestVideoMutableCompositionVideo)
if isVideoAssetPortrait_ == false
{
var FirstAssetScaleFactor: CGAffineTransform = CGAffineTransformMakeScale(1, 1)
passThroughLayerInstruction.setTransform(CGAffineTransformConcat(assetVideoTrack.preferredTransform, FirstAssetScaleFactor), atTime: kCMTimeZero)
}
passThroughInstruction.layerInstructions = [passThroughLayerInstruction]
mutableVideoComposition.instructions = [passThroughInstruction]
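// Step 4: export the composition to a new .mov in the Documents directory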
let documentsURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0] as! NSURL
let filePath = documentsURL.URLByAppendingPathComponent("NewWatermarkedVideo.mov") as NSURL
var fileManager:NSFileManager = NSFileManager.defaultManager()
fileManager.removeItemAtURL(filePath, error: nil)
var exporter: AVAssetExportSession = AVAssetExportSession(asset: mutableComposition, presetName: AVAssetExportPresetMediumQuality)
exporter.videoComposition = mutableVideoComposition
exporter.outputFileType = AVFileTypeQuickTimeMovie
exporter.outputURL = filePath
exporter.shouldOptimizeForNetworkUse = true
self.captureView.addSubview(textShowOnPreview)
exporter.exportAsynchronouslyWithCompletionHandler({() -> Void in
println(exporter.status)
if exporter.status == AVAssetExportSessionStatus.Completed
{
dispatch_async(dispatch_get_main_queue(), {() -> Void in
MBProgressHUD.hideAllHUDsForView(self.view, animated: true)
self.topicSelectedImage.highlighted = false
self.timelineSelectedImage.highlighted = false
self.selectCat = ""
self.postView.hidden = false
})
println("Completed")
self.mediaData = NSData(contentsOfURL:filePath, options: nil, error: nil)!
var err: NSError? = nil
var asset = AVURLAsset(URL: filePath, options: nil)
var imgGenerator = AVAssetImageGenerator(asset: asset)
var cgImage = imgGenerator.copyCGImageAtTime(CMTimeMake(0, 30), actualTime: nil, error: &err)
var uiImage = UIImage(CGImage: cgImage)!
self.videoThumbData = UIImageJPEGRepresentation(uiImage, 0.1)
var assetTrack = AVAsset.assetWithURL(filePath) as! AVURLAsset
self.videoTime = Int(CMTimeGetSeconds(assetTrack.duration)) + 3
println(self.videoTime)
}
else if exporter.status == AVAssetExportSessionStatus.Cancelled
{
    // export was cancelled
}
else if exporter.status == AVAssetExportSessionStatus.Failed
{
    // inspect exporter.error to see what went wrong
}
})
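The key piece in the example above is AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer:inLayer:): it renders the Core Animation layer tree into every frame during export, so the captions end up burned into the new video file without any manual frame extraction.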
Have you looked through the AVFoundation framework documentation? – zoul 2013-03-18 15:06:31
Why save it as a modified video at all, when you could just overlay the text every time the video is played? Editing video is CPU-intensive, and therefore drains the battery on a mobile device. – codeghost 2013-03-18 15:10:21
The iPhone already has video-editing capabilities, and there is a full iMovie suite for it, so I'm not sure how adding text to frames would be much more taxing... – geekyaleks 2013-03-18 15:29:45