
ios – AVAssetExportSession does not work on devices but works on simulator (AVFoundationErrorDomain Code = -11800, Unknown error code -12780)



I have seen several questions that had the same error, but none of the solutions found there helped me. I figured I'd try my luck.

I'm trying to export a video as-is, mainly to learn about AVFoundation and AVAssetExportSession. My export works fine on the simulator but fails on every iOS device I've tried (namely an iPhone X and an iPhone XR, both running iOS 12). I mainly followed a Ray Wenderlich tutorial to perform the video export: https://www.raywenderlich.com/2734-avfoundation-tutorial-adding-overlays-and-animations-to-videos. I would appreciate any help on the subject. My code is as follows:

Retrieving the URL of a video called Demo.mp4 that I added to the app bundle:

@objc func export() {
    let urlString = Bundle.main.path(forResource: "Demo", ofType: ".mp4")!
    let url = URL(fileURLWithPath: urlString)
    ExportManager.shared.exportWithAVFoundation(url: url) { (outputUrl, errorString) in
        if let outputUrl = outputUrl {
            self.playVideo(url: outputUrl)
        } else if let errorString = errorString {
            print("ERROR: \(errorString)")
        }
    }
}

My export function in ExportManager is as follows (sorry, it's quite long):

func exportWithAVFoundation(url: URL, completion: @escaping (_ outputUrl: URL?, _ errorString: String?) -> ()) {
    let asset = AVAsset(url: url)
    print("URL IS \(url)")
    guard let avAssetTrack = asset.tracks(withMediaType: .video).first else {
        completion(nil, "Could not create asset track")
        return
    }

    let mutableComposition = AVMutableComposition()
    guard let videoTrack = mutableComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid) else { return }
    try? videoTrack.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration), of: avAssetTrack, at: .zero)
    videoTrack.preferredTransform = avAssetTrack.preferredTransform

    if let audioAssetTrack = asset.tracks(withMediaType: .audio).first {
        let audioTrack = mutableComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
        try? audioTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration), of: audioAssetTrack, at: .zero)
    }

    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    let videoLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)

    // Fix video orientation
    var videoAssetOrientation = UIImage.Orientation.up
    var isVideoAssetPortrait = false
    let videoTransform = avAssetTrack.preferredTransform

    switch (videoTransform.a, videoTransform.b, videoTransform.c, videoTransform.d) {
    case (0, 1.0, -1.0, 0):
        videoAssetOrientation = .right
        isVideoAssetPortrait = true
    case (0, -1.0, 1.0, 0):
        videoAssetOrientation = .left
        isVideoAssetPortrait = true
    case (1.0, 0, 0, 1.0):
        videoAssetOrientation = .up
    case (-1.0, 0, 0, -1.0):
        videoAssetOrientation = .down
    default:
        break
    }

    var naturalSize = avAssetTrack.naturalSize
    switch (videoAssetOrientation, isVideoAssetPortrait) {
    case (.right, true), (.left, true), (.leftMirrored, true), (.rightMirrored, true):
        naturalSize = CGSize(width: avAssetTrack.naturalSize.height, height: avAssetTrack.naturalSize.width)
    default:
        break
    }

    videoLayerInstruction.setTransform(avAssetTrack.preferredTransform, at: .zero)
    videoLayerInstruction.setOpacity(0, at: asset.duration)
    mainInstruction.layerInstructions = [videoLayerInstruction]

    let mainCompositionInstruction = AVMutableVideoComposition()
    mainCompositionInstruction.renderSize = naturalSize
    mainCompositionInstruction.instructions = [mainInstruction]
    mainCompositionInstruction.frameDuration = CMTimeMake(value: 1, timescale: 30)

    let documentDirectoryURL = createPath()

    guard let exporter = AVAssetExportSession(asset: mutableComposition, presetName: AVAssetExportPresetHighestQuality) else {
        print("Could not create AVAssetExportSession")
        completion(nil, "Could not create AVAssetExportSession")
        return
    }
    exporter.outputURL = documentDirectoryURL
    exporter.outputFileType = .mov
    exporter.shouldOptimizeForNetworkUse = true
    exporter.videoComposition = mainCompositionInstruction
    exporter.exportAsynchronously {
        if let error = exporter.error {
            print(error)
            completion(nil, error.localizedDescription)
            return
        }
        completion(exporter.outputURL, nil)
        print("Finished export")
    }
}

One thing I've tried is adding an audio track to the composition (which I hadn't included before). That didn't make the export work on an actual device, but at least the exported video has sound now.

I also tried changing the preset from highest quality to Passthrough, as I read in other threads that this could help, but to no avail.
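For reference, the passthrough attempt was essentially the same exporter setup with a different preset. Here is a minimal sketch of it (the helper name, output file type, and completion handling here are simplified placeholders, not my exact code; since passthrough copies the source samples rather than re-encoding, I've left the video composition out of this sketch):

import AVFoundation

// Minimal sketch of a passthrough export (illustrative only; the function name
// and parameter layout are placeholders, not taken from the project above).
func exportPassthrough(asset: AVAsset, to outputURL: URL,
                       completion: @escaping (URL?, String?) -> Void) {
    guard let exporter = AVAssetExportSession(asset: asset,
                                              presetName: AVAssetExportPresetPassthrough) else {
        completion(nil, "Could not create AVAssetExportSession")
        return
    }
    exporter.outputURL = outputURL
    exporter.outputFileType = .mov
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronously {
        if let error = exporter.error {
            completion(nil, error.localizedDescription)
        } else {
            completion(exporter.outputURL, nil)
        }
    }
}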

Note:
createPath() just creates a valid path in the documents directory to store the exported video. If a file already exists at that path, it is deleted before exporting.
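Roughly, createPath() does the following (this is only a sketch of the behaviour described above; the file name used here is a placeholder, not the one in my project):

import Foundation

// Sketch of createPath(): build an output URL in the documents directory and
// remove any previous export at that path. The file name is an assumption.
func createPath() -> URL {
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory,
                                                      in: .userDomainMask)[0]
    let outputURL = documentsDirectory.appendingPathComponent("ExportedDemo.mov")
    if FileManager.default.fileExists(atPath: outputURL.path) {
        try? FileManager.default.removeItem(at: outputURL)
    }
    return outputURL
}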

