├── LICENSE.md
├── README.md
├── VideoEffects.xcodeproj
│   ├── project.pbxproj
│   ├── project.xcworkspace
│   │   └── contents.xcworkspacedata
│   └── xcuserdata
│       └── simongladman.xcuserdatad
│           └── xcschemes
│               ├── VideoEffects.xcscheme
│               └── xcschememanagement.plist
└── VideoEffects
    ├── AppDelegate.swift
    ├── Assets.xcassets
    │   └── AppIcon.appiconset
    │       └── Contents.json
    ├── Base.lproj
    │   ├── LaunchScreen.storyboard
    │   └── Main.storyboard
    ├── FilteredVideoVendor.swift
    ├── FilteredVideoWriter.swift
    ├── Info.plist
    ├── VideoEffectsControlPanel.swift
    ├── VideoEffectsView.swift
    └── ViewController.swift
/LICENSE.md: -------------------------------------------------------------------------------- 1 | The MIT License (MIT) 2 | 3 | Copyright (c) 2016 simon gladman 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # VideoEffects 2 | ### iPad app to open videos from the file system, apply Core Image filters and save the result back to the Saved Photos Album 3 | 4 | If you've ever used an application such as Adobe's After Effects, you'll know how much creative potential there is in adding and animating filters on video files. If you've worked with Apple's Core Image framework, you may well have added filters to still images or even live video feeds, but working with video files and saving the results back to a device isn't a trivial coding challenge. 5 | 6 | Well, my VideoEffects app solves that challenge for you: VideoEffects allows a user to open a video file, apply a Core Image Photo Effects filter and write the filtered movie back to the saved photos album. 7 | 8 | ## VideoEffects Overview 9 | 10 | The VideoEffects project consists of four main files: 11 | 12 | * **VideoEffectsView:** The main user interface component. It contains an image view and a control bar. 13 | * **VideoEffectsControlPanel:** Contains a scrubber bar, filter selection and play, pause, load and save buttons. 14 | * **FilteredVideoVendor:** Vends filtered image frames. 15 | * **FilteredVideoWriter:** Writes frames from the vendor to the file system. 16 | 17 | The first action a user needs to take is to press "load" in the bottom left of the screen. This opens a standard image picker filtered for the movie media type. Once a movie is opened, it's displayed on the screen where the user can either play/pause or use the slider as a scrub bar. If any of the filters are selected, the save button is enabled, which saves a filtered version of the video back to the file system. 18 | 19 | Let's look at the vendor and writer code in detail. 
20 | 21 | ## Filtered Video Vendor 22 | 23 | The first job of the vendor class is to actually open a movie from a URL supplied by the "load" button in the control panel: 24 | 25 | ```swift 26 | func openMovie(url: NSURL){ 27 | player = AVPlayer(URL: url) 28 | 29 | guard let player = player, 30 | currentItem = player.currentItem, 31 | videoTrack = currentItem.asset.tracksWithMediaType(AVMediaTypeVideo).first else { 32 | fatalError("** unable to access item **") 33 | } 34 | 35 | currentURL = url 36 | failedPixelBufferForItemTimeCount = 0 37 | 38 | currentItem.addOutput(videoOutput) 39 | 40 | videoTransform = CGAffineTransformInvert(videoTrack.preferredTransform) 41 | 42 | player.muted = true 43 | } 44 | ``` 45 | 46 | There are a few interesting points here: firstly, I reset a variable named `failedPixelBufferForItemTimeCount` - this is a workaround for what I think is a bug in AVFoundation where videos would occasionally fail to load with no apparent error. Secondly, to support both landscape and portrait videos, I create an inverted version of the video track's preferred transform. 47 | 48 | The vendor contains a `CADisplayLink` which invokes `step(_:)`: 49 | 50 | ```swift 51 | func step(link: CADisplayLink) { 52 | guard let player = player, 53 | currentItem = player.currentItem else { 54 | return 55 | } 56 | 57 | let itemTime = videoOutput.itemTimeForHostTime(CACurrentMediaTime()) 58 | 59 | displayVideoFrame(itemTime) 60 | 61 | let normalisedTime = Float(itemTime.seconds / currentItem.asset.duration.seconds) 62 | 63 | delegate?.vendorNormalisedTimeUpdated(normalisedTime) 64 | 65 | if normalisedTime >= 1.0 66 | { 67 | paused = true 68 | } 69 | } 70 | ``` 71 | 72 | With the `CADisplayLink`, I calculate the time for the `AVPlayerItem` based on `CACurrentMediaTime`. The normalised time (i.e. between 0 and 1) is calculated by dividing the player item's time by the asset's duration; the UI components use this value to set the scrub bar's position during playback. 
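That normalised-time arithmetic is simple enough to sketch in isolation. This helper is illustrative only — the project computes the value inline in `step(_:)`, and without the clamp:

```swift
// Map an item time in seconds to a normalised 0...1 scrub-bar position.
// The clamp guards against floating-point rounding nudging the value
// fractionally past the end of the asset.
func normalisedTime(itemTimeSeconds: Double, durationSeconds: Double) -> Double {
    guard durationSeconds > 0 else { return 0 }
    return min(max(itemTimeSeconds / durationSeconds, 0), 1)
}
```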
Creating a `CIImage` from the movie's frame at `itemTime` is done in `displayVideoFrame(_:)`: 73 | 74 | ```swift 75 | func displayVideoFrame(time: CMTime) { 76 | guard let player = player, 77 | currentItem = player.currentItem where player.status == .ReadyToPlay && currentItem.status == .ReadyToPlay else { 78 | return 79 | } 80 | 81 | if videoOutput.hasNewPixelBufferForItemTime(time) { 82 | failedPixelBufferForItemTimeCount = 0 83 | 84 | var presentationItemTime = kCMTimeZero 85 | 86 | guard let pixelBuffer = videoOutput.copyPixelBufferForItemTime( 87 | time, 88 | itemTimeForDisplay: &presentationItemTime) else { 89 | return 90 | } 91 | 92 | unfilteredImage = CIImage(CVImageBuffer: pixelBuffer) 93 | 94 | displayFilteredImage() 95 | } 96 | else if let currentURL = currentURL where !paused { 97 | failedPixelBufferForItemTimeCount += 1 98 | 99 | if failedPixelBufferForItemTimeCount > 12 { 100 | openMovie(currentURL) 101 | } 102 | } 103 | } 104 | ``` 105 | 106 | Before copying a pixel buffer from the video output, I need to ensure one is available. If that's all good, it's a simple step to create a `CIImage` from that pixel buffer. However, if `hasNewPixelBufferForItemTime(_:)` fails too many times (12 seems to work), I assume AVFoundation has silently failed and I reopen the movie. 
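The retry logic boils down to a small counter. Here's a sketch of that behaviour — the type and method names are my own, not the project's (`FilteredVideoVendor` simply tracks the count in an integer property):

```swift
// Sketch of the failure-count workaround described above: after a run
// of consecutive pixel-buffer misses, the movie should be reopened;
// any successful copy resets the count.
struct PixelBufferWatchdog {
    private(set) var consecutiveFailures = 0
    let threshold: Int

    init(threshold: Int = 12) {
        self.threshold = threshold
    }

    mutating func recordSuccess() {
        consecutiveFailures = 0
    }

    // Returns true when the caller should reopen the movie.
    mutating func recordFailure() -> Bool {
        consecutiveFailures += 1
        return consecutiveFailures > threshold
    }
}
```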
107 | 108 | With the populated `CIImage`, I apply a filter (if there is one) and pass the rendered result back to the delegate (which is the main view) to be displayed: 109 | 110 | ```swift 111 | func displayFilteredImage() { 112 | guard let unfilteredImage = unfilteredImage, 113 | videoTransform = videoTransform else { 114 | return 115 | } 116 | 117 | let ciImage: CIImage 118 | 119 | if let ciFilter = ciFilter { 120 | ciFilter.setValue(unfilteredImage, forKey: kCIInputImageKey) 121 | 122 | ciImage = ciFilter.outputImage!.imageByApplyingTransform(videoTransform) 123 | } 124 | else { 125 | ciImage = unfilteredImage.imageByApplyingTransform(videoTransform) 126 | } 127 | 128 | let cgImage = ciContext.createCGImage( 129 | ciImage, 130 | fromRect: ciImage.extent) 131 | 132 | delegate?.finalOutputUpdated(UIImage(CGImage: cgImage)) 133 | } 134 | ``` 135 | 136 | The vendor can also jump to a specific normalised time. Here, rather than relying on `CACurrentMediaTime`, I create a `CMTime` and pass that to `displayVideoFrame(_:)`: 137 | 138 | ```swift 139 | func gotoNormalisedTime(normalisedTime: Double) { 140 | guard let player = player else { 141 | return 142 | } 143 | 144 | let timeSeconds = player.currentItem!.asset.duration.seconds * normalisedTime 145 | 146 | let time = CMTimeMakeWithSeconds(timeSeconds, 600) 147 | 148 | player.seekToTime( 149 | time, 150 | toleranceBefore: kCMTimeZero, 151 | toleranceAfter: kCMTimeZero) 152 | 153 | displayVideoFrame(time) 154 | } 155 | ``` 156 | 157 | ## Filtered Video Writer 158 | 159 | Writing the result is not the simplest coding task I've ever done. I'll explain the highlights; the full code is available in this repository. 160 | 161 | The writer class exposes a function, `beginSaving(player:ciFilter:videoTransform:videoOutput)`, which begins the writing process. 
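Both the vendor's scrubbing and the writer's rewind build their seek targets the same way: a normalised position becomes seconds, which becomes a rational `CMTime` with a timescale of 600 (a value chosen because it divides evenly by the common frame rates 24, 25 and 30). A standalone sketch of that conversion, using a plain tuple in place of `CMTime`:

```swift
// Convert a normalised position (0...1) and a duration into a
// (value, timescale) pair at timescale 600, mirroring what
// CMTimeMakeWithSeconds(seconds, 600) produces in the project.
func seekTarget(normalisedTime: Double, durationSeconds: Double) -> (value: Int64, timescale: Int32) {
    let seconds = durationSeconds * normalisedTime
    return (Int64(seconds * 600), 600)
}
```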
162 | 163 | Writing is actually done to a temporary file in the documents directory, named using the current time: 164 | 165 | ```swift 166 | let documentDirectory = NSFileManager 167 | .defaultManager() 168 | .URLsForDirectory( 169 | .DocumentDirectory, 170 | inDomains: .UserDomainMask) 171 | .first! 172 | 173 | videoOutputURL = documentDirectory 174 | .URLByAppendingPathComponent("Output_\(timeDateFormatter.stringFromDate(NSDate())).mp4") 175 | 176 | do { 177 | videoWriter = try AVAssetWriter(URL: videoOutputURL!, fileType: AVFileTypeMPEG4) 178 | } 179 | catch { 180 | fatalError("** unable to create asset writer **") 181 | } 182 | ``` 183 | 184 | The next step is to create an asset writer input using H.264 encoding at the correct size: 185 | 186 | ```swift 187 | let outputSettings: [String : AnyObject] = [ 188 | AVVideoCodecKey: AVVideoCodecH264, 189 | AVVideoWidthKey: currentItem.presentationSize.width, 190 | AVVideoHeightKey: currentItem.presentationSize.height] 191 | 192 | guard videoWriter!.canApplyOutputSettings(outputSettings, forMediaType: AVMediaTypeVideo) else { 193 | fatalError("** unable to apply video settings ** ") 194 | } 195 | 196 | videoWriterInput = AVAssetWriterInput( 197 | mediaType: AVMediaTypeVideo, 198 | outputSettings: outputSettings) 199 | ``` 200 | 201 | The video writer input is added to the `AVAssetWriter`: 202 | 203 | ```swift 204 | videoWriterInput = AVAssetWriterInput( 205 | mediaType: AVMediaTypeVideo, 206 | outputSettings: outputSettings) 207 | 208 | if videoWriter!.canAddInput(videoWriterInput!) { 209 | videoWriter!.addInput(videoWriterInput!) 
210 | } 211 | else { 212 | fatalError("** unable to add input **") 213 | } 214 | ``` 215 | 216 | The final setup step is to create a pixel buffer adaptor: 217 | 218 | ```swift 219 | let sourcePixelBufferAttributesDictionary = [ 220 | String(kCVPixelBufferPixelFormatTypeKey) : Int(kCVPixelFormatType_32BGRA), 221 | String(kCVPixelBufferWidthKey) : currentItem.presentationSize.width, 222 | String(kCVPixelBufferHeightKey) : currentItem.presentationSize.height, 223 | String(kCVPixelFormatOpenGLESCompatibility) : kCFBooleanTrue 224 | ] 225 | 226 | assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor( 227 | assetWriterInput: videoWriterInput!, 228 | sourcePixelBufferAttributes: sourcePixelBufferAttributesDictionary) 229 | ``` 230 | 231 | We're now ready to actually start writing. I'll rewind the player to the beginning of the movie and, since that is asynchronous, call `writeVideoFrames` in the seek completion handler: 232 | 233 | ```swift 234 | player.seekToTime( 235 | CMTimeMakeWithSeconds(0, 600), 236 | toleranceBefore: kCMTimeZero, 237 | toleranceAfter: kCMTimeZero) 238 | { 239 | _ in self.writeVideoFrames() 240 | } 241 | ``` 242 | 243 | `writeVideoFrames` writes the frames to the temporary file. It's basically a loop over each frame, stepping the player item forward one frame per iteration. The number of frames is calculated as: 244 | 245 | ```swift 246 | let numberOfFrames = Int(duration.seconds * Double(frameRate)) 247 | ``` 248 | 249 | There was an intermittent bug where, again, `hasNewPixelBufferForItemTime(_:)` failed. This is fixed with a slightly ugly sleep: 250 | 251 | ```swift 252 | NSThread.sleepForTimeInterval(0.05) 253 | ``` 254 | 255 | In this loop, I do something very similar to the vendor: convert a pixel buffer from the video output to a `CIImage`, filter and render it. However, rather than rendering to a `CGImage` for display, I render back to a `CVPixelBuffer` to append to the asset writer's pixel buffer adaptor. 
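The loop's timing bookkeeping is plain arithmetic and can be sketched on its own. Here `frameRate` is an assumed nominal rate rather than one read from the asset, and the helper name is my own:

```swift
// Number of frames to write, and each frame's presentation time in
// seconds, for the frame-by-frame save loop described above.
func framePresentationTimes(durationSeconds: Double, frameRate: Double) -> [Double] {
    let numberOfFrames = Int(durationSeconds * frameRate)
    return (0..<numberOfFrames).map { Double($0) / frameRate }
}
```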
The pixel buffer adaptor has a pixel buffer pool from which I take pixel buffers; these are passed to the Core Image context as a render target: 256 | 257 | ```swift 258 | ciFilter.setValue(transformedImage, forKey: kCIInputImageKey) 259 | 260 | var newPixelBuffer: CVPixelBuffer? = nil 261 | 262 | CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferPool, &newPixelBuffer) 263 | 264 | self.ciContext.render( 265 | ciFilter.outputImage!, 266 | toCVPixelBuffer: newPixelBuffer!, 267 | bounds: ciFilter.outputImage!.extent, 268 | colorSpace: nil) 269 | ``` 270 | 271 | `transformedImage` is the filtered `CIImage` rotated based on the original asset's preferred transform. 272 | 273 | Now that the new pixel buffer contains the rendered filtered image, it's appended to the pixel buffer adaptor: 274 | 275 | ```swift 276 | assetWriterPixelBufferInput.appendPixelBuffer( 277 | newPixelBuffer!, 278 | withPresentationTime: presentationItemTime) 279 | ``` 280 | 281 | The final part of the loop body is to step the player item forward by one frame: 282 | 283 | ```swift 284 | currentItem.stepByCount(1) 285 | ``` 286 | 287 | Once I've looped over each frame, the video writer input is marked as finished and the video writer's `finishWritingWithCompletionHandler(_:)` is invoked. 
In the completion handler, I rewind the player to the beginning and copy the temporary video into the saved photos album: 288 | 289 | ```swift 290 | videoWriter.finishWritingWithCompletionHandler { 291 | player.seekToTime( 292 | CMTimeMakeWithSeconds(0, 600), 293 | toleranceBefore: kCMTimeZero, 294 | toleranceAfter: kCMTimeZero) 295 | 296 | dispatch_async(dispatch_get_main_queue()) { 297 | UISaveVideoAtPathToSavedPhotosAlbum( 298 | videoOutputURL.relativePath!, 299 | self, 300 | #selector(FilteredVideoWriter.video(_:didFinishSavingWithError:contextInfo:)), 301 | nil) 302 | } 303 | } 304 | ``` 305 | 306 | ...and once the video is copied, I can delete the temporary file: 307 | 308 | ```swift 309 | func video(videoPath: NSString, didFinishSavingWithError error: NSError?, contextInfo info: AnyObject) 310 | { 311 | if let videoOutputURL = videoOutputURL where NSFileManager.defaultManager().isDeletableFileAtPath(videoOutputURL.relativePath!) 312 | { 313 | try! NSFileManager.defaultManager().removeItemAtURL(videoOutputURL) 314 | } 315 | 316 | assetWriterPixelBufferInput = nil 317 | videoWriterInput = nil 318 | videoWriter = nil 319 | videoOutputURL = nil 320 | 321 | delegate?.saveComplete() 322 | } 323 | ``` 324 | 325 | Easy! 326 | 327 | ## Conclusion 328 | 329 | I've been wanting to write this code for almost two years and it proved a lot more "interesting" than I anticipated. There are two slightly hacky workarounds in there, but the end result is the foundation for a tremendously powerful app. At every frame, the normalised time is available; this can be used to animate the attributes of filters, opening the way for a powerful After Effects-style application. 330 | -------------------------------------------------------------------------------- /VideoEffects.xcodeproj/project.pbxproj: -------------------------------------------------------------------------------- 1 | // !$*UTF8*$! 
2 | { 3 | archiveVersion = 1; 4 | classes = { 5 | }; 6 | objectVersion = 46; 7 | objects = { 8 | 9 | /* Begin PBXBuildFile section */ 10 | 3E0811D41CD20F8C001581D3 /* AppDelegate.swift in Sources */ = {isa = PBXBuildFile; fileRef = 3E0811D31CD20F8C001581D3 /* AppDelegate.swift */; }; 11 | 3E0811D61CD20F8C001581D3 /* ViewController.swift in Sources */ = {isa = PBXBuildFile; fileRef = 3E0811D51CD20F8C001581D3 /* ViewController.swift */; }; 12 | 3E0811D91CD20F8C001581D3 /* Main.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = 3E0811D71CD20F8C001581D3 /* Main.storyboard */; }; 13 | 3E0811DB1CD20F8C001581D3 /* Assets.xcassets in Resources */ = {isa = PBXBuildFile; fileRef = 3E0811DA1CD20F8C001581D3 /* Assets.xcassets */; }; 14 | 3E0811DE1CD20F8C001581D3 /* LaunchScreen.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = 3E0811DC1CD20F8C001581D3 /* LaunchScreen.storyboard */; }; 15 | 3E0811EA1CD20FC8001581D3 /* FilteredVideoWriter.swift in Sources */ = {isa = PBXBuildFile; fileRef = 3E0811E61CD20FC8001581D3 /* FilteredVideoWriter.swift */; }; 16 | 3E0811EB1CD20FC8001581D3 /* FilteredVideoVendor.swift in Sources */ = {isa = PBXBuildFile; fileRef = 3E0811E71CD20FC8001581D3 /* FilteredVideoVendor.swift */; }; 17 | 3E0811EC1CD20FC8001581D3 /* VideoEffectsControlPanel.swift in Sources */ = {isa = PBXBuildFile; fileRef = 3E0811E81CD20FC8001581D3 /* VideoEffectsControlPanel.swift */; }; 18 | 3E0811ED1CD20FC8001581D3 /* VideoEffectsView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 3E0811E91CD20FC8001581D3 /* VideoEffectsView.swift */; }; 19 | /* End PBXBuildFile section */ 20 | 21 | /* Begin PBXFileReference section */ 22 | 3E0811D01CD20F8C001581D3 /* VideoEffects.app */ = {isa = PBXFileReference; explicitFileType = wrapper.application; includeInIndex = 0; path = VideoEffects.app; sourceTree = BUILT_PRODUCTS_DIR; }; 23 | 3E0811D31CD20F8C001581D3 /* AppDelegate.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = 
AppDelegate.swift; sourceTree = ""; }; 24 | 3E0811D51CD20F8C001581D3 /* ViewController.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = ViewController.swift; sourceTree = ""; }; 25 | 3E0811D81CD20F8C001581D3 /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/Main.storyboard; sourceTree = ""; }; 26 | 3E0811DA1CD20F8C001581D3 /* Assets.xcassets */ = {isa = PBXFileReference; lastKnownFileType = folder.assetcatalog; path = Assets.xcassets; sourceTree = ""; }; 27 | 3E0811DD1CD20F8C001581D3 /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/LaunchScreen.storyboard; sourceTree = ""; }; 28 | 3E0811DF1CD20F8C001581D3 /* Info.plist */ = {isa = PBXFileReference; lastKnownFileType = text.plist.xml; path = Info.plist; sourceTree = ""; }; 29 | 3E0811E61CD20FC8001581D3 /* FilteredVideoWriter.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = FilteredVideoWriter.swift; sourceTree = ""; }; 30 | 3E0811E71CD20FC8001581D3 /* FilteredVideoVendor.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = FilteredVideoVendor.swift; sourceTree = ""; }; 31 | 3E0811E81CD20FC8001581D3 /* VideoEffectsControlPanel.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = VideoEffectsControlPanel.swift; sourceTree = ""; }; 32 | 3E0811E91CD20FC8001581D3 /* VideoEffectsView.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = VideoEffectsView.swift; sourceTree = ""; }; 33 | /* End PBXFileReference section */ 34 | 35 | /* Begin PBXFrameworksBuildPhase section */ 36 | 3E0811CD1CD20F8C001581D3 /* Frameworks */ = { 37 | isa = PBXFrameworksBuildPhase; 38 | buildActionMask = 2147483647; 39 | files = ( 40 | ); 41 | runOnlyForDeploymentPostprocessing = 0; 42 | }; 43 | /* End 
PBXFrameworksBuildPhase section */ 44 | 45 | /* Begin PBXGroup section */ 46 | 3E0811C71CD20F8C001581D3 = { 47 | isa = PBXGroup; 48 | children = ( 49 | 3E0811D21CD20F8C001581D3 /* VideoEffects */, 50 | 3E0811D11CD20F8C001581D3 /* Products */, 51 | ); 52 | sourceTree = ""; 53 | }; 54 | 3E0811D11CD20F8C001581D3 /* Products */ = { 55 | isa = PBXGroup; 56 | children = ( 57 | 3E0811D01CD20F8C001581D3 /* VideoEffects.app */, 58 | ); 59 | name = Products; 60 | sourceTree = ""; 61 | }; 62 | 3E0811D21CD20F8C001581D3 /* VideoEffects */ = { 63 | isa = PBXGroup; 64 | children = ( 65 | 3E0811E51CD20F9C001581D3 /* videoEffects */, 66 | 3E0811D31CD20F8C001581D3 /* AppDelegate.swift */, 67 | 3E0811D51CD20F8C001581D3 /* ViewController.swift */, 68 | 3E0811D71CD20F8C001581D3 /* Main.storyboard */, 69 | 3E0811DA1CD20F8C001581D3 /* Assets.xcassets */, 70 | 3E0811DC1CD20F8C001581D3 /* LaunchScreen.storyboard */, 71 | 3E0811DF1CD20F8C001581D3 /* Info.plist */, 72 | ); 73 | path = VideoEffects; 74 | sourceTree = ""; 75 | }; 76 | 3E0811E51CD20F9C001581D3 /* videoEffects */ = { 77 | isa = PBXGroup; 78 | children = ( 79 | 3E0811E61CD20FC8001581D3 /* FilteredVideoWriter.swift */, 80 | 3E0811E71CD20FC8001581D3 /* FilteredVideoVendor.swift */, 81 | 3E0811E81CD20FC8001581D3 /* VideoEffectsControlPanel.swift */, 82 | 3E0811E91CD20FC8001581D3 /* VideoEffectsView.swift */, 83 | ); 84 | name = videoEffects; 85 | sourceTree = ""; 86 | }; 87 | /* End PBXGroup section */ 88 | 89 | /* Begin PBXNativeTarget section */ 90 | 3E0811CF1CD20F8C001581D3 /* VideoEffects */ = { 91 | isa = PBXNativeTarget; 92 | buildConfigurationList = 3E0811E21CD20F8C001581D3 /* Build configuration list for PBXNativeTarget "VideoEffects" */; 93 | buildPhases = ( 94 | 3E0811CC1CD20F8C001581D3 /* Sources */, 95 | 3E0811CD1CD20F8C001581D3 /* Frameworks */, 96 | 3E0811CE1CD20F8C001581D3 /* Resources */, 97 | ); 98 | buildRules = ( 99 | ); 100 | dependencies = ( 101 | ); 102 | name = VideoEffects; 103 | productName = VideoEffects; 
104 | productReference = 3E0811D01CD20F8C001581D3 /* VideoEffects.app */; 105 | productType = "com.apple.product-type.application"; 106 | }; 107 | /* End PBXNativeTarget section */ 108 | 109 | /* Begin PBXProject section */ 110 | 3E0811C81CD20F8C001581D3 /* Project object */ = { 111 | isa = PBXProject; 112 | attributes = { 113 | LastSwiftUpdateCheck = 0730; 114 | LastUpgradeCheck = 0730; 115 | ORGANIZATIONNAME = "Simon Gladman"; 116 | TargetAttributes = { 117 | 3E0811CF1CD20F8C001581D3 = { 118 | CreatedOnToolsVersion = 7.3; 119 | }; 120 | }; 121 | }; 122 | buildConfigurationList = 3E0811CB1CD20F8C001581D3 /* Build configuration list for PBXProject "VideoEffects" */; 123 | compatibilityVersion = "Xcode 3.2"; 124 | developmentRegion = English; 125 | hasScannedForEncodings = 0; 126 | knownRegions = ( 127 | en, 128 | Base, 129 | ); 130 | mainGroup = 3E0811C71CD20F8C001581D3; 131 | productRefGroup = 3E0811D11CD20F8C001581D3 /* Products */; 132 | projectDirPath = ""; 133 | projectRoot = ""; 134 | targets = ( 135 | 3E0811CF1CD20F8C001581D3 /* VideoEffects */, 136 | ); 137 | }; 138 | /* End PBXProject section */ 139 | 140 | /* Begin PBXResourcesBuildPhase section */ 141 | 3E0811CE1CD20F8C001581D3 /* Resources */ = { 142 | isa = PBXResourcesBuildPhase; 143 | buildActionMask = 2147483647; 144 | files = ( 145 | 3E0811DE1CD20F8C001581D3 /* LaunchScreen.storyboard in Resources */, 146 | 3E0811DB1CD20F8C001581D3 /* Assets.xcassets in Resources */, 147 | 3E0811D91CD20F8C001581D3 /* Main.storyboard in Resources */, 148 | ); 149 | runOnlyForDeploymentPostprocessing = 0; 150 | }; 151 | /* End PBXResourcesBuildPhase section */ 152 | 153 | /* Begin PBXSourcesBuildPhase section */ 154 | 3E0811CC1CD20F8C001581D3 /* Sources */ = { 155 | isa = PBXSourcesBuildPhase; 156 | buildActionMask = 2147483647; 157 | files = ( 158 | 3E0811EB1CD20FC8001581D3 /* FilteredVideoVendor.swift in Sources */, 159 | 3E0811D61CD20F8C001581D3 /* ViewController.swift in Sources */, 160 | 3E0811D41CD20F8C001581D3 
/* AppDelegate.swift in Sources */, 161 | 3E0811ED1CD20FC8001581D3 /* VideoEffectsView.swift in Sources */, 162 | 3E0811EC1CD20FC8001581D3 /* VideoEffectsControlPanel.swift in Sources */, 163 | 3E0811EA1CD20FC8001581D3 /* FilteredVideoWriter.swift in Sources */, 164 | ); 165 | runOnlyForDeploymentPostprocessing = 0; 166 | }; 167 | /* End PBXSourcesBuildPhase section */ 168 | 169 | /* Begin PBXVariantGroup section */ 170 | 3E0811D71CD20F8C001581D3 /* Main.storyboard */ = { 171 | isa = PBXVariantGroup; 172 | children = ( 173 | 3E0811D81CD20F8C001581D3 /* Base */, 174 | ); 175 | name = Main.storyboard; 176 | sourceTree = ""; 177 | }; 178 | 3E0811DC1CD20F8C001581D3 /* LaunchScreen.storyboard */ = { 179 | isa = PBXVariantGroup; 180 | children = ( 181 | 3E0811DD1CD20F8C001581D3 /* Base */, 182 | ); 183 | name = LaunchScreen.storyboard; 184 | sourceTree = ""; 185 | }; 186 | /* End PBXVariantGroup section */ 187 | 188 | /* Begin XCBuildConfiguration section */ 189 | 3E0811E01CD20F8C001581D3 /* Debug */ = { 190 | isa = XCBuildConfiguration; 191 | buildSettings = { 192 | ALWAYS_SEARCH_USER_PATHS = NO; 193 | CLANG_ANALYZER_NONNULL = YES; 194 | CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x"; 195 | CLANG_CXX_LIBRARY = "libc++"; 196 | CLANG_ENABLE_MODULES = YES; 197 | CLANG_ENABLE_OBJC_ARC = YES; 198 | CLANG_WARN_BOOL_CONVERSION = YES; 199 | CLANG_WARN_CONSTANT_CONVERSION = YES; 200 | CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR; 201 | CLANG_WARN_EMPTY_BODY = YES; 202 | CLANG_WARN_ENUM_CONVERSION = YES; 203 | CLANG_WARN_INT_CONVERSION = YES; 204 | CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR; 205 | CLANG_WARN_UNREACHABLE_CODE = YES; 206 | CLANG_WARN__DUPLICATE_METHOD_MATCH = YES; 207 | "CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer"; 208 | COPY_PHASE_STRIP = NO; 209 | DEBUG_INFORMATION_FORMAT = dwarf; 210 | ENABLE_STRICT_OBJC_MSGSEND = YES; 211 | ENABLE_TESTABILITY = YES; 212 | GCC_C_LANGUAGE_STANDARD = gnu99; 213 | GCC_DYNAMIC_NO_PIC = NO; 214 | GCC_NO_COMMON_BLOCKS = YES; 215 | 
GCC_OPTIMIZATION_LEVEL = 0; 216 | GCC_PREPROCESSOR_DEFINITIONS = ( 217 | "DEBUG=1", 218 | "$(inherited)", 219 | ); 220 | GCC_WARN_64_TO_32_BIT_CONVERSION = YES; 221 | GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR; 222 | GCC_WARN_UNDECLARED_SELECTOR = YES; 223 | GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE; 224 | GCC_WARN_UNUSED_FUNCTION = YES; 225 | GCC_WARN_UNUSED_VARIABLE = YES; 226 | IPHONEOS_DEPLOYMENT_TARGET = 9.3; 227 | MTL_ENABLE_DEBUG_INFO = YES; 228 | ONLY_ACTIVE_ARCH = YES; 229 | SDKROOT = iphoneos; 230 | SWIFT_OPTIMIZATION_LEVEL = "-Onone"; 231 | TARGETED_DEVICE_FAMILY = 2; 232 | }; 233 | name = Debug; 234 | }; 235 | 3E0811E11CD20F8C001581D3 /* Release */ = { 236 | isa = XCBuildConfiguration; 237 | buildSettings = { 238 | ALWAYS_SEARCH_USER_PATHS = NO; 239 | CLANG_ANALYZER_NONNULL = YES; 240 | CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x"; 241 | CLANG_CXX_LIBRARY = "libc++"; 242 | CLANG_ENABLE_MODULES = YES; 243 | CLANG_ENABLE_OBJC_ARC = YES; 244 | CLANG_WARN_BOOL_CONVERSION = YES; 245 | CLANG_WARN_CONSTANT_CONVERSION = YES; 246 | CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR; 247 | CLANG_WARN_EMPTY_BODY = YES; 248 | CLANG_WARN_ENUM_CONVERSION = YES; 249 | CLANG_WARN_INT_CONVERSION = YES; 250 | CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR; 251 | CLANG_WARN_UNREACHABLE_CODE = YES; 252 | CLANG_WARN__DUPLICATE_METHOD_MATCH = YES; 253 | "CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer"; 254 | COPY_PHASE_STRIP = NO; 255 | DEBUG_INFORMATION_FORMAT = "dwarf-with-dsym"; 256 | ENABLE_NS_ASSERTIONS = NO; 257 | ENABLE_STRICT_OBJC_MSGSEND = YES; 258 | GCC_C_LANGUAGE_STANDARD = gnu99; 259 | GCC_NO_COMMON_BLOCKS = YES; 260 | GCC_WARN_64_TO_32_BIT_CONVERSION = YES; 261 | GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR; 262 | GCC_WARN_UNDECLARED_SELECTOR = YES; 263 | GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE; 264 | GCC_WARN_UNUSED_FUNCTION = YES; 265 | GCC_WARN_UNUSED_VARIABLE = YES; 266 | IPHONEOS_DEPLOYMENT_TARGET = 9.3; 267 | MTL_ENABLE_DEBUG_INFO = NO; 268 | SDKROOT = iphoneos; 269 
| TARGETED_DEVICE_FAMILY = 2; 270 | VALIDATE_PRODUCT = YES; 271 | }; 272 | name = Release; 273 | }; 274 | 3E0811E31CD20F8C001581D3 /* Debug */ = { 275 | isa = XCBuildConfiguration; 276 | buildSettings = { 277 | ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon; 278 | INFOPLIST_FILE = VideoEffects/Info.plist; 279 | LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks"; 280 | PRODUCT_BUNDLE_IDENTIFIER = uk.co.flexmonkey.VideoEffects; 281 | PRODUCT_NAME = "$(TARGET_NAME)"; 282 | }; 283 | name = Debug; 284 | }; 285 | 3E0811E41CD20F8C001581D3 /* Release */ = { 286 | isa = XCBuildConfiguration; 287 | buildSettings = { 288 | ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon; 289 | INFOPLIST_FILE = VideoEffects/Info.plist; 290 | LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks"; 291 | PRODUCT_BUNDLE_IDENTIFIER = uk.co.flexmonkey.VideoEffects; 292 | PRODUCT_NAME = "$(TARGET_NAME)"; 293 | }; 294 | name = Release; 295 | }; 296 | /* End XCBuildConfiguration section */ 297 | 298 | /* Begin XCConfigurationList section */ 299 | 3E0811CB1CD20F8C001581D3 /* Build configuration list for PBXProject "VideoEffects" */ = { 300 | isa = XCConfigurationList; 301 | buildConfigurations = ( 302 | 3E0811E01CD20F8C001581D3 /* Debug */, 303 | 3E0811E11CD20F8C001581D3 /* Release */, 304 | ); 305 | defaultConfigurationIsVisible = 0; 306 | defaultConfigurationName = Release; 307 | }; 308 | 3E0811E21CD20F8C001581D3 /* Build configuration list for PBXNativeTarget "VideoEffects" */ = { 309 | isa = XCConfigurationList; 310 | buildConfigurations = ( 311 | 3E0811E31CD20F8C001581D3 /* Debug */, 312 | 3E0811E41CD20F8C001581D3 /* Release */, 313 | ); 314 | defaultConfigurationIsVisible = 0; 315 | }; 316 | /* End XCConfigurationList section */ 317 | }; 318 | rootObject = 3E0811C81CD20F8C001581D3 /* Project object */; 319 | } 320 | -------------------------------------------------------------------------------- /VideoEffects.xcodeproj/project.xcworkspace/contents.xcworkspacedata: 
-------------------------------------------------------------------------------- 1 | 2 | 4 | 6 | 7 | 8 | -------------------------------------------------------------------------------- /VideoEffects.xcodeproj/xcuserdata/simongladman.xcuserdatad/xcschemes/VideoEffects.xcscheme: -------------------------------------------------------------------------------- 1 | 2 | 5 | 8 | 9 | 15 | 21 | 22 | 23 | 24 | 25 | 30 | 31 | 32 | 33 | 39 | 40 | 41 | 42 | 43 | 44 | 54 | 56 | 62 | 63 | 64 | 65 | 66 | 67 | 73 | 75 | 81 | 82 | 83 | 84 | 86 | 87 | 90 | 91 | 92 | -------------------------------------------------------------------------------- /VideoEffects.xcodeproj/xcuserdata/simongladman.xcuserdatad/xcschemes/xcschememanagement.plist: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | SchemeUserState 6 | 7 | VideoEffects.xcscheme 8 | 9 | orderHint 10 | 0 11 | 12 | 13 | SuppressBuildableAutocreation 14 | 15 | 3E0811CF1CD20F8C001581D3 16 | 17 | primary 18 | 19 | 20 | 21 | 22 | 23 | -------------------------------------------------------------------------------- /VideoEffects/AppDelegate.swift: -------------------------------------------------------------------------------- 1 | // 2 | // AppDelegate.swift 3 | // VideoEffects 4 | // 5 | // Created by Simon Gladman on 28/04/2016. 6 | // Copyright © 2016 Simon Gladman. All rights reserved. 7 | // 8 | 9 | import UIKit 10 | 11 | @UIApplicationMain 12 | class AppDelegate: UIResponder, UIApplicationDelegate { 13 | 14 | var window: UIWindow? 15 | 16 | 17 | func application(application: UIApplication, didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool { 18 | // Override point for customization after application launch. 19 | return true 20 | } 21 | 22 | func applicationWillResignActive(application: UIApplication) { 23 | // Sent when the application is about to move from active to inactive state. 
This can occur for certain types of temporary interruptions (such as an incoming phone call or SMS message) or when the user quits the application and it begins the transition to the background state. 24 | // Use this method to pause ongoing tasks, disable timers, and throttle down OpenGL ES frame rates. Games should use this method to pause the game. 25 | } 26 | 27 | func applicationDidEnterBackground(application: UIApplication) { 28 | // Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later. 29 | // If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits. 30 | } 31 | 32 | func applicationWillEnterForeground(application: UIApplication) { 33 | // Called as part of the transition from the background to the inactive state; here you can undo many of the changes made on entering the background. 34 | } 35 | 36 | func applicationDidBecomeActive(application: UIApplication) { 37 | // Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface. 38 | } 39 | 40 | func applicationWillTerminate(application: UIApplication) { 41 | // Called when the application is about to terminate. Save data if appropriate. See also applicationDidEnterBackground:. 
42 | } 43 | 44 | 45 | } 46 | 47 |
--------------------------------------------------------------------------------
/VideoEffects/Assets.xcassets/AppIcon.appiconset/Contents.json:
--------------------------------------------------------------------------------
1 | { 2 | "images" : [ 3 | { 4 | "idiom" : "ipad", 5 | "size" : "29x29", 6 | "scale" : "1x" 7 | }, 8 | { 9 | "idiom" : "ipad", 10 | "size" : "29x29", 11 | "scale" : "2x" 12 | }, 13 | { 14 | "idiom" : "ipad", 15 | "size" : "40x40", 16 | "scale" : "1x" 17 | }, 18 | { 19 | "idiom" : "ipad", 20 | "size" : "40x40", 21 | "scale" : "2x" 22 | }, 23 | { 24 | "idiom" : "ipad", 25 | "size" : "76x76", 26 | "scale" : "1x" 27 | }, 28 | { 29 | "idiom" : "ipad", 30 | "size" : "76x76", 31 | "scale" : "2x" 32 | } 33 | ], 34 | "info" : { 35 | "version" : 1, 36 | "author" : "xcode" 37 | } 38 | }
--------------------------------------------------------------------------------
/VideoEffects/Base.lproj/LaunchScreen.storyboard:
--------------------------------------------------------------------------------
[XML storyboard content not preserved in this listing]
--------------------------------------------------------------------------------
/VideoEffects/Base.lproj/Main.storyboard:
--------------------------------------------------------------------------------
[XML storyboard content not preserved in this listing]
--------------------------------------------------------------------------------
/VideoEffects/FilteredVideoVendor.swift:
--------------------------------------------------------------------------------
1 | // FilteredVideoVendor.swift 2 | // VideoEffects 3 | // 4 | // Created by Simon Gladman on 17/04/2016. 5 | // Copyright © 2016 Simon Gladman. All rights reserved.
6 | // 7 | 8 | import UIKit 9 | import MobileCoreServices 10 | import AVFoundation 11 | 12 | class FilteredVideoVendor: NSObject { 13 | 14 | static let pixelBufferAttributes: [String:AnyObject] = [ 15 | String(kCVPixelBufferPixelFormatTypeKey): NSNumber(unsignedInt: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)] 16 | 17 | let ciContext = CIContext() 18 | 19 | var videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: FilteredVideoVendor.pixelBufferAttributes) 20 | var player: AVPlayer? 21 | var videoTransform: CGAffineTransform? 22 | var unfilteredImage: CIImage? 23 | var currentURL: NSURL? 24 | var failedPixelBufferForItemTimeCount = 0 25 | 26 | weak var delegate: FilteredVideoVendorDelegate? 27 | 28 | var ciFilter: CIFilter? { 29 | didSet { 30 | displayFilteredImage() 31 | } 32 | } 33 | 34 | var paused = true { 35 | didSet { 36 | displayLink.paused = paused 37 | 38 | if displayLink.paused { 39 | player?.pause() 40 | } 41 | else { 42 | player?.play() 43 | } 44 | } 45 | } 46 | 47 | lazy var displayLink: CADisplayLink = { 48 | [unowned self] in 49 | 50 | let displayLink = CADisplayLink( 51 | target: self, 52 | selector: #selector(FilteredVideoVendor.step(_:))) 53 | 54 | displayLink.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSDefaultRunLoopMode) 55 | displayLink.paused = true 56 | 57 | return displayLink 58 | }() 59 | 60 | func openMovie(url: NSURL){ 61 | player = AVPlayer(URL: url) 62 | 63 | guard let player = player, 64 | currentItem = player.currentItem, 65 | videoTrack = currentItem.asset.tracksWithMediaType(AVMediaTypeVideo).first else { 66 | fatalError("** unable to access item **") 67 | } 68 | 69 | currentURL = url 70 | failedPixelBufferForItemTimeCount = 0 71 | 72 | currentItem.addOutput(videoOutput) 73 | 74 | videoTransform = CGAffineTransformInvert(videoTrack.preferredTransform) 75 | 76 | player.muted = true 77 | } 78 | 79 | func gotoNormalisedTime(normalisedTime: Double) { 80 | guard let player = player else { 81 | return 82 | } 83 | 84 | let 
timeSeconds = player.currentItem!.asset.duration.seconds * normalisedTime 85 | 86 | let time = CMTimeMakeWithSeconds(timeSeconds, 600) 87 | 88 | player.seekToTime( 89 | time, 90 | toleranceBefore: kCMTimeZero, 91 | toleranceAfter: kCMTimeZero) 92 | 93 | displayVideoFrame(time) 94 | } 95 | 96 | // MARK: Main playback loop 97 | func step(link: CADisplayLink) { 98 | guard let player = player, 99 | currentItem = player.currentItem else { 100 | return 101 | } 102 | 103 | let itemTime = videoOutput.itemTimeForHostTime(CACurrentMediaTime()) 104 | 105 | displayVideoFrame(itemTime) 106 | 107 | let normalisedTime = Float(itemTime.seconds / currentItem.asset.duration.seconds) 108 | 109 | delegate?.vendorNormalisedTimeUpdated(normalisedTime) 110 | 111 | if normalisedTime >= 1.0 112 | { 113 | paused = true 114 | } 115 | } 116 | 117 | func displayVideoFrame(time: CMTime) { 118 | guard let player = player, 119 | currentItem = player.currentItem where player.status == .ReadyToPlay && currentItem.status == .ReadyToPlay else { 120 | return 121 | } 122 | 123 | if videoOutput.hasNewPixelBufferForItemTime(time) { 124 | failedPixelBufferForItemTimeCount = 0 125 | 126 | var presentationItemTime = kCMTimeZero 127 | 128 | guard let pixelBuffer = videoOutput.copyPixelBufferForItemTime( 129 | time, 130 | itemTimeForDisplay: &presentationItemTime) else { 131 | return 132 | } 133 | 134 | unfilteredImage = CIImage(CVImageBuffer: pixelBuffer) 135 | 136 | displayFilteredImage() 137 | } 138 | else if let currentURL = currentURL where !paused { 139 | failedPixelBufferForItemTimeCount += 1 140 | 141 | if failedPixelBufferForItemTimeCount > 12 { // the video output has stopped vending pixel buffers; reopening the movie recovers playback 142 | openMovie(currentURL) 143 | } 144 | } 145 | } 146 | 147 | func displayFilteredImage() { 148 | guard let unfilteredImage = unfilteredImage, 149 | videoTransform = videoTransform else { 150 | return 151 | } 152 | 153 | let ciImage: CIImage 154 | 155 | if let ciFilter = ciFilter { 156 | ciFilter.setValue(unfilteredImage, forKey: kCIInputImageKey) 157 | 158
| ciImage = ciFilter.outputImage!.imageByApplyingTransform(videoTransform) 159 | } 160 | else { 161 | ciImage = unfilteredImage.imageByApplyingTransform(videoTransform) 162 | } 163 | 164 | let cgImage = ciContext.createCGImage( 165 | ciImage, 166 | fromRect: ciImage.extent) 167 | 168 | delegate?.finalOutputUpdated(UIImage(CGImage: cgImage)) 169 | } 170 | 171 | } 172 | 173 | protocol FilteredVideoVendorDelegate: class { 174 | func finalOutputUpdated(image: UIImage) 175 | func vendorNormalisedTimeUpdated(normalisedTime: Float) 176 | } -------------------------------------------------------------------------------- /VideoEffects/FilteredVideoWriter.swift: -------------------------------------------------------------------------------- 1 | // FilteredVideoWriter.swift 2 | // VideoEffects 3 | // 4 | // Created by Simon Gladman on 17/04/2016. 5 | // Copyright © 2016 Simon Gladman. All rights reserved. 6 | // 7 | 8 | import MobileCoreServices 9 | import AVFoundation 10 | import CoreImage 11 | import UIKit 12 | 13 | class FilteredVideoWriter: NSObject { 14 | lazy var media_queue: dispatch_queue_t = { 15 | return dispatch_queue_create("mediaInputQueue", nil) 16 | }() 17 | 18 | /// `timeDateFormatter` is used when generating a file name for the 19 | /// temporary file when creating the final output 20 | let timeDateFormatter: NSDateFormatter = { 21 | let formatter = NSDateFormatter() 22 | 23 | formatter.dateFormat = "yyyyMMdd_HHmmss" 24 | 25 | return formatter 26 | }() 27 | 28 | let ciContext = CIContext() 29 | 30 | weak var delegate: FilteredVideoWriterDelegate? 31 | 32 | let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask) 33 | var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor? 34 | var videoWriterInput: AVAssetWriterInput? 35 | var videoWriter: AVAssetWriter? 36 | var videoOutputURL: NSURL? 37 | var player: AVPlayer? 38 | var ciFilter: CIFilter? 39 | var videoTransform: CGAffineTransform? 
40 | var videoOutput: AVPlayerItemVideoOutput? 41 | 42 | /// Initialises the objects required to save the final video output and begins writing 43 | func beginSaving(player player: AVPlayer, ciFilter: CIFilter, videoTransform: CGAffineTransform, videoOutput: AVPlayerItemVideoOutput) { 44 | 45 | self.player = player 46 | self.ciFilter = ciFilter 47 | self.videoTransform = videoTransform 48 | self.videoOutput = videoOutput 49 | 50 | guard let currentItem = player.currentItem else { 51 | return 52 | } 53 | 54 | guard let documentDirectory: NSURL = urls.first else { 55 | fatalError("** unable to access document directory **") 56 | } 57 | 58 | videoOutputURL = documentDirectory.URLByAppendingPathComponent("Output_\(timeDateFormatter.stringFromDate(NSDate())).mp4") 59 | 60 | do { 61 | videoWriter = try AVAssetWriter(URL: videoOutputURL!, fileType: AVFileTypeMPEG4) 62 | } 63 | catch { 64 | fatalError("** unable to create asset writer **") 65 | } 66 | 67 | let outputSettings: [String : AnyObject] = [ 68 | AVVideoCodecKey: AVVideoCodecH264, 69 | AVVideoWidthKey: currentItem.presentationSize.width, 70 | AVVideoHeightKey: currentItem.presentationSize.height] 71 | 72 | guard videoWriter!.canApplyOutputSettings(outputSettings, forMediaType: AVMediaTypeVideo) else { 73 | fatalError("** unable to apply video settings ** ") 74 | } 75 | 76 | videoWriterInput = AVAssetWriterInput( 77 | mediaType: AVMediaTypeVideo, 78 | outputSettings: outputSettings) 79 | 80 | if videoWriter!.canAddInput(videoWriterInput!) { 81 | videoWriter!.addInput(videoWriterInput!) 
82 | } 83 | else { 84 | fatalError("** unable to add input **") 85 | } 86 | 87 | let sourcePixelBufferAttributesDictionary = [ 88 | String(kCVPixelBufferPixelFormatTypeKey) : Int(kCVPixelFormatType_32BGRA), 89 | String(kCVPixelBufferWidthKey) : currentItem.presentationSize.width, 90 | String(kCVPixelBufferHeightKey) : currentItem.presentationSize.height, 91 | String(kCVPixelBufferOpenGLESCompatibilityKey) : kCFBooleanTrue 92 | ] 93 | 94 | assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor( 95 | assetWriterInput: videoWriterInput!, 96 | sourcePixelBufferAttributes: sourcePixelBufferAttributesDictionary) 97 | 98 | if videoWriter!.startWriting() { 99 | videoWriter!.startSessionAtSourceTime(kCMTimeZero) 100 | } 101 | 102 | player.seekToTime( 103 | CMTimeMakeWithSeconds(0, 600), 104 | toleranceBefore: kCMTimeZero, 105 | toleranceAfter: kCMTimeZero) 106 | { 107 | _ in self.writeVideoFrames() 108 | } 109 | } 110 | 111 | /// Writes video frames to videoOutputURL 112 | func writeVideoFrames() { 113 | 114 | guard let player = player, 115 | assetWriterPixelBufferInput = assetWriterPixelBufferInput, 116 | pixelBufferPool = assetWriterPixelBufferInput.pixelBufferPool, 117 | currentItem = player.currentItem, 118 | duration = player.currentItem?.asset.duration, 119 | ciFilter = ciFilter, 120 | videoWriter = videoWriter, 121 | videoWriterInput = videoWriterInput, 122 | videoOutputURL = videoOutputURL, 123 | videoTransform = videoTransform, 124 | videoOutput = videoOutput, 125 | frameRate = currentItem.asset.tracksWithMediaType(AVMediaTypeVideo).first?.nominalFrameRate else { 126 | return 127 | } 128 | 129 | assetWriterPixelBufferInput.assetWriterInput.requestMediaDataWhenReadyOnQueue(media_queue) { 130 | 131 | let numberOfFrames = Int(duration.seconds * Double(frameRate)) 132 | 133 | for frameNumber in 0 ..< numberOfFrames { 134 | 135 | NSThread.sleepForTimeInterval(0.05) // brief pause so the writer input can keep pace with appended frames 136 | 137 | dispatch_async(dispatch_get_main_queue()) { 138 |
self.delegate?.updateSaveProgress(Float(frameNumber) / Float(numberOfFrames)) 139 | } 140 | 141 | if videoOutput.hasNewPixelBufferForItemTime(currentItem.currentTime()) { 142 | var presentationItemTime = kCMTimeZero 143 | 144 | if let pixelBuffer = videoOutput.copyPixelBufferForItemTime( 145 | currentItem.currentTime(), 146 | itemTimeForDisplay: &presentationItemTime) { 147 | 148 | let ciImage = CIImage(CVImageBuffer: pixelBuffer).imageByApplyingTransform(videoTransform) 149 | let positionTransform = CGAffineTransformMakeTranslation(-ciImage.extent.origin.x, -ciImage.extent.origin.y) 150 | let transformedImage = ciImage.imageByApplyingTransform(positionTransform) 151 | 152 | ciFilter.setValue(transformedImage, forKey: kCIInputImageKey) 153 | 154 | var newPixelBuffer: CVPixelBuffer? = nil 155 | 156 | CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferPool, &newPixelBuffer) 157 | 158 | self.ciContext.render( 159 | ciFilter.outputImage!, 160 | toCVPixelBuffer: newPixelBuffer!, 161 | bounds: ciFilter.outputImage!.extent, 162 | colorSpace: nil) 163 | 164 | assetWriterPixelBufferInput.appendPixelBuffer( 165 | newPixelBuffer!, 166 | withPresentationTime: presentationItemTime) 167 | } 168 | } 169 | 170 | currentItem.stepByCount(1) // advance the paused player by one frame so the output vends the next buffer 171 | } 172 | 173 | videoWriterInput.markAsFinished() 174 | 175 | videoWriter.finishWritingWithCompletionHandler { 176 | player.seekToTime( 177 | CMTimeMakeWithSeconds(0, 600), 178 | toleranceBefore: kCMTimeZero, 179 | toleranceAfter: kCMTimeZero) 180 | 181 | dispatch_async(dispatch_get_main_queue()) { 182 | UISaveVideoAtPathToSavedPhotosAlbum( 183 | videoOutputURL.relativePath!, 184 | self, 185 | #selector(FilteredVideoWriter.video(_:didFinishSavingWithError:contextInfo:)), 186 | nil) 187 | } 188 | } 189 | } 190 | 191 | } 192 | 193 | // UISaveVideoAtPathToSavedPhotosAlbum completion 194 | func video(videoPath: NSString, didFinishSavingWithError error: NSError?, contextInfo info: AnyObject) 195 | { 196 | if let videoOutputURL =
videoOutputURL where NSFileManager.defaultManager().isDeletableFileAtPath(videoOutputURL.relativePath!) 197 | { 198 | try! NSFileManager.defaultManager().removeItemAtURL(videoOutputURL) 199 | } 200 | 201 | assetWriterPixelBufferInput = nil 202 | videoWriterInput = nil 203 | videoWriter = nil 204 | videoOutputURL = nil 205 | 206 | delegate?.saveComplete() 207 | } 208 | } 209 | 210 | protocol FilteredVideoWriterDelegate: class { 211 | func updateSaveProgress(progress: Float) 212 | func saveComplete() 213 | } 214 | 215 | 216 |
--------------------------------------------------------------------------------
/VideoEffects/Info.plist:
--------------------------------------------------------------------------------
1 | <?xml version="1.0" encoding="UTF-8"?> 2 | <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd"> 3 | <plist version="1.0"> 4 | <dict> 5 | <key>CFBundleDevelopmentRegion</key> 6 | <string>en</string> 7 | <key>CFBundleExecutable</key> 8 | <string>$(EXECUTABLE_NAME)</string> 9 | <key>CFBundleIdentifier</key> 10 | <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string> 11 | <key>CFBundleInfoDictionaryVersion</key> 12 | <string>6.0</string> 13 | <key>CFBundleName</key> 14 | <string>$(PRODUCT_NAME)</string> 15 | <key>CFBundlePackageType</key> 16 | <string>APPL</string> 17 | <key>CFBundleShortVersionString</key> 18 | <string>1.0</string> 19 | <key>CFBundleSignature</key> 20 | <string>????</string> 21 | <key>CFBundleVersion</key> 22 | <string>1</string> 23 | <key>LSRequiresIPhoneOS</key> 24 | <true/> 25 | <key>UILaunchStoryboardName</key> 26 | <string>LaunchScreen</string> 27 | <key>UIMainStoryboardFile</key> 28 | <string>Main</string> 29 | <key>UIRequiredDeviceCapabilities</key> 30 | <array> 31 | <string>armv7</string> 32 | </array> 33 | <key>UISupportedInterfaceOrientations~ipad</key> 34 | <array> 35 | <string>UIInterfaceOrientationPortrait</string> 36 | <string>UIInterfaceOrientationPortraitUpsideDown</string> 37 | <string>UIInterfaceOrientationLandscapeLeft</string> 38 | <string>UIInterfaceOrientationLandscapeRight</string> 39 | </array> 40 | </dict> 41 | </plist> 42 |
--------------------------------------------------------------------------------
/VideoEffects/VideoEffectsControlPanel.swift:
--------------------------------------------------------------------------------
1 | // VideoEffectsControlPanel.swift 2 | // VideoEffects 3 | // 4 | // Created by Simon Gladman on 17/04/2016. 5 | // Copyright © 2016 Simon Gladman. All rights reserved.
6 | // 7 | 8 | import UIKit 9 | import MobileCoreServices 10 | 11 | class VideoEffectsControlPanel: UIControl 12 | { 13 | static let PlayPauseControlEvent = UIControlEvents(rawValue: 0b0001 << 24) 14 | static let LoadControlEvent = UIControlEvents(rawValue: 0b0010 << 24) 15 | static let SaveControlEvent = UIControlEvents(rawValue: 0b0100 << 24) 16 | static let ScrubControlEvent = UIControlEvents(rawValue: 0b1000 << 24) 17 | static let FilterChangeControlEvent = UIControlEvents.ValueChanged 18 | 19 | lazy var toolbar: UIToolbar = { 20 | [unowned self] in 21 | 22 | let toolbar = UIToolbar() 23 | 24 | let flexibleSpacer = UIBarButtonItem( 25 | barButtonSystemItem: .FlexibleSpace, 26 | target: nil, 27 | action: nil) 28 | 29 | let playButton = UIBarButtonItem( 30 | barButtonSystemItem: .Play, 31 | target: self, 32 | action: #selector(VideoEffectsControlPanel.play)) 33 | 34 | let pauseButton = UIBarButtonItem( 35 | barButtonSystemItem: .Pause, 36 | target: nil, 37 | action: #selector(VideoEffectsControlPanel.pause)) 38 | 39 | let loadButton = UIBarButtonItem( 40 | title: "Load", 41 | style: .Plain, 42 | target: nil, 43 | action: #selector(VideoEffectsControlPanel.load)) 44 | 45 | let saveButton = UIBarButtonItem( 46 | title: "Save", 47 | style: .Plain, 48 | target: nil, 49 | action: #selector(VideoEffectsControlPanel.save)) 50 | 51 | saveButton.enabled = false 52 | playButton.enabled = false 53 | pauseButton.enabled = false 54 | 55 | var items = [playButton, pauseButton, flexibleSpacer] + self.filterButtons + [flexibleSpacer, loadButton, saveButton] 56 | 57 | self.filterButtons.forEach{ 58 | $0.enabled = false 59 | } 60 | 61 | self.filterButtons[0].style = .Done 62 | 63 | toolbar.setItems( 64 | items, 65 | animated: false) 66 | 67 | return toolbar 68 | }() 69 | 70 | lazy var scrubber: UISlider = { 71 | [unowned self] in 72 | 73 | let slider = UISlider() 74 | 75 | slider.maximumTrackTintColor = UIColor.lightGrayColor() 76 | slider.minimumTrackTintColor = 
UIColor.lightGrayColor() 77 | 78 | slider.addTarget( 79 | self, 80 | action: #selector(VideoEffectsControlPanel.scrubberHandler), 81 | forControlEvents: .ValueChanged) 82 | 83 | slider.enabled = false 84 | 85 | return slider 86 | }() 87 | 88 | lazy var controlsStackView: UIStackView = { 89 | [unowned self] in 90 | 91 | let stackview = UIStackView() 92 | 93 | stackview.axis = .Vertical 94 | stackview.addArrangedSubview(self.scrubber) 95 | stackview.addArrangedSubview(self.toolbar) 96 | 97 | return stackview 98 | }() 99 | 100 | lazy var filterButtons: [UIBarButtonItem] = { 101 | [unowned self] in 102 | 103 | return self.filterDisplayNames.map { 104 | UIBarButtonItem( 105 | title: $0, 106 | style: .Plain, 107 | target: self, 108 | action: #selector(VideoEffectsControlPanel.setFilter(_:))) 109 | } 110 | }() 111 | 112 | lazy var imagePicker: UIImagePickerController = { 113 | [unowned self] in 114 | 115 | let imagePicker = UIImagePickerController() 116 | 117 | imagePicker.delegate = self 118 | imagePicker.allowsEditing = false 119 | imagePicker.modalInPopover = true 120 | imagePicker.sourceType = .PhotoLibrary 121 | imagePicker.mediaTypes = [kUTTypeMovie as String] 122 | 123 | return imagePicker 124 | }() 125 | 126 | let filterDisplayNames = [ 127 | "None", "Chrome", "Fade", "Instant", "Mono", "Noir", "Process", "Tonal", "Transfer"] 128 | 129 | var playButton: UIBarButtonItem { 130 | return toolbar.items!.first! as UIBarButtonItem 131 | } 132 | 133 | var pauseButton: UIBarButtonItem { 134 | return toolbar.items![1] as UIBarButtonItem 135 | } 136 | 137 | var saveButton: UIBarButtonItem { 138 | return toolbar.items!.last! as UIBarButtonItem 139 | } 140 | 141 | var normalisedTime: Double 142 | { 143 | set { 144 | scrubber.value = Float(newValue) 145 | } 146 | get { 147 | return Double(scrubber.value) 148 | } 149 | } 150 | 151 | var rootViewController: UIViewController { 152 | return UIApplication.sharedApplication().keyWindow!.rootViewController! 
153 | } 154 | 155 | private (set) var url: NSURL? 156 | private (set) var filterDisplayName = "None" 157 | 158 | var paused = true { 159 | didSet { 160 | playButton.enabled = paused 161 | pauseButton.enabled = !paused 162 | } 163 | } 164 | 165 | override var enabled: Bool { 166 | didSet { 167 | alpha = enabled ? 1 : 0.2 168 | } 169 | } 170 | 171 | override init(frame: CGRect) { 172 | super.init(frame: frame) 173 | 174 | addSubview(controlsStackView) 175 | } 176 | 177 | required init?(coder aDecoder: NSCoder) { 178 | fatalError("init(coder:) has not been implemented") 179 | } 180 | 181 | func play() { 182 | paused = false 183 | sendActionsForControlEvents(VideoEffectsControlPanel.PlayPauseControlEvent) 184 | } 185 | 186 | func pause() { 187 | paused = true 188 | sendActionsForControlEvents(VideoEffectsControlPanel.PlayPauseControlEvent) 189 | } 190 | 191 | func load() { 192 | paused = true 193 | sendActionsForControlEvents(VideoEffectsControlPanel.PlayPauseControlEvent) 194 | 195 | rootViewController.presentViewController(imagePicker, animated: true, completion: nil) 196 | } 197 | 198 | func save() { 199 | sendActionsForControlEvents(VideoEffectsControlPanel.SaveControlEvent) 200 | } 201 | 202 | func scrubberHandler() { 203 | paused = true 204 | sendActionsForControlEvents(VideoEffectsControlPanel.ScrubControlEvent) 205 | } 206 | 207 | func setFilter(barButtonItem: UIBarButtonItem) 208 | { 209 | guard let 210 | filterDisplayName = barButtonItem.title, 211 | filterIndex = filterDisplayNames.indexOf(filterDisplayName) else { 212 | return 213 | } 214 | 215 | filterButtons.forEach{ 216 | $0.style = .Plain 217 | } 218 | filterButtons[filterIndex].style = .Done 219 | 220 | self.filterDisplayName = filterDisplayName 221 | 222 | sendActionsForControlEvents(VideoEffectsControlPanel.FilterChangeControlEvent) 223 | } 224 | 225 | override func layoutSubviews() { 226 | controlsStackView.frame = bounds 227 | 228 | controlsStackView.spacing = 20 229 | } 230 | } 231 | 232 | // 
MARK: UIImagePickerControllerDelegate, UINavigationControllerDelegate 233 | 234 | extension VideoEffectsControlPanel: UIImagePickerControllerDelegate, UINavigationControllerDelegate { 235 | func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]){ 236 | defer { 237 | rootViewController.dismissViewControllerAnimated(true, completion: nil) 238 | } 239 | 240 | guard let url = info[UIImagePickerControllerMediaURL] as? NSURL else { 241 | return 242 | } 243 | 244 | self.url = url 245 | 246 | sendActionsForControlEvents(VideoEffectsControlPanel.LoadControlEvent) 247 | } 248 | } 249 | 250 | -------------------------------------------------------------------------------- /VideoEffects/VideoEffectsView.swift: -------------------------------------------------------------------------------- 1 | // VideoEffectsView.swift 2 | // VideoEffects 3 | // 4 | // Created by Simon Gladman on 17/04/2016. 5 | // Copyright © 2016 Simon Gladman. All rights reserved. 
6 | // 7 | 8 | import UIKit 9 | import AVFoundation 10 | 11 | class VideoEffectsView: UIView 12 | { 13 | 14 | // MARK: Video filtering components 15 | 16 | lazy var filteredVideoVendor: FilteredVideoVendor = { 17 | [unowned self] in 18 | 19 | let vendor = FilteredVideoVendor() 20 | vendor.delegate = self 21 | 22 | return vendor 23 | }() 24 | 25 | lazy var filteredVideoWriter: FilteredVideoWriter = { 26 | [unowned self] in 27 | 28 | let writer = FilteredVideoWriter() 29 | writer.delegate = self 30 | 31 | return writer 32 | }() 33 | 34 | // MARK: User Interface components 35 | 36 | let progressBar = UIProgressView(progressViewStyle: .Default) 37 | let imageView = UIImageView() 38 | 39 | lazy var controlPanel: VideoEffectsControlPanel = { 40 | [unowned self] in 41 | 42 | let controlPanel = VideoEffectsControlPanel() 43 | 44 | controlPanel.addTarget( 45 | self, 46 | action: #selector(VideoEffectsView.openMovie), 47 | forControlEvents: VideoEffectsControlPanel.LoadControlEvent) 48 | 49 | controlPanel.addTarget( 50 | self, 51 | action: #selector(VideoEffectsView.playPauseToggle), 52 | forControlEvents: VideoEffectsControlPanel.PlayPauseControlEvent) 53 | 54 | controlPanel.addTarget( 55 | self, 56 | action: #selector(VideoEffectsView.save), 57 | forControlEvents: VideoEffectsControlPanel.SaveControlEvent) 58 | 59 | controlPanel.addTarget( 60 | self, 61 | action: #selector(VideoEffectsView.filterChange), 62 | forControlEvents: VideoEffectsControlPanel.FilterChangeControlEvent) 63 | 64 | controlPanel.addTarget( 65 | self, 66 | action: #selector(VideoEffectsView.scrub), 67 | forControlEvents: VideoEffectsControlPanel.ScrubControlEvent) 68 | 69 | return controlPanel 70 | }() 71 | 72 | // MARK: CIFilter 73 | 74 | var ciFilter: CIFilter? 
{ 75 | didSet { 76 | controlPanel.saveButton.enabled = ciFilter != nil 77 | 78 | filteredVideoVendor.ciFilter = ciFilter 79 | } 80 | } 81 | 82 | // MARK: State variables 83 | 84 | var saving = false { 85 | didSet { 86 | backgroundColor = saving ? UIColor.darkGrayColor() : UIColor.whiteColor() 87 | 88 | imageView.alpha = saving ? 0.2 : 1 89 | 90 | controlPanel.enabled = !saving 91 | 92 | progressBar.hidden = !saving 93 | 94 | if let player = filteredVideoVendor.player, 95 | ciFilter = filteredVideoVendor.ciFilter, 96 | videoTransform = filteredVideoVendor.videoTransform where saving 97 | { 98 | paused = true 99 | filteredVideoWriter.beginSaving( 100 | player: player, 101 | ciFilter: ciFilter, 102 | videoTransform: videoTransform, 103 | videoOutput: filteredVideoVendor.videoOutput) 104 | } 105 | } 106 | } 107 | 108 | var paused = true { 109 | didSet { 110 | controlPanel.paused = paused 111 | filteredVideoVendor.paused = paused 112 | } 113 | } 114 | 115 | // MARK: Control panel event handlers 116 | 117 | func playPauseToggle() { 118 | paused = controlPanel.paused 119 | } 120 | 121 | func save() { 122 | saving = true 123 | } 124 | 125 | func filterChange() { 126 | ciFilter = CIFilter(name: "CIPhotoEffect" + controlPanel.filterDisplayName) 127 | } 128 | 129 | func scrub() { 130 | paused = true 131 | 132 | filteredVideoVendor.gotoNormalisedTime(controlPanel.normalisedTime) 133 | } 134 | 135 | func openMovie(){ 136 | guard let url = controlPanel.url else { 137 | return 138 | } 139 | 140 | filteredVideoVendor.openMovie(url) 141 | 142 | controlPanel.filterButtons.forEach{ 143 | $0.enabled = true 144 | } 145 | 146 | controlPanel.scrubber.enabled = true 147 | controlPanel.saveButton.enabled = ciFilter != nil 148 | 149 | self.paused = false 150 | } 151 | 152 | // MARK: Overridden UI methods 153 | 154 | override func didMoveToWindow() { 155 | super.didMoveToWindow() 156 | 157 | imageView.contentMode = .ScaleAspectFit 158 | progressBar.hidden = true 159 | 160 | 
addSubview(imageView) 161 | addSubview(controlPanel) 162 | addSubview(progressBar) 163 | } 164 | 165 | override func layoutSubviews() { 166 | super.layoutSubviews() 167 | 168 | let controlsStackViewHeight: CGFloat = 90 169 | 170 | imageView.frame = CGRect( 171 | x: 0, 172 | y: 0, 173 | width: frame.width, 174 | height: frame.height - controlsStackViewHeight).insetBy(dx: 10, dy: 10) 175 | 176 | controlPanel.frame = CGRect( 177 | x: 0, 178 | y: frame.height - controlsStackViewHeight, 179 | width: frame.width, 180 | height: controlsStackViewHeight) 181 | 182 | progressBar.frame = CGRect( 183 | x: 0, 184 | y: frame.midY, 185 | width: frame.width, height: 20).insetBy(dx: 20, dy: 0) 186 | } 187 | 188 | } 189 | 190 | // MARK: FilteredVideoVendorDelegate 191 | 192 | extension VideoEffectsView: FilteredVideoVendorDelegate { 193 | 194 | func finalOutputUpdated(image: UIImage) { 195 | imageView.image = image 196 | } 197 | 198 | func vendorNormalisedTimeUpdated(normalisedTime: Float) { 199 | controlPanel.normalisedTime = Double(normalisedTime) 200 | } 201 | } 202 | 203 | // MARK: FilteredVideoWriterDelegate 204 | 205 | extension VideoEffectsView: FilteredVideoWriterDelegate { 206 | 207 | func updateSaveProgress(progress: Float) { 208 | progressBar.setProgress(progress, animated: true) 209 | } 210 | 211 | func saveComplete() { 212 | progressBar.setProgress(0, animated: false) 213 | saving = false 214 | controlPanel.normalisedTime = 0 215 | paused = false 216 | } 217 | } 218 | -------------------------------------------------------------------------------- /VideoEffects/ViewController.swift: -------------------------------------------------------------------------------- 1 | // 2 | // ViewController.swift 3 | // VideoEffects 4 | // 5 | // Created by Simon Gladman on 28/04/2016. 6 | // Copyright © 2016 Simon Gladman. All rights reserved. 
7 | // 8 | 9 | import UIKit 10 | 11 | class ViewController: UIViewController { 12 | 13 | let videoEffectsView = VideoEffectsView() 14 | 15 | override func viewDidLoad() { 16 | super.viewDidLoad() 17 | 18 | view.addSubview(videoEffectsView) 19 | } 20 | 21 | override func viewDidLayoutSubviews() { 22 | videoEffectsView.frame = CGRect( 23 | x: 0, 24 | y: topLayoutGuide.length, 25 | width: view.frame.width, 26 | height: view.frame.height - topLayoutGuide.length) 27 | } 28 | 29 | } 30 | 31 | --------------------------------------------------------------------------------