# NextLevel

NextLevel is a Swift camera system designed for easy integration, customized media capture, and image streaming in iOS. Integration can optionally leverage AVFoundation or ARKit.
Features | |
---|---|
🎬 | “Vine-like” video clip recording and editing |
🖼 | photo capture (RAW, JPEG, and video frame) |
👆 | customizable gestural interaction and interface |
💠 | ARKit integration (beta) |
📷 | dual, wide angle, telephoto, & true depth support |
🐢 | adjustable frame rate on supported hardware (i.e., fast/slow motion capture) |
🎢 | depth data capture support & portrait effects matte support |
🔍 | video zoom |
⚖ | white balance, focus, and exposure adjustment |
🔦 | flash and torch support |
👯 | mirroring support |
☀ | low light boost |
🕶 | smooth auto-focus |
⚙ | configurable encoding and compression settings |
🛠 | simple media capture and editing API |
🌀 | extensible API for image processing and CV |
🐈 | animated GIF creator |
😎 | face, QR code, and barcode recognition |
🐦 | Swift 5 |
Need a different version of Swift?

- 5.0 - Target your Podfile to the latest release or master
- 4.2 - Target your Podfile to the `swift4.2` branch
```
# CocoaPods
pod "NextLevel", "~> 0.16.3"

# Carthage
github "nextlevel/NextLevel" ~> 0.16.3

# Swift PM
let package = Package(
    dependencies: [
        .Package(url: "https://github.com/nextlevel/NextLevel", majorVersion: 0)
    ]
)
```
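The Swift PM line above uses the legacy package-manifest API; with a current `Package.swift` the dependency is usually declared roughly as follows (package and target names here are hypothetical, and the version is only an example):

```swift
// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyCameraApp",  // hypothetical package name
    dependencies: [
        // example version requirement; pin to the NextLevel release you need
        .package(url: "https://github.com/nextlevel/NextLevel", from: "0.16.3")
    ],
    targets: [
        .target(name: "MyCameraApp", dependencies: ["NextLevel"])
    ]
)
```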
Alternatively, drop the NextLevel source files or project file into your Xcode project.
ARKit and the True Depth Camera software features are enabled with the inclusion of the Swift compiler flags `USE_ARKIT` and `USE_TRUE_DEPTH`, respectively.

Apple will reject apps that link against ARKit or the True Depth Camera API and do not use them.

If you use CocoaPods, you can include `-D USE_ARKIT` or `-D USE_TRUE_DEPTH` with the following `Podfile` addition or by adding it to your Xcode build settings.
```ruby
post_install do |installer|
  installer.pods_project.targets.each do |target|
    # setup NextLevel for ARKit use
    if target.name == 'NextLevel'
      target.build_configurations.each do |config|
        config.build_settings['OTHER_SWIFT_FLAGS'] = ['$(inherited)', '-DUSE_ARKIT']
      end
    end
  end
end
```
Before starting, ensure that permission keys have been added to your app's `Info.plist`.
```xml
<key>NSCameraUsageDescription</key>
<string>Allowing access to the camera lets you take photos and videos.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Allowing access to the microphone lets you record audio.</string>
```
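The Info.plist keys only describe why access is needed; at runtime the user still has to grant camera and microphone access. A minimal sketch using plain AVFoundation (rather than any NextLevel-specific helper) to request both before starting the session:

```swift
import AVFoundation
import Foundation

// Request camera and microphone access before starting capture.
// (A sketch; production code should handle the denied/restricted cases in the UI.)
func requestCaptureAuthorization(_ completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { videoGranted in
        AVCaptureDevice.requestAccess(for: .audio) { audioGranted in
            DispatchQueue.main.async {
                completion(videoGranted && audioGranted)
            }
        }
    }
}
```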
Import the library.
```swift
import NextLevel
```
Set up the camera preview.
```swift
let screenBounds = UIScreen.main.bounds
self.previewView = UIView(frame: screenBounds)
if let previewView = self.previewView {
    previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    previewView.backgroundColor = UIColor.black
    NextLevel.shared.previewLayer.frame = previewView.bounds
    previewView.layer.addSublayer(NextLevel.shared.previewLayer)
    self.view.addSubview(previewView)
}
```
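If the preview container can change size (rotation, size classes), keep the preview layer's frame in sync. One way to do that in the hosting view controller, reusing the `previewView` property from the snippet above (a sketch):

```swift
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // keep the capture preview layer sized to its container view
    if let previewView = self.previewView {
        NextLevel.shared.previewLayer.frame = previewView.bounds
    }
}
```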
Configure the capture session.
```swift
override func viewDidLoad() {
    NextLevel.shared.delegate = self
    NextLevel.shared.deviceDelegate = self
    NextLevel.shared.videoDelegate = self
    NextLevel.shared.photoDelegate = self

    // modify .videoConfiguration, .audioConfiguration, .photoConfiguration properties
    // Compression, resolution, and maximum recording time options are available
    NextLevel.shared.videoConfiguration.maximumCaptureDuration = CMTimeMakeWithSeconds(5, preferredTimescale: 600)
    NextLevel.shared.audioConfiguration.bitRate = 44000
}
```
Start and stop the session when appropriate. These methods create a new "session" instance for `NextLevel.shared.session` when called.
```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    NextLevel.shared.start()
    // …
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    NextLevel.shared.stop()
    // …
}
```
Video record/pause.
```swift
// record
NextLevel.shared.record()

// pause
NextLevel.shared.pause()
```
Editing and finalizing the recorded session.
```swift
if let session = NextLevel.shared.session {

    //..

    // undo
    session.removeLastClip()

    // various editing operations can be done using the NextLevelSession methods

    // export
    session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality, completionHandler: { (url: URL?, error: Error?) in
        if let _ = url {
            //
        } else if let _ = error {
            //
        }
    })

    //..
}
```
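What the app does with the merged file is up to you; a common next step is saving it to the photo library. A sketch of that step using the Photos framework (this assumes `NSPhotoLibraryAddUsageDescription` is set in Info.plist and photo-library add access has been granted):

```swift
import Photos

// Save an exported video file to the user's photo library (sketch).
func saveVideoToPhotoLibrary(at url: URL) {
    PHPhotoLibrary.shared().performChanges({
        // create a new video asset from the exported file
        _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
    }, completionHandler: { success, error in
        if !success {
            print("failed to save video: \(String(describing: error))")
        }
    })
}
```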
Videos can also be processed using the NextLevelSessionExporter, a media transcoding library in Swift.
NextLevel was designed for sample buffer analysis and custom modification in real time, alongside a rich set of camera features.

Note that modifications performed on a buffer and provided back to NextLevel may affect frame rate.
Enable custom rendering.
```swift
NextLevel.shared.isVideoCustomContextRenderingEnabled = true
```
Optional hook that allows reading `sampleBuffer` for analysis.
```swift
extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // video frame processing
    public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {
        // use the sampleBuffer parameter in your system for continual analysis
    }
}
```
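For example, the raw sample buffer can be handed to other frameworks for analysis. A sketch (not part of NextLevel itself) that runs a Vision face-rectangle request on a frame:

```swift
import AVFoundation
import Vision

// Analyze a raw video frame with Vision (sketch; in practice you would throttle
// this work rather than run a request on every frame).
func detectFaces(in sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return
    }
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faceCount = (request.results as? [VNFaceObservation])?.count ?? 0
        print("detected \(faceCount) face(s)")
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```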
Another optional hook allows reading buffers for modification via `imageBuffer`. This is also the recommended place to provide the buffer back to NextLevel for recording.
```swift
extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // enabled by isVideoCustomContextRenderingEnabled
    public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {
        // provide the frame back to NextLevel for recording
        if let frame = self._availableFrameBuffer {
            nextLevel.videoCustomContextImageBuffer = frame
        }
    }
}
```
NextLevel will check this property when writing buffers to a destination file. This works for both video and photos with `capturePhotoFromVideo`.
```swift
nextLevel.videoCustomContextImageBuffer = modifiedFrame
```
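Putting those pieces together, the render callback above could, for example, apply a Core Image filter to the incoming frame and hand the result back for recording. A sketch of such a helper (hypothetical; a real implementation would reuse a single `CIContext` and likely render into a separate pool-allocated buffer rather than back into the source buffer):

```swift
import CoreImage
import NextLevel

extension CameraViewController {

    // Example frame modification: apply a filter and provide the result back to NextLevel.
    func process(_ imageBuffer: CVPixelBuffer, with nextLevel: NextLevel) {
        let context = CIContext()  // reuse a shared context in production
        let input = CIImage(cvPixelBuffer: imageBuffer)
        let filtered = input.applyingFilter("CIPhotoEffectNoir")

        // write the filtered pixels back into the buffer, then hand it to NextLevel for recording
        context.render(filtered, to: imageBuffer)
        nextLevel.videoCustomContextImageBuffer = imageBuffer
    }
}
```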
NextLevel was initially a weekend project that has now grown into an open community of camera platform enthusiasts. The software provides foundational components for managing media recording, camera interface customization, gestural interaction customization, and image streaming on iOS. The same capabilities can also be found in apps such as Snapchat, Instagram, and Vine.

The goal is to continue to provide a good foundation for quick integration (enabling projects to be taken to the next level), allowing focus to be placed on the functionality that matters most, whether it's real-time image processing, computer vision methods, augmented reality, or computational photography.
NextLevel provides components for capturing ARKit video and photo. This enables a variety of new camera features while leveraging the existing recording capabilities and media management of NextLevel.
If you are trying to capture frames from SceneKit for ARKit recording, check out the examples project.
You can find the docs here. Documentation is generated with jazzy and hosted on GitHub Pages.
If you found this project to be helpful, check out the Next Level stickers.
NextLevel is a community – contributions and discussions are welcome!
- Feature idea? Open an issue.
- Found a bug? Open an issue.
- Need help? Use Stack Overflow with the tag 'nextlevel'.
- Questions? Use Stack Overflow with the tag 'nextlevel'.
- Want to contribute? Submit a pull request.
- Player (Swift), video player in Swift
- PBJVideoPlayer (obj-c), video player in obj-c
- NextLevelSessionExporter, media transcoding in Swift
- GPUImage3, image processing library
- SCRecorder, obj-c capture library
- PBJVision, obj-c capture library
- iOS Device Camera Summary
- AV Foundation Programming Guide
- AV Foundation Framework Reference
- ARKit Framework Reference
- Swift Evolution
- objc.io Camera and Photos
- objc.io Video
- objc.io Core Image and Video
- Cameras, ecommerce and machine learning
- Again, iPhone is the default camera
NextLevel is available under the MIT license; see the LICENSE file for more information.