{"id":15030217,"url":"https://github.com/nextlevel/nextlevel","last_synced_at":"2025-05-14T21:07:52.446Z","repository":{"id":29296128,"uuid":"32829086","full_name":"NextLevel/NextLevel","owner":"NextLevel","description":"⬆️ Media Capture in Swift","archived":false,"fork":false,"pushed_at":"2024-08-12T16:37:50.000Z","size":5271,"stargazers_count":2205,"open_issues_count":76,"forks_count":271,"subscribers_count":63,"default_branch":"main","last_synced_at":"2024-10-29T15:34:10.664Z","etag":null,"topics":["ar","arkit","augmented-reality","avfoundation","camera","capture","coreimage","custom","instagram","ios","media","mixed-reality","nextlevel","photography","snapchat","swift","tiktok","video","vision"],"latest_commit_sha":null,"homepage":"http://nextlevel.engineering","language":"Swift","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/NextLevel.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2015-03-24T22:34:27.000Z","updated_at":"2024-10-22T13:46:25.000Z","dependencies_parsed_at":"2024-11-19T14:01:04.445Z","dependency_job_id":null,"html_url":"https://github.com/NextLevel/NextLevel","commit_stats":{"total_commits":736,"total_committers":29,"mean_commits":"25.379310344827587","dds":0.4035326086956522,"last_synced_commit":"ff3ebc46ad05fa3628daba0393ce1d1171f8c4ea"},"previous_names":[],"tags_count":76,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NextLevel%2FNextLevel","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NextLevel%2FNextLevel/tags"
,"releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NextLevel%2FNextLevel/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NextLevel%2FNextLevel/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/NextLevel","download_url":"https://codeload.github.com/NextLevel/NextLevel/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248202842,"owners_count":21064445,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ar","arkit","augmented-reality","avfoundation","camera","capture","coreimage","custom","instagram","ios","media","mixed-reality","nextlevel","photography","snapchat","swift","tiktok","video","vision"],"created_at":"2024-09-24T20:12:46.567Z","updated_at":"2025-04-10T10:50:13.110Z","avatar_url":"https://github.com/NextLevel.png","language":"Swift","readme":"\u003cp\u003e\u003cimg src=\"https://raw.github.com/NextLevel/NextLevel/master/NextLevel%402x.png\" alt=\"Next Level\" style=\"max-width:100%;\"\u003e\u003c/p\u003e\n\n`NextLevel` is a [Swift](https://developer.apple.com/swift/) camera system designed for easy integration, customized media capture, and image streaming in iOS. 
Integration can optionally leverage `AVFoundation` or `ARKit`.\n\n[![Build Status](https://travis-ci.org/NextLevel/NextLevel.svg?branch=master)](https://travis-ci.org/NextLevel/NextLevel) [![Pod Version](https://img.shields.io/cocoapods/v/NextLevel.svg?style=flat)](http://cocoadocs.org/docsets/NextLevel/) [![Swift Version](https://img.shields.io/badge/language-swift%205.0-brightgreen.svg)](https://developer.apple.com/swift) [![GitHub license](https://img.shields.io/badge/license-MIT-lightgrey.svg)](https://github.com/NextLevel/NextLevel/blob/master/LICENSE)\n\n|  | Features |\n|:---------:|:---------------------------------------------------------------|\n| \u0026#127916; | “[Vine](http://vine.co)-like” video clip recording and editing |\n| \u0026#128444; | photo capture (RAW, JPEG, and video frame) |\n| \u0026#128070; | customizable gestural interaction and interface |\n| \u0026#128160; | [ARKit integration](https://developer.apple.com/arkit/) (beta) |\n| \u0026#128247; | dual, wide angle, telephoto, \u0026 true depth support |\n| \u0026#128034; | adjustable frame rate on supported hardware (i.e., fast and slow motion capture) |\n| \u0026#127906; | depth data capture support \u0026 portrait effects matte support |\n| \u0026#128269; | video zoom |\n| \u0026#9878; | white balance, focus, and exposure adjustment |\n| \u0026#128294; | flash and torch support |\n| \u0026#128111; | mirroring support |\n| \u0026#9728; | low light boost |\n| \u0026#128374; | smooth auto-focus |\n| \u0026#9881; | configurable encoding and compression settings |\n| \u0026#128736; | simple media capture and editing API |\n| \u0026#127744; | extensible API for image processing and CV |\n| \u0026#128008; | animated GIF creator |\n| \u0026#128526; | face, QR code, and barcode recognition |\n| \u0026#128038; | [Swift 5](https://developer.apple.com/swift/) |\n\nNeed a different version of Swift?\n* `5.0` - Target your Podfile to the latest release or master\n* `4.2` - Target your Podfile to the 
`swift4.2` branch\n\n## Quick Start\n\n```ruby\n\n# CocoaPods\npod \"NextLevel\", \"~\u003e 0.16.3\"\n\n# Carthage\ngithub \"nextlevel/NextLevel\" ~\u003e 0.16.3\n\n# Swift PM\nlet package = Package(\n    dependencies: [\n        .package(url: \"https://github.com/NextLevel/NextLevel\", from: \"0.16.3\")\n    ]\n)\n\n```\n\nAlternatively, drop the NextLevel [source files](https://github.com/NextLevel/NextLevel/tree/master/Sources) or project file into your Xcode project.\n\n## Important Configuration Note for ARKit and True Depth\n\nARKit and the True Depth Camera software features are enabled with the inclusion of the Swift compiler flags `USE_ARKIT` and `USE_TRUE_DEPTH`, respectively.\n\nApple will [reject](https://github.com/NextLevel/NextLevel/issues/106) apps that link against ARKit or the True Depth Camera API and do not use them.\n\nIf you use CocoaPods, you can include `-D USE_ARKIT` or `-D USE_TRUE_DEPTH` with the following `Podfile` addition or by adding it to your Xcode build settings.\n\n```ruby\n  installer.pods_project.targets.each do |target|\n    # setup NextLevel for ARKit use\n    if target.name == 'NextLevel'\n      target.build_configurations.each do |config|\n        config.build_settings['OTHER_SWIFT_FLAGS'] = ['$(inherited)', '-DUSE_ARKIT']\n      end\n    end\n  end\n```\n\n## Overview\n\nBefore starting, ensure that permission keys have been added to your app's `Info.plist`.\n\n```xml\n\u003ckey\u003eNSCameraUsageDescription\u003c/key\u003e\n    \u003cstring\u003eAllowing access to the camera lets you take photos and videos.\u003c/string\u003e\n\u003ckey\u003eNSMicrophoneUsageDescription\u003c/key\u003e\n    \u003cstring\u003eAllowing access to the microphone lets you record audio.\u003c/string\u003e\n```\n\n### Recording Video Clips\n\nImport the library.\n\n```swift\nimport NextLevel\n```\n\nSet up the camera preview.\n\n```swift\nlet screenBounds = UIScreen.main.bounds\nself.previewView = UIView(frame: screenBounds)\nif let previewView = 
self.previewView {\n    previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]\n    previewView.backgroundColor = UIColor.black\n    NextLevel.shared.previewLayer.frame = previewView.bounds\n    previewView.layer.addSublayer(NextLevel.shared.previewLayer)\n    self.view.addSubview(previewView)\n}\n```\n\nConfigure the capture session.\n\n```swift\noverride func viewDidLoad() {\n    super.viewDidLoad()\n\n    NextLevel.shared.delegate = self\n    NextLevel.shared.deviceDelegate = self\n    NextLevel.shared.videoDelegate = self\n    NextLevel.shared.photoDelegate = self\n\n    // modify .videoConfiguration, .audioConfiguration, .photoConfiguration properties\n    // Compression, resolution, and maximum recording time options are available\n    NextLevel.shared.videoConfiguration.maximumCaptureDuration = CMTimeMakeWithSeconds(5, preferredTimescale: 600)\n    NextLevel.shared.audioConfiguration.bitRate = 44000\n}\n```\n\nStart and stop the session when appropriate. Calling these methods creates a new session instance at `NextLevel.shared.session`.\n\n```swift\noverride func viewWillAppear(_ animated: Bool) {\n    super.viewWillAppear(animated)\n    NextLevel.shared.start()\n    // …\n}\n```\n\n```swift\noverride func viewWillDisappear(_ animated: Bool) {\n    super.viewWillDisappear(animated)\n    NextLevel.shared.stop()\n    // …\n}\n```\n\nRecord and pause video.\n\n```swift\n// record\nNextLevel.shared.record()\n\n// pause\nNextLevel.shared.pause()\n```\n\n### Editing Recorded Clips\n\nEdit and finalize the recorded session.\n\n```swift\nif let session = NextLevel.shared.session {\n\n    //..\n\n    // undo\n    session.removeLastClip()\n\n    // various editing operations can be done using the NextLevelSession methods\n\n    // export\n    session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality, completionHandler: { (url: URL?, error: Error?) 
in\n        if let _ = url {\n            //\n        } else if let _ = error {\n            //\n        }\n    })\n\n    //..\n\n}\n```\n\nVideos can also be processed using the [NextLevelSessionExporter](https://github.com/NextLevel/NextLevelSessionExporter), a media transcoding library in Swift.\n\n## Custom Buffer Rendering\n\n`NextLevel` was designed for sample buffer analysis and custom modification in real time, alongside a rich set of camera features.\n\nNote that modifications performed on a buffer and provided back to NextLevel may affect frame rate.\n\nEnable custom rendering.\n\n```swift\nNextLevel.shared.isVideoCustomContextRenderingEnabled = true\n```\n\nAn optional hook allows reading `sampleBuffer` for analysis.\n\n```swift\nextension CameraViewController: NextLevelVideoDelegate {\n\n    // ...\n\n    // video frame processing\n    public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {\n        // Use the sampleBuffer parameter in your system for continual analysis\n    }\n}\n```\n\nAnother optional hook allows reading buffers for modification, via `imageBuffer`. This is also the recommended place to provide the buffer back to NextLevel for recording.\n\n```swift\nextension CameraViewController: NextLevelVideoDelegate {\n\n    // ...\n\n    // enabled by isVideoCustomContextRenderingEnabled\n    public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {\n        // provide the frame back to NextLevel for recording\n        if let frame = self._availableFrameBuffer {\n            nextLevel.videoCustomContextImageBuffer = frame\n        }\n    }\n}\n```\n\nNextLevel will check this property when writing buffers to a destination file. 
This works for both video and photos with `capturePhotoFromVideo`.\n\n```swift\nnextLevel.videoCustomContextImageBuffer = modifiedFrame\n```\n\n## About\n\nNextLevel was initially a weekend project that has since grown into an open community of camera platform enthusiasts. The software provides foundational components for managing media recording, camera interface customization, gestural interaction customization, and image streaming on iOS. The same capabilities can also be found in apps such as [Snapchat](http://snapchat.com), [Instagram](http://instagram.com), and [Vine](http://vine.co).\n\nThe goal is to continue to provide a good foundation for quick integration (enabling projects to be taken to the next level) – allowing focus to be placed on the functionality that matters most, whether that's real-time image processing, computer vision methods, augmented reality, or [computational photography](https://om.co/2018/07/23/even-leica-loves-computational-photography/).\n\n## ARKit\n\nNextLevel provides components for capturing ARKit video and photos. This enables a variety of new camera features while leveraging the existing recording capabilities and media management of NextLevel.\n\nIf you are trying to capture frames from SceneKit for ARKit recording, check out the [examples](https://github.com/NextLevel/examples) project.\n\n## Documentation\n\nYou can find [the docs here](https://nextlevel.github.io/NextLevel). Documentation is generated with [jazzy](https://github.com/realm/jazzy) and hosted on [GitHub Pages](https://pages.github.com).\n\n### Stickers\n\nIf you found this project to be helpful, check out the [Next Level stickers](https://www.stickermule.com/en/user/1070732101/stickers).\n\n### Project\n\nNextLevel is a community – contributions and discussions are welcome!\n\n- Feature idea? Open an [issue](https://github.com/nextlevel/NextLevel/issues).\n- Found a bug? Open an [issue](https://github.com/nextlevel/NextLevel/issues).\n- Need help? 
Use [Stack Overflow](http://stackoverflow.com/questions/tagged/nextlevel) with the tag 'nextlevel'.\n- Questions? Use [Stack Overflow](http://stackoverflow.com/questions/tagged/nextlevel) with the tag 'nextlevel'.\n- Want to contribute? Submit a pull request.\n\n### Related Projects\n\n- [Player (Swift)](https://github.com/piemonte/player), video player in Swift\n- [PBJVideoPlayer (obj-c)](https://github.com/piemonte/PBJVideoPlayer), video player in obj-c\n- [NextLevelSessionExporter](https://github.com/NextLevel/NextLevelSessionExporter), media transcoding in Swift\n- [GPUImage3](https://github.com/BradLarson/GPUImage3), image processing library\n- [SCRecorder](https://github.com/rFlex/SCRecorder), obj-c capture library\n- [PBJVision](https://github.com/piemonte/PBJVision), obj-c capture library\n\n## Resources\n\n* [iOS Device Camera Summary](https://developer.apple.com/library/archive/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Cameras/Cameras.html)\n* [AV Foundation Programming Guide](https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html)\n* [AV Foundation Framework Reference](https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVFoundationFramework/)\n* [ARKit Framework Reference](https://developer.apple.com/documentation/arkit)\n* [Swift Evolution](https://github.com/apple/swift-evolution)\n* [objc.io Camera and Photos](http://www.objc.io/issue-21/)\n* [objc.io Video](http://www.objc.io/issue-23/)\n* [objc.io Core Image and Video](https://www.objc.io/issues/23-video/core-image-video/)\n* [Cameras, ecommerce and machine learning](http://ben-evans.com/benedictevans/2016/11/20/ku6omictaredoge4cao9cytspbz4jt)\n* [Again, iPhone is the default camera](http://om.co/2016/12/07/again-iphone-is-the-default-camera/)\n\n## License\n\nNextLevel is available under the MIT license; see the [LICENSE](https://github.com/NextLevel/NextLevel/blob/master/LICENSE) file 
for more information.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnextlevel%2Fnextlevel","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fnextlevel%2Fnextlevel","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnextlevel%2Fnextlevel/lists"}