{"id":1206,"url":"https://github.com/rFlex/SCRecorder","last_synced_at":"2025-07-30T20:32:45.987Z","repository":{"id":9899310,"uuid":"11906109","full_name":"rFlex/SCRecorder","owner":"rFlex","description":"iOS camera engine with Vine-like tap to record, animatable filters, slow motion, segments editing","archived":false,"fork":false,"pushed_at":"2021-05-25T11:29:54.000Z","size":4191,"stargazers_count":3062,"open_issues_count":235,"forks_count":576,"subscribers_count":136,"default_branch":"master","last_synced_at":"2025-07-20T11:47:21.592Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"Objective-C","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/rFlex.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2013-08-05T18:55:14.000Z","updated_at":"2025-07-13T18:05:16.000Z","dependencies_parsed_at":"2022-07-13T11:20:29.177Z","dependency_job_id":null,"html_url":"https://github.com/rFlex/SCRecorder","commit_stats":null,"previous_names":[],"tags_count":45,"template":false,"template_full_name":null,"purl":"pkg:github/rFlex/SCRecorder","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rFlex%2FSCRecorder","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rFlex%2FSCRecorder/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rFlex%2FSCRecorder/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rFlex%2FSCRecorder/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/rFlex","download_url":"https://codeload.github.com/rFlex/SCRecorder/tar.gz/refs/heads/master","sbom_url":"http
s://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/rFlex%2FSCRecorder/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":267552094,"owners_count":24105998,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-07-28T02:00:09.689Z","response_time":68,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-01-05T20:15:41.253Z","updated_at":"2025-07-30T20:32:44.917Z","avatar_url":"https://github.com/rFlex.png","language":"Objective-C","readme":"SCRecorder\n===============\n\n\u003cimg src=\"filters.gif\" width=\"230\" height=\"408\" /\u003e\n\u003cimg src=\"screenshot_2.png\" width=\"230\" height=\"408\" /\u003e\n\u003cimg src=\"animated_filters.gif\" width=\"230\" height=\"408\" /\u003e\n\nA Vine/Instagram-like audio/video recorder and filter framework in Objective-C.\n\nHere is a short list of the cool things you can do:\n- Record multiple video segments\n- Zoom/Focus easily\n- Remove any record segment that you don't want\n- Display the result in a convenient video player\n- Save the record session for later using a serializable NSDictionary (works in NSUserDefaults)\n- Add a configurable and animatable video filter using Core Image\n- Add a UIView as an overlay, so you can render anything you want on top of your video\n- Merge and export the video with fine-tuned settings that you choose\n\nExamples for iOS are provided.\n\nWant 
something easy to create your filters in this project? Check out https://github.com/rFlex/CoreImageShop\n\nFrameworks needed:\n- CoreVideo\n- AudioToolbox\n- GLKit\n\nPodfile\n----------------\n\nIf you are using CocoaPods, you can use this project with the following Podfile:\n\n```ruby\nplatform :ios, '7.0'\npod 'SCRecorder'\n```\n\nManual install\n----------------\n\nDrag and drop the [SCRecorder.xcodeproject](Library/SCRecorder.xcodeproject) into your project. Then add the libSCRecorder.a dependency in the Build Phases \"Link Binary with Libraries\" section (as done in the example).\n\nSwift\n---------------\n\nTo use the project in Swift, follow either the Podfile or Manual install instructions (they both work with Swift too). Then, to make SCRecorder accessible from Swift, just add the following line to your bridging header:\n```objective-c\n#import \u003cSCRecorder/SCRecorder.h\u003e\n```\n\nEasy and quick\n----------------\n\n[SCRecorder](Library/Sources/SCRecorder.h) is the main class that connects the inputs and outputs together. It processes the audio and video buffers and appends them to an [SCRecordSession](Library/Sources/SCRecordSession.h).\n\n```objective-c\n// Create the recorder\nSCRecorder *recorder = [SCRecorder recorder]; // You can also use +[SCRecorder sharedRecorder]\n\t\n// Start running the flow of buffers\nif (![recorder startRunning]) {\n\tNSLog(@\"Something wrong there: %@\", recorder.error);\n}\n\n// Create a new session and set it on the recorder\nrecorder.session = [SCRecordSession recordSession];\n\n// Begin appending video/audio buffers to the session\n[recorder record];\n\n// Stop appending video/audio buffers to the session\n[recorder pause];\n```\n\nConfiguring the recorder\n--------------------\n\nYou can configure the input device settings (framerate of the video, whether the flash should be enabled, etc.) 
directly on the SCRecorder.\n\n```objective-c\n// Set the AVCaptureSessionPreset for the underlying AVCaptureSession.\nrecorder.captureSessionPreset = AVCaptureSessionPresetHigh;\n\n// Set the video device to use\nrecorder.device = AVCaptureDevicePositionFront;\n\n// Set the maximum record duration\nrecorder.maxRecordDuration = CMTimeMake(10, 1);\n\n// Listen to the messages SCRecorder can send\nrecorder.delegate = self;\n```\n\nYou can configure the video, audio and photo output settings in their configuration instances ([SCVideoConfiguration](Library/Sources/SCVideoConfiguration.h), [SCAudioConfiguration](Library/Sources/SCAudioConfiguration.h), [SCPhotoConfiguration](Library/Sources/SCPhotoConfiguration.h)), which you can access like this:\n```objective-c\n\n// Get the video configuration object\nSCVideoConfiguration *video = recorder.videoConfiguration;\n\n// Whether the video should be enabled or not\nvideo.enabled = YES;\n// The bitrate of the video\nvideo.bitrate = 2000000; // 2Mbit/s\n// Size of the video output\nvideo.size = CGSizeMake(1280, 720);\n// Scaling mode to use if the input aspect ratio is different from the output one\nvideo.scalingMode = AVVideoScalingModeResizeAspectFill;\n// The timescale ratio to use. 
A value higher than 1 produces slow motion; a value between 0 and 1 produces a timelapse effect\nvideo.timeScale = 1;\n// Whether the output video size should be inferred so it creates a square video\nvideo.sizeAsSquare = NO;\n// The filter to apply to each output video buffer (this does not affect the presentation layer)\nvideo.filter = [SCFilter filterWithCIFilterName:@\"CIPhotoEffectInstant\"];\n\n// Get the audio configuration object\nSCAudioConfiguration *audio = recorder.audioConfiguration;\n\n// Whether the audio should be enabled or not\naudio.enabled = YES;\n// The bitrate of the audio output\naudio.bitrate = 128000; // 128kbit/s\n// Number of audio output channels\naudio.channelsCount = 1; // Mono output\n// The sample rate of the audio output\naudio.sampleRate = 0; // Use the same sample rate as the input\n// The format of the audio output\naudio.format = kAudioFormatMPEG4AAC; // AAC\n\n// Get the photo configuration object\nSCPhotoConfiguration *photo = recorder.photoConfiguration;\nphoto.enabled = NO;\n```\n\nPlaying back your recording\n----------------\n\nSCRecorder provides two easy classes to play a video/audio asset: [SCPlayer](Library/Sources/SCPlayer.h) and [SCVideoPlayerView](Library/Sources/SCVideoPlayerView.h).\n\nSCPlayer is a subclass of AVPlayer that adds some methods to make it easier to use. It also adds the ability to use a filter renderer to apply a live filter on a video.\n\n```objective-c\nSCRecordSession *recordSession = ... // Some instance of a record session\n\t\n// Create an instance of SCPlayer\nSCPlayer *player = [SCPlayer player];\n\t\n// Set the current playerItem using an asset representing the segments\n// of an SCRecordSession\n[player setItemByAsset:recordSession.assetRepresentingSegments];\n\t\nUIView *view = ... 
// Some view that will get the video\n\t\n// Create and add an AVPlayerLayer\nAVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];\nplayerLayer.frame = view.bounds;\n[view.layer addSublayer:playerLayer];\n\n// Start playing the asset and render it into the view\n[player play];\n\t\n// Render the video directly through a filter\nSCFilterImageView *filterView = [[SCFilterImageView alloc] initWithFrame:view.bounds];\nfilterView.filter = [SCFilter filterWithCIFilterName:@\"CIPhotoEffectInstant\"];\n\t\nplayer.SCImageView = filterView;\n\t\n[view addSubview:filterView];\n```\n\nSCVideoPlayerView is a subclass of UIView that holds an SCPlayer. The video buffers are rendered directly in this view. It removes the need to handle the creation of an AVPlayerLayer and makes it really easy to play a video in your app.\n\n```objective-c\nSCRecordSession *recordSession = ... // Some instance of a record session\n\t\nSCVideoPlayerView *playerView = ... // Your instance somewhere\n\t\n// Set the current playerItem using an asset representing the segments\n// of an SCRecordSession\n[playerView.player setItemByAsset:recordSession.assetRepresentingSegments];\n\t\n// Start playing the asset and render it into the view\n[playerView.player play];\n```\n\nEditing your recording\n--------------------\n\n[SCRecordSession](Library/Sources/SCRecordSession.h) gets the video and audio buffers from the SCRecorder and appends them into an [SCRecordSessionSegment](Library/Sources/SCRecordSessionSegment.h). An SCRecordSessionSegment is just a continuous file, really. When calling [SCRecorder pause], the SCRecorder asks the SCRecordSession to asynchronously complete its current record segment. Once done, the segment will be added to the [SCRecordSession segments] array. SCRecorder also has [SCRecorder pause:] with a completion handler. 
With this method, the completion handler will be called once the SCRecordSession has completed and added the record segment to the segments array.\n\nYou can add/remove segments easily in an SCRecordSession. You can also merge all the segments into one file.\n\n```objective-c\nSCRecordSession *recordSession = ... // An SCRecordSession instance\n\n// Remove the last segment\n[recordSession removeLastSegment];\n\n// Add a segment at the end\n[recordSession addSegment:[SCRecordSessionSegment segmentWithURL:anURL info:nil]];\n\n// Get duration of the whole record session\nCMTime duration = recordSession.duration;\n\n// Get a playable asset representing all the record segments\nAVAsset *asset = recordSession.assetRepresentingSegments;\n\n// Get some information about a particular segment\nSCRecordSessionSegment *segment = [recordSession.segments firstObject];\n\n// Get thumbnail of this segment\nUIImage *thumbnail = segment.thumbnail;\n\n// Get duration of this segment\nCMTime segmentDuration = segment.duration;\n\n```\n\nExporting your recording\n---------------------\n\nYou basically have two ways of exporting an SCRecordSession.\n\nFirst, you can use [SCRecordSession mergeSegmentsUsingPreset:completionHandler:]. This method takes an AVAssetExportPreset as a parameter and uses an AVAssetExportSession under the hood. 
Although this is the fastest and easiest way of merging the record segments, it provides no control over the output settings.\n\n```objective-c\n\n// Merge all the segments into one file using an AVAssetExportSession\n[recordSession mergeSegmentsUsingPreset:AVAssetExportPresetHighestQuality completionHandler:^(NSURL *url, NSError *error) {\n\tif (error == nil) {\n\t\t// Easily save to camera roll\n\t\t[url saveToCameraRollWithCompletion:^(NSString *path, NSError *saveError) {\n\t\t\t\n\t\t}];\n\t} else {\n\t\tNSLog(@\"Bad things happened: %@\", error);\n\t}\n}];\n```\n\nYou can also use [SCAssetExportSession](Library/Sources/SCAssetExportSession.h), which is the SCRecorder counterpart of AVAssetExportSession. This provides a lot more options, like configuring the bitrate, the output video size, adding a filter, adding a watermark... This comes at the cost of a little more configuration and more processing time. Like SCRecorder, SCAssetExportSession also holds an SCVideoConfiguration and SCAudioConfiguration instance (ain't that amazing?).\n\n```objective-c\n\nAVAsset *asset = recordSession.assetRepresentingSegments;\nSCAssetExportSession *assetExportSession = [[SCAssetExportSession alloc] initWithAsset:asset];\nassetExportSession.outputUrl = recordSession.outputUrl;\nassetExportSession.outputFileType = AVFileTypeMPEG4;\nassetExportSession.videoConfiguration.filter = [SCFilter filterWithCIFilterName:@\"CIPhotoEffectInstant\"];\nassetExportSession.videoConfiguration.preset = SCPresetHighestQuality;\nassetExportSession.audioConfiguration.preset = SCPresetMediumQuality;\n[assetExportSession exportAsynchronouslyWithCompletionHandler:^{\n\tif (assetExportSession.error == nil) {\n\t\t// We have our video and/or audio file\n\t} else {\n\t\t// Something bad happened\n\t}\n}];\n\n```\n\nCreating/manipulating filters\n---------------------\n\nSCRecorder comes with a filter API built on top of Core Image. 
[SCFilter](Library/Sources/SCFilter.h) is the class that wraps a CIFilter. Each filter can also have a chain of sub filters. When processing an image through a filter, all its sub filters process the image first, then the filter itself. An SCFilter can be saved directly into a file and restored from this file.\n\n```objective-c\n\nSCFilter *blackAndWhite = [SCFilter filterWithCIFilterName:@\"CIColorControls\"];\n[blackAndWhite setParameterValue:@0 forKey:@\"inputSaturation\"];\n\nSCFilter *exposure = [SCFilter filterWithCIFilterName:@\"CIExposureAdjust\"];\n[exposure setParameterValue:@0.7 forKey:@\"inputEV\"];\n\n// Manually creating a filter chain\nSCFilter *filter = [SCFilter emptyFilter];\n[filter addSubFilter:blackAndWhite];\n[filter addSubFilter:exposure];\n\nSCVideoConfiguration *videoConfiguration = ... // A video configuration\n\nvideoConfiguration.filter = blackAndWhite; // Will render a black and white video\nvideoConfiguration.filter = exposure; // Will render a video with less exposure\nvideoConfiguration.filter = filter; // Will render a video with both black and white and less exposure\n\n// Saving to a file\nNSError *error = nil;\n[filter writeToFile:[NSURL fileURLWithPath:@\"some-url.cisf\"] error:\u0026error];\nif (error == nil) {\n\t// Saved successfully\n}\n\n// Restoring the filter group\nSCFilter *restoredFilter = [SCFilter filterWithContentsOfURL:[NSURL fileURLWithPath:@\"some-url.cisf\"]];\n\n// Processing a UIImage through the filter\nUIImage *myImage = ... 
// Some image\nUIImage *processedImage = [restoredFilter UIImageByProcessingUIImage:myImage];\n\n// Save it to the photo library\n[processedImage saveToCameraRollWithCompletion:^(NSError *error) {\n\n}];\n```\n\nIf you want to create your own filters easily, you can also check out [CoreImageShop](https://github.com/rFlex/CoreImageShop), a Mac application that generates serialized SCFilters directly usable by the filter classes in this project.\n\nUsing the filters\n---------------------\n\nSCFilter can be used either in a view to render a filtered image in real time, or in a processing object to render the filter to a file. You can use an SCFilter in one of the following classes:\n\n- [SCVideoConfiguration](Library/Sources/SCVideoConfiguration.h) (processing)\n- [SCImageView](Library/Sources/SCImageView.h) (live rendering)\n- [SCSwipeableFilterView](Library/Sources/SCSwipeableFilterView.h) (live rendering)\n\nAnimating the filters\n----------------------\n\nParameters of SCFilter can be animated. You can, for instance, progressively blur your video. To do so, you need to add an animation within an SCFilter. 
Animations are represented as SCFilterAnimation, a model object that describes a ramp from a start value to an end value, applied at a given start time for a given duration.\n\nSome examples:\n\n```objective-c\n// Fade from completely blurred to sharp at the beginning of the video\nSCFilter *blurFadeFilter = [SCFilter filterWithCIFilterName:@\"CIGaussianBlur\"];\n[blurFadeFilter addAnimationForParameterKey:kCIInputRadiusKey startValue:@100 endValue:@0 startTime:0 duration:0.5];\n\n// Make the video instantly become black and white at 2 seconds for 1 second\nSCFilter *blackAndWhite = [SCFilter filterWithCIFilterName:@\"CIColorControls\"];\n[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@1 endValue:@1 startTime:0 duration:2];\n[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@0 endValue:@0 startTime:2 duration:1];\n[blackAndWhite addAnimationForParameterKey:kCIInputSaturationKey startValue:@1 endValue:@1 startTime:3 duration:1];\n```\n\nSome details about the other provided classes\n---------------------\n\n#### [SCRecorderToolsView](Library/Sources/SCRecorderToolsView.h)\n\nConfigurable view that can hold an SCRecorder instance and handles tap to focus and pinch to zoom.\n\n#### [SCImageView](Library/Sources/SCImageView.h)\n\nClass that can render a CIImage through either EAGL, Metal or CoreGraphics. This class is intended for live rendering of CIImages. If you want to alter the rendering when subclassing, you can override renderedCIImageInRect:.\n\n#### [SCFilterImageView](Library/Sources/SCImageView.h)\n\nA subclass of SCImageView that can have a filter. It renders the input CIImage using the SCFilter, if there is any.\n\n#### [SCSwipeableFilterView](Library/Sources/SCSwipeableFilterView.h)\n\nA subclass of SCImageView that has a scroll view and a list of SCFilters. It lets the user scroll between the filters to choose one. 
The selected filter can be retrieved using -[SCSwipeableFilterView selectedFilter]. This basically works the same as the Snapchat composition page.\n\n#### [SCPlayer](Library/Sources/SCPlayer.h)\n\nPlayer based on the Apple AVPlayer. It adds some convenience methods and the possibility to have a CIImageRenderer that will be used to render the video image buffers. You can combine this class with a CIImageRenderer to render a live filter on a video.\n\n#### [SCVideoPlayerView](Library/Sources/SCVideoPlayerView.h)\n\nA view that renders an SCPlayer easily. It supports tap to play/pause. By default, it holds an SCPlayer instance itself and shares the same lifecycle as that SCPlayer. You can disable this feature by calling +[SCVideoPlayerView setAutoCreatePlayerWhenNeeded:NO].\n","funding_links":[],"categories":["Hardware","Media","Objective-C repos ranked within the top 1000 by stars","etc"],"sub_categories":["Camera","Other free courses"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FrFlex%2FSCRecorder","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FrFlex%2FSCRecorder","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FrFlex%2FSCRecorder/lists"}