{"id":13496716,"url":"https://github.com/dmrschmidt/DSWaveformImage","last_synced_at":"2025-03-28T19:30:52.975Z","repository":{"id":11633183,"uuid":"14134952","full_name":"dmrschmidt/DSWaveformImage","owner":"dmrschmidt","description":"Generate waveform images from audio files on iOS, macOS \u0026 visionOS in Swift. Native SwiftUI \u0026 UIKit views.","archived":false,"fork":false,"pushed_at":"2024-10-10T06:54:54.000Z","size":13927,"stargazers_count":1110,"open_issues_count":3,"forks_count":117,"subscribers_count":17,"default_branch":"main","last_synced_at":"2025-03-28T08:01:43.486Z","etag":null,"topics":["audio-analysis","audio-files","audio-visualizer","catalyst","fft","fourier-transform","ios","ipad","ipados","mac-catalyst","macos","macosx","swift","swiftui","vision-os","visionos","waveform","waveform-images"],"latest_commit_sha":null,"homepage":"","language":"Swift","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/dmrschmidt.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null},"funding":{"github":["dmrschmidt"],"custom":["https://www.buymeacoffee.com/dmrschmidt"]}},"created_at":"2013-11-05T07:31:09.000Z","updated_at":"2025-03-26T04:16:26.000Z","dependencies_parsed_at":"2023-10-21T19:27:38.874Z","dependency_job_id":"e5b28288-4203-4f9c-b7d1-806ae4539eca","html_url":"https://github.com/dmrschmidt/DSWaveformImage","commit_stats":{"total_commits":295,"total_committers":16,"mean_commits":18.4375,"dds":"0.23389830508474574","last_synced_commit":"5f1ce68474df5a4ab055dfd0df5d2da810eaec7c"},"previous_names":[],"tags_count":57,"template":fals
e,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dmrschmidt%2FDSWaveformImage","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dmrschmidt%2FDSWaveformImage/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dmrschmidt%2FDSWaveformImage/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dmrschmidt%2FDSWaveformImage/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/dmrschmidt","download_url":"https://codeload.github.com/dmrschmidt/DSWaveformImage/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246088391,"owners_count":20721678,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["audio-analysis","audio-files","audio-visualizer","catalyst","fft","fourier-transform","ios","ipad","ipados","mac-catalyst","macos","macosx","swift","swiftui","vision-os","visionos","waveform","waveform-images"],"created_at":"2024-07-31T19:01:57.763Z","updated_at":"2025-03-28T19:30:52.596Z","avatar_url":"https://github.com/dmrschmidt.png","language":"Swift","readme":"DSWaveformImage - iOS, macOS \u0026 visionOS realtime audio waveform rendering\n===============\n[![Swift Package Manager compatible](https://img.shields.io/badge/spm-compatible-brightgreen.svg?style=flat)](https://swift.org/package-manager)\n\nDSWaveformImage offers native interfaces for drawing the envelope waveform of audio data \nin **iOS**, **iPadOS**, **macOS**, **visionOS** or via Catalyst. 
To do so, you can use\n\n* [`WaveformImageView`](Sources/DSWaveformImageViews/UIKit/WaveformImageView.swift) (UIKit) / [`WaveformView`](Sources/DSWaveformImageViews/SwiftUI/WaveformView.swift) (SwiftUI) to render a static waveform from an audio file or \n* [`WaveformLiveView`](Sources/DSWaveformImageViews/UIKit/WaveformLiveView.swift) (UIKit) / [`WaveformLiveCanvas`](Sources/DSWaveformImageViews/SwiftUI/WaveformLiveCanvas.swift) (SwiftUI) to render a waveform of live audio data in realtime (e.g. from `AVAudioRecorder`)\n* `WaveformImageDrawer` to generate a waveform `UIImage` from an audio file\n\nAdditionally, you can get a waveform's (normalized) `[Float]` samples directly as well by\ncreating an instance of `WaveformAnalyzer`.\n\nExample UI (included in repository)\n------------\n\nFor a practical, real-world example of SwiftUI live audio recording waveform rendering, see [RecordingIndicatorView](Example/DSWaveformImageExample-iOS/SwiftUIExample/SwiftUIExampleView.swift).\n\n\n\u003cimg src=\"./Promotion/recorder-example.png\" alt=\"Audio Recorder Example\" width=\"358\"\u003e\n\nMore related iOS Controls\n------------\n\nYou may also find the following iOS controls written in Swift interesting:\n\n* [SwiftColorWheel](https://github.com/dmrschmidt/SwiftColorWheel) - a delightful color picker\n* [QRCode](https://github.com/dmrschmidt/QRCode) - a customizable QR code generator\n\nIf you really like this library (aka Sponsoring)\n------------\nI'm doing all this for fun and joy and because I strongly believe in the power of open source. On the off-chance, though, that using my library has brought joy to you and you just feel like saying \"thank you\", I would smile like a 4-year-old getting a huge ice cream cone if you'd support me via one of the sponsoring buttons ☺️💕\n\nAlternatively, consider supporting me by downloading one of my side project iOS apps. 
If you're feeling in the mood to send someone else a lovely gesture of appreciation, maybe check out my iOS app [💌 SoundCard](https://www.soundcard.io) to send them a real postcard with a personal audio message. Or download my ad-supported free-to-play game [🕹️ Snekris for iOS](https://apps.apple.com/us/app/snekris-play-like-its-1999/id6446217693).\n\n\u003cp float=\"left\"\u003e\n  \u003ca href=\"https://www.buymeacoffee.com/dmrschmidt\" target=\"_blank\"\u003e\n    \u003cimg src=\"https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png\" alt=\"Buy Me A Coffee\" width=\"217\" height=\"60\"\u003e\u003c/a\u003e\n  \n  \u003ca href=\"https://www.snekris.com\" target=\"_blank\"\u003e\n    \u003cimg src=\"http://snekris.com/images/snekris-banner.png\" alt=\"Play Snekris\" width=\"217\" height=\"60\"\u003e\u003c/a\u003e\n\u003c/p\u003e\n\n\nInstallation\n------------\n\n* use SPM: add `https://github.com/dmrschmidt/DSWaveformImage` and set \"Up to Next Major\" with \"14.0.0\"\n\n```swift\nimport DSWaveformImage // for core classes to generate `UIImage` / `NSImage` directly\nimport DSWaveformImageViews // if you want to use the native UIKit / SwiftUI views\n```\n\nUsage\n-----\n\n`DSWaveformImage` provides 3 kinds of tools to use:\n* native SwiftUI views - [SwiftUI example usage code](Example/DSWaveformImageExample-iOS/SwiftUIExample/SwiftUIExampleView.swift)\n* native UIKit views - [UIKit example usage code](Example/DSWaveformImageExample-iOS/ViewController.swift)\n* access to the raw renderers and processors\n\nThe core renderers and processors as well as SwiftUI views natively support iOS \u0026 macOS, using `UIImage` \u0026 `NSImage` respectively.\n\n### SwiftUI\n\n#### `WaveformView` - renders a one-off waveform from an audio file:\n\n```swift\n@State var audioURL = Bundle.main.url(forResource: \"example_sound\", withExtension: \"m4a\")!\nWaveformView(audioURL: audioURL)\n```\n\nDefault styling may be overridden if you have more complex 
requirements:\n\n```swift\n@State var audioURL = Bundle.main.url(forResource: \"example_sound\", withExtension: \"m4a\")!\nWaveformView(audioURL: audioURL) { waveformShape in\n    waveformShape\n        .stroke(LinearGradient(colors: [.red, .green, .orange], startPoint: .zero, endPoint: .topTrailing), lineWidth: 3)\n}\n```\n\nSimilar to [AsyncImage](https://developer.apple.com/documentation/swiftui/asyncimage/init(url:scale:content:placeholder:)), a placeholder can be\nset to show until the load and render operation completes successfully. Thanks to [@alfogrillo](https://github.com/alfogrillo)!\n\n```swift\nWaveformView(audioURL: audioURL) { waveformShape in\n    waveformShape\n        .stroke(LinearGradient(colors: [.red, .green, .orange], startPoint: .zero, endPoint: .topTrailing), lineWidth: 3)\n} placeholder: {\n    ProgressView()\n}\n```\n\n#### `WaveformLiveCanvas` - renders a live waveform from `(0...1)` normalized samples:\n\n```swift\n@StateObject private var audioRecorder: AudioRecorder = AudioRecorder() // just an example\nWaveformLiveCanvas(samples: audioRecorder.samples)\n```\n\n### UIKit\n\n#### `WaveformImageView` - renders a one-off waveform from an audio file:\n\n```swift\nlet audioURL = Bundle.main.url(forResource: \"example_sound\", withExtension: \"m4a\")!\nwaveformImageView = WaveformImageView(frame: CGRect(x: 0, y: 0, width: 500, height: 300))\nwaveformImageView.waveformAudioURL = audioURL\n```\n\n#### `WaveformLiveView` - renders a live waveform from `(0...1)` normalized samples:\n\nFind a full example in the [sample project's RecordingViewController](Example/DSWaveformImageExample-iOS/RecordingViewController.swift).\n\n```swift\nlet waveformView = WaveformLiveView()\n\n// configure and start AVAudioRecorder\nlet recorder = AVAudioRecorder()\nrecorder.isMeteringEnabled = true // required to get current power levels\n\n// after all the other recording (omitted for focus) setup, periodically (every 20ms or so):\nrecorder.updateMeters() 
// gets the current value\nlet currentAmplitude = 1 - pow(10, recorder.averagePower(forChannel: 0) / 20)\nwaveformView.add(sample: currentAmplitude)\n```\n\n### Raw API\n\n#### Configuration\n\n*Note:* Calculations are always performed and returned on a background thread, so make sure to return to the main thread before doing any UI work.\n\nCheck `Waveform.Configuration` in [WaveformImageTypes](./Sources/DSWaveformImage/WaveformImageTypes.swift) for various configuration options.\n\n#### `WaveformImageDrawer` - creates a `UIImage` waveform from an audio file:\n\n```swift\nlet waveformImageDrawer = WaveformImageDrawer()\nlet audioURL = Bundle.main.url(forResource: \"example_sound\", withExtension: \"m4a\")!\nlet image = try await waveformImageDrawer.waveformImage(\n    fromAudioAt: audioURL,\n    with: .init(size: topWaveformView.bounds.size, style: .filled(UIColor.black)),\n    renderer: LinearWaveformRenderer()\n)\n\n// need to jump back to main queue\nDispatchQueue.main.async {\n    self.topWaveformView.image = image\n}\n```\n\n#### `WaveformAnalyzer` - calculates an audio file's waveform samples:\n\n```swift\nlet audioURL = Bundle.main.url(forResource: \"example_sound\", withExtension: \"m4a\")!\nlet waveformAnalyzer = WaveformAnalyzer()\nlet samples = try await waveformAnalyzer.samples(fromAudioAt: audioURL, count: 200)\nprint(\"samples: \\(samples)\")\n```\n\n### Playback Progress Indication\n\nIf you're playing back audio files and would like to indicate the playback progress to your users, you can [find inspiration in the example app](https://github.com/dmrschmidt/DSWaveformImage/blob/main/Example/DSWaveformImageExample-iOS/ProgressViewController.swift). UIKit and [SwiftUI](https://github.com/dmrschmidt/DSWaveformImage/blob/main/Example/DSWaveformImageExample-iOS/SwiftUIExample/ProgressWaveformView.swift) examples are provided.\n\nBoth approaches will result in something like the image below. 
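\n\nThe linked examples are variations on a masking idea: render the waveform twice and clip the highlighted copy to the played fraction. A minimal SwiftUI sketch of that idea (a hedged illustration, not library API: `ProgressWaveform` and `progress` are hypothetical names, and styling details depend on your `Waveform.Configuration`):\n\n```swift\nimport SwiftUI\nimport DSWaveformImageViews\n\n// Sketch: draw the waveform twice and mask the highlighted copy\n// to the current playback progress (a 0...1 fraction).\nstruct ProgressWaveform: View {\n    let audioURL: URL\n    let progress: Double // hypothetical, e.g. currentTime / duration\n\n    var body: some View {\n        ZStack(alignment: .leading) {\n            WaveformView(audioURL: audioURL) // full waveform\n            WaveformView(audioURL: audioURL) // highlighted copy\n                .foregroundColor(.accentColor)\n                .mask(alignment: .leading) {\n                    GeometryReader { geometry in\n                        Rectangle().frame(width: geometry.size.width * progress)\n                    }\n                }\n        }\n    }\n}\n```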
\n\n\u003cdiv align=\"center\"\u003e\n  \u003cimg src=\"./Promotion/progress-example.png\" height=\"200\" alt=\"playback progress waveform\"\u003e\n\u003c/div\u003e\n\n\nThere is currently no plan to integrate this as a first-class citizen into the library itself, as every app will have different design requirements, and `WaveformImageDrawer` as well as `WaveformAnalyzer` are as simple to use as the views themselves, as you can see in the examples.\n\n### Loading remote audio files from URL\n\nFor one example of how to display waveforms for audio files at remote URLs, see https://github.com/dmrschmidt/DSWaveformImage/issues/22.\n\nWhat it looks like\n------------------\n\nWaveforms can be rendered in 2 different ways and 5 different styles each.\n\nBy default [`LinearWaveformRenderer`](https://github.com/dmrschmidt/DSWaveformImage/blob/main/Sources/DSWaveformImage/Renderers/LinearWaveformRenderer.swift) is used, which draws a linear 2D amplitude envelope.\n\n[`CircularWaveformRenderer`](https://github.com/dmrschmidt/DSWaveformImage/blob/main/Sources/DSWaveformImage/Renderers/CircularWaveformRenderer.swift) is available as an alternative, which can be passed to `WaveformView` or `WaveformLiveView` respectively. It draws a circular\n2D amplitude envelope.\n\nYou can implement your own renderer by implementing [`WaveformRenderer`](https://github.com/dmrschmidt/DSWaveformImage/blob/main/Sources/DSWaveformImage/Renderers/WaveformRenderer.swift).\n \nThe following styles can be applied to either renderer:\n - **filled**: Use a solid color for the waveform.\n - **outlined**: Draws the envelope as an outline with the provided thickness.\n - **gradient**: Use a gradient based on color for the waveform.\n - **gradientOutlined**: Use a gradient based on color for the waveform. 
Draws the envelope as an outline with the provided thickness.\n - **striped**: Use striped filling based on color for the waveform.\n\n\u003cdiv align=\"center\"\u003e\n  \u003cimg src=\"./Promotion/screenshot.png\" width=\"500\" alt=\"Screenshot\"\u003e\n\u003c/div\u003e\n\n\n### Live waveform rendering\nhttps://user-images.githubusercontent.com/69365/127739821-061a4345-0adc-4cc1-bfd6-f7cfbe1268c9.mov\n\n\nMigration\n---------\n### In 14.0.0\n* Minimum iOS Deployment target is 15.0, macOS is 12.0 to remove internal usage of deprecated APIs\n* `WaveformAnalyzer` and `WaveformImageDrawer` now return `Result\u003c[Float] | DSImage, Error\u003e` when used with completionHandler for better error handling\n* `WaveformAnalyzer` is now stateless and requires the URL in `.samples(fromAudioAt:count:qos:)` instead of its constructor\n* SwiftUI's `WaveformView` has a new constructor that provides optional access to the underlying `WaveformShape`, which is now used for rendering, see [#78](https://github.com/dmrschmidt/DSWaveformImage/issues/78)\n\n### In 13.0.0\n* Any mentions of `dampening` \u0026 similar were corrected to `damping` etc in [11460b8b](https://github.com/dmrschmidt/DSWaveformImage/commit/11460b8b8203f163868ba774d1533116d2fe68a1). Most notably in `Waveform.Configuration`. See [#64](https://github.com/dmrschmidt/DSWaveformImage/issues/64). \n* styles `.outlined` \u0026 `.gradientOutlined` were added to `Waveform.Style`, see https://github.com/dmrschmidt/DSWaveformImage#what-it-looks-like\n* `Waveform.Position` was removed. If you were using it to place the view somewhere, move this responsibility up to its parent for positioning, like with any other view as well.\n\n### In 12.0.0\n* The rendering pipeline was split out from the analysis. 
You can now create your own renderers by implementing [`WaveformRenderer`](https://github.com/dmrschmidt/DSWaveformImage/blob/main/Sources/DSWaveformImage/Renderers/WaveformRenderer.swift).\n* A new [`CircularWaveformRenderer`](https://github.com/dmrschmidt/DSWaveformImage/blob/main/Sources/DSWaveformImage/Renderers/CircularWaveformRenderer.swift) has been added.\n* `position` was removed from `Waveform.Configuration`, see [0447737](https://github.com/dmrschmidt/DSWaveformImage/commit/044773782092becec0424527f6feef061988db7a).\n* New `Waveform.Style` options have been added and need to be accounted for in `switch` statements etc.\n\n### In 11.0.0 \nthe library was split into two: `DSWaveformImage` and `DSWaveformImageViews`. If you've used any of the native views before, just add the additional `import DSWaveformImageViews`.\nThe SwiftUI views have changed from taking a Binding to the respective plain values instead.\n\n### In 9.0.0 \na few public APIs have been slightly changed to be more concise. All types have also been grouped under the `Waveform` enum-namespace. Meaning `WaveformConfiguration` for instance has become `Waveform.Configuration` and so on.\n\n### In 7.0.0 \ncolors have moved into associated values on the respective `style` enum.\n\n`Waveform` and the `UIImage` category have been removed in 6.0.0 to simplify the API.\nSee `Usage` for current usage.\n\n## See it live in action\n\n[SoundCard - postcards with sound](https://www.soundcard.io) lets you send real, physical postcards with audio messages. 
Right from your iOS device.\n\nDSWaveformImage is used to draw the waveforms of the audio messages that get printed on the postcards sent by [SoundCard - postcards with audio](https://www.soundcard.io).\n\n\u0026nbsp;\n\n\u003cdiv align=\"center\"\u003e\n    \u003ca href=\"http://bit.ly/soundcardio\"\u003e\n        \u003cimg src=\"./Promotion/appstore.svg\" alt=\"Download SoundCard\"\u003e\n        \nDownload SoundCard on the App Store.\n    \u003c/a\u003e\n\u003c/div\u003e\n\n\u0026nbsp;\n\n\u003ca href=\"http://bit.ly/soundcardio\"\u003e\n\u003cimg src=\"https://www.soundcard.io/images/opengraph-preview.jpg\" alt=\"Screenshot\"\u003e\n\u003c/a\u003e\n","funding_links":["https://github.com/sponsors/dmrschmidt","https://www.buymeacoffee.com/dmrschmidt"],"categories":["Swift","OOM-Leaks-Crash","Libraries"],"sub_categories":["Waver","SwiftUI"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdmrschmidt%2FDSWaveformImage","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdmrschmidt%2FDSWaveformImage","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdmrschmidt%2FDSWaveformImage/lists"}