{"id":13908322,"url":"https://github.com/ThibaultBee/StreamPack","last_synced_at":"2025-07-18T07:30:53.284Z","repository":{"id":37977786,"uuid":"262623449","full_name":"ThibaultBee/StreamPack","owner":"ThibaultBee","description":"Multiprotocol (SRT, RTMP and others) live streaming libraries for Android","archived":false,"fork":false,"pushed_at":"2025-07-09T10:23:25.000Z","size":9563,"stargazers_count":262,"open_issues_count":15,"forks_count":82,"subscribers_count":5,"default_branch":"main","last_synced_at":"2025-07-10T16:40:53.246Z","etag":null,"topics":["aac","android","av1","h264-avc","hevc","live-streaming","live-streaming-videos","opus","rtmp","srt","streaming","vp9"],"latest_commit_sha":null,"homepage":"https://thibaultbee.github.io/StreamPack","language":"Kotlin","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ThibaultBee.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE.md","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null},"funding":{"github":"ThibaultBee"}},"created_at":"2020-05-09T17:30:14.000Z","updated_at":"2025-07-09T19:28:44.000Z","dependencies_parsed_at":"2023-10-13T02:47:45.161Z","dependency_job_id":"83662d45-9d7d-480a-b4fa-9b9c8434f585","html_url":"https://github.com/ThibaultBee/StreamPack","commit_stats":null,"previous_names":[],"tags_count":24,"template":false,"template_full_name":null,"purl":"pkg:github/ThibaultBee/StreamPack","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThibaultBee%2FStreamPack","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThibaultBee%2FStreamPack/tags","rel
eases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThibaultBee%2FStreamPack/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThibaultBee%2FStreamPack/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ThibaultBee","download_url":"https://codeload.github.com/ThibaultBee/StreamPack/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ThibaultBee%2FStreamPack/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":265720436,"owners_count":23817237,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["aac","android","av1","h264-avc","hevc","live-streaming","live-streaming-videos","opus","rtmp","srt","streaming","vp9"],"created_at":"2024-08-06T23:02:38.431Z","updated_at":"2025-07-18T07:30:53.249Z","avatar_url":"https://github.com/ThibaultBee.png","language":"Kotlin","readme":"# StreamPack: RTMP and [SRT](https://github.com/Haivision/srt) live streaming SDK for Android\n\nStreamPack is a flexible live streaming library for Android made for both demanding video\nbroadcasters and new video enthusiasts.\n\nIt is designed to be used in live streaming and gaming apps.\n\n## Setup\n\nGet StreamPack core latest artifacts on Maven Central:\n\n```groovy\ndependencies {\n    implementation 'io.github.thibaultbee.streampack:streampack-core:3.0.0-RC2'\n    // For UI (incl. PreviewView)\n    implementation 'io.github.thibaultbee.streampack:streampack-ui:3.0.0-RC2'\n    // For services (incl. 
screen capture/media projection service)\n    implementation 'io.github.thibaultbee.streampack:streampack-services:3.0.0-RC2'\n    // For RTMP\n    implementation 'io.github.thibaultbee.streampack:streampack-rtmp:3.0.0-RC2'\n    // For SRT\n    implementation 'io.github.thibaultbee.streampack:streampack-srt:3.0.0-RC2'\n}\n```\n\n## Features\n\n* Video:\n    * Source: Cameras, Screen recorder\n      or [custom video source](docs/AdvancedStreamer.md#creates-your-custom-sources)\n    * Orientation: portrait or landscape\n    * Codec: HEVC/H.265, AVC/H.264, VP9 or AV1\n    * HDR (experimental, see https://github.com/ThibaultBee/StreamPack/discussions/91)\n    * Configurable bitrate, resolution, frame rate (tested up to 60), encoder level, encoder profile\n    * Video only mode\n    * Device video capabilities\n    * Switch between video sources\n    * Camera settings: auto-focus, exposure, white balance, zoom, flash,...\n* Audio:\n    * Source: Microphone, device audio\n      or [custom audio source](docs/AdvancedStreamer.md#creates-your-custom-sources)\n    * Codec: AAC:LC, HE, HEv2,... or Opus\n    * Configurable bitrate, sample rate, stereo/mono, data format\n    * Processing: Noise suppressor or echo cancellation\n    * Audio only mode\n    * Device audio capabilities\n    * Switch between audio sources\n* File: TS, FLV, MP4, WebM and Fragmented MP4\n    * Write to a single file or multiple chunk files\n* Streaming: RTMP/RTMPS or SRT\n    * [Record to a file and stream at the same time](docs/LiveAndRecordSimultaneously.md)\n    * Support for enhanced RTMP\n    * Ultra low-latency based on [SRT](https://github.com/Haivision/srt)\n    * Network adaptive bitrate mechanism for [SRT](https://github.com/Haivision/srt)\n\n## Quick start\n\nIf you want to create a new application, you should use the\ntemplate [StreamPack boilerplate](https://github.com/ThibaultBee/StreamPack-boilerplate). 
In 5\nminutes, you will be able to stream live video to your server.\n\n## Getting started\n\n### Getting started for a camera stream\n\n1. Request the required permissions in your Activity/Fragment. See the\n   [Permissions](#permissions) section for more information.\n\n2. Create a `View` to display the preview in your layout\n\n   As a camera preview, you can use a `SurfaceView`, a `TextureView` or any\n   `View` that can provide a `Surface`.\n\n   To simplify integration, StreamPack provides a `PreviewView` in the `streampack-ui` package.\n\n    ```xml\n    \n    \u003clayout\u003e\n        \u003cio.github.thibaultbee.streampack.views.PreviewView android:id=\"@+id/preview\"\n            android:layout_width=\"match_parent\" android:layout_height=\"match_parent\"\n            app:enableZoomOnPinch=\"true\" /\u003e\n    \u003c/layout\u003e\n    ```\n\n   `app:enableZoomOnPinch` is a boolean attribute that enables zoom on pinch gestures.\n\n3. Instantiate the streamer (the main live streaming class)\n\n   A `Streamer` represents a whole streaming pipeline from capture to endpoint (\n   incl. 
encoding, muxing, sending).\n   Multiple streamers are available depending on the number of independent outputs you want:\n    - `SingleStreamer`: for a single output (such as a live stream or a record)\n    - `DualStreamer`: for 2 independent outputs (such as an independent live stream and record)\n    - for more outputs, you can use the `StreamerPipeline` class, which lets you build more\n      complex pipelines with multiple independent outputs (such as audio in one file, video in\n      another file)\n\n   The `SingleStreamer` and the `DualStreamer` come with factories for `Camera` and\n   `MediaProjection` (for screen capture).\n   Otherwise, you can set the audio and the video source manually.\n\n    ```kotlin\n    /**\n     * Most StreamPack components are coroutine based.\n     * Suspend functions and flows have to be called from a coroutine scope.\n     * Android comes with coroutine scopes like `lifecycleScope` or `viewModelScope`.\n     * Call suspend functions from a coroutine scope:\n     *  viewModelScope.launch {\n     *  }\n     */\n    val streamer = cameraSingleStreamer(context = requireContext())\n    // To have multiple independent outputs (such as live and record), use a `cameraDualStreamer` or even the `StreamerPipeline`.\n    // You can also use the `SingleStreamer` or the `DualStreamer` and add the audio and video sources later with `setAudioSource`\n    // and `setVideoSource`.\n    ```\n\n   For more information, check the [Streamers](docs/Streamers.md) documentation.\n\n4. 
Configure the audio and video settings\n\n    ```kotlin\n    val streamer = cameraSingleStreamer(context = requireContext()) // Already instantiated streamer\n    \n    // Creates a new audio and video config\n    val audioConfig = AudioConfig(\n        startBitrate = 128000,\n        sampleRate = 44100,\n        channelConfig = AudioFormat.CHANNEL_IN_STEREO\n    )\n    \n    val videoConfig = VideoConfig(\n        startBitrate = 2000000, // 2 Mb/s\n        resolution = Size(1280, 720),\n        fps = 30\n    )\n    \n    // Sets the audio and video config\n    viewModelScope.launch {\n        streamer.setAudioConfig(audioConfig)\n        streamer.setVideoConfig(videoConfig)\n    }\n    ```\n\n5. Attach the streamer to the preview\n\n    ```kotlin\n    val streamer = cameraSingleStreamer(context = requireContext()) // Already instantiated streamer\n    val preview = findViewById\u003cPreviewView\u003e(R.id.preview) // Already inflated preview\n    /**\n     * If the preview is a `PreviewView`\n     */\n    preview.streamer = streamer\n    // Then start the preview\n    streamer.startPreview()\n    \n    /**\n     * Otherwise, if the preview is a [SurfaceView], a [TextureView], a [Surface],... you can use:\n     */\n    streamer.startPreview(preview)\n    ```\n\n6. Set the device orientation\n\n    ```kotlin\n    // Already instantiated streamer\n    val streamer = cameraSingleStreamer(context = requireContext())\n    \n    // Sets the device orientation\n    streamer.setTargetRotation(Surface.ROTATION_90) // Or Surface.ROTATION_0, Surface.ROTATION_180, Surface.ROTATION_270\n    ```\n\n   StreamPack comes with 2 `RotationProvider`s that fetch and listen to the device rotation:\n\n    - the `SensorRotationProvider`. The `SensorRotationProvider` is backed by the\n      `OrientationEventListener` and follows the device orientation.\n    - the `DisplayRotationProvider`. 
The `DisplayRotationProvider` is backed by the `DisplayManager`\n      and, if orientation is locked, it returns the last known orientation.\n\n    ```kotlin\n    val streamer = cameraSingleStreamer(context = requireContext()) // Already instantiated streamer\n    val rotationProvider = SensorRotationProvider(context = requireContext())\n    \n    // Forwards the device orientation to the streamer\n    val listener = object : IRotationProvider.Listener {\n        override fun onOrientationChanged(rotation: Int) {\n            streamer.setTargetRotation(rotation)\n        }\n    }\n    rotationProvider.addListener(listener)\n\n    // Don't forget to remove the listener when you don't need it anymore\n    rotationProvider.removeListener(listener)\n    ```\n\n   You can transform the `RotationProvider` into a `Flow` provider through `asFlowProvider`.\n\n   ```kotlin\n    val streamer = cameraSingleStreamer(context = requireContext()) // Already instantiated streamer\n    val rotationProvider = SensorRotationProvider(context = requireContext())\n   \n    // For coroutine based usage\n    val rotationFlowProvider = rotationProvider.asFlowProvider()\n    // Then, in a coroutine suspend function\n    rotationFlowProvider.rotationFlow.collect { rotation -\u003e\n        streamer.setTargetRotation(rotation)\n    }\n    ```\n\n   You can also create your own `targetRotation` provider.\n\n7. Start the live stream\n\n    ```kotlin\n    // Already instantiated streamer\n    val streamer = cameraSingleStreamer(context = requireContext())\n    \n    val descriptor =\n        UriMediaDescriptor(\"rtmps://serverip:1935/s/streamKey\") // For RTMP/RTMPS. 
Uri also supports SRT urls, file paths, content paths,...\n    /**\n     * Alternatively, you can use the object syntax:\n     * - RtmpMediaDescriptor(\"rtmps\", \"serverip\", 1935, \"s\", \"streamKey\") // For RTMP/RTMPS\n     * - SrtMediaDescriptor(\"serverip\", 1234) // For SRT\n     */\n    \n    streamer.startStream(descriptor)\n    // You can also use:\n    // streamer.startStream(\"rtmp://serverip:1935/s/streamKey\") // For RTMP/RTMPS\n    ```\n\n8. Stop and release the streamer\n\n    ```kotlin\n    // Already instantiated streamer\n    val streamer = cameraSingleStreamer(context = requireContext())\n    \n    streamer.stopStream()\n    streamer.close() // Disconnects from the server or closes the file\n    streamer.release()\n    ```\n\nFor a more detailed explanation, check out\nthe [documentation](#documentations).\n\nFor a complete example, check out the [demos/camera](demos/camera) directory.\n\n### Getting started for a screen recorder stream\n\n1. Add the `streampack-services` dependency in your `build.gradle` file:\n\n    ```groovy\n    dependencies {\n        implementation 'io.github.thibaultbee.streampack:streampack-services:3.0.0-RC2'\n    }\n    ```\n\n2. Request the required permissions in your Activity/Fragment. See the\n   [Permissions](#permissions) section for more information.\n3. Create a `MyService` that extends `MediaProjectionService` (so you can customize\n   notifications among other things).\n4. Create a screen capture `Intent` and request the activity result\n\n    ```kotlin\n    MediaProjectionUtils.createScreenCaptureIntent(context = requireContext())\n    ```\n\n5. 
Start the service\n\n    ```kotlin\n    MediaProjectionService.bindService(\n        requireContext(),\n        MyService::class.java,\n        result.resultCode,\n        result.data,\n        { streamer -\u003e\n            try {\n                configure(streamer)\n            } catch (t: Throwable) {\n                // Handle exception\n            }\n            startStream(streamer)\n        }\n    )\n    ```\n\nFor a complete example, check out the [demos/screenrecorder](demos/screenrecorder) directory.\n\n## Permissions\n\nYou need to add the following permissions in your `AndroidManifest.xml`:\n\n```xml\n\n\u003cmanifest\u003e\n    \u003c!-- Only for a live stream --\u003e\n    \u003cuses-permission android:name=\"android.permission.INTERNET\" /\u003e\n    \u003c!-- Only for a local record --\u003e\n    \u003cuses-permission android:name=\"android.permission.WRITE_EXTERNAL_STORAGE\" /\u003e\n\u003c/manifest\u003e\n```\n\nTo record locally, you also need to request the following dangerous\npermission: `android.permission.WRITE_EXTERNAL_STORAGE`.\n\n### Permissions for a camera stream\n\nTo use the camera, you need to request the following permissions:\n\n```xml\n\n\u003cmanifest\u003e\n    \u003cuses-permission android:name=\"android.permission.RECORD_AUDIO\" /\u003e\n    \u003cuses-permission android:name=\"android.permission.CAMERA\" /\u003e\n\u003c/manifest\u003e\n```\n\nYour application also has to request the following dangerous\npermissions: `android.permission.RECORD_AUDIO` and `android.permission.CAMERA`.\n\nFor the Play Store, your application should declare this in its `AndroidManifest.xml`:\n\n```xml\n\n\u003cmanifest\u003e\n    \u003cuses-feature android:name=\"android.hardware.camera\" android:required=\"true\" /\u003e\n    \u003cuses-feature android:name=\"android.hardware.camera.autofocus\" android:required=\"false\" /\u003e\n\u003c/manifest\u003e\n```\n\n### Permissions for a screen recorder stream\n\nTo use the screen recorder, you need to request the 
following permissions:\n\n```xml\n\n\u003cmanifest\u003e\n    \u003cuses-permission android:name=\"android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION\" /\u003e\n    \u003cuses-permission android:name=\"android.permission.FOREGROUND_SERVICE\" /\u003e\n    \u003cuses-permission android:name=\"android.permission.POST_NOTIFICATIONS\" /\u003e\n    \u003c!-- Only if you have to record audio --\u003e\n    \u003cuses-permission android:name=\"android.permission.RECORD_AUDIO\" /\u003e\n\u003c/manifest\u003e\n```\n\nYou will also have to declare the `Service`:\n\n```xml\n\n\u003capplication\u003e\n    \u003c!-- YourScreenRecorderService extends DefaultScreenRecorderService --\u003e\n    \u003cservice android:name=\".services.MyService\" android:exported=\"false\"\n        android:foregroundServiceType=\"mediaProjection\" /\u003e\n\u003c/application\u003e\n```\n\n## Documentations\n\n[StreamPack API guide](https://thibaultbee.github.io/StreamPack)\n\n- Additional documentation is available in the `docs` directory:\n    - [Live and record simultaneously](docs/LiveAndRecordSimultaneously.md)\n    - [Streamers](docs/Streamers.md)\n    - [Streamer elements](docs/AdvancedStreamer.md)\n\n## Demos\n\n### Camera and audio demo\n\nFor a source code example on how to use the camera and audio streamers,\ncheck [demos/camera](demos/camera). On\nfirst launch, you will have to set the RTMP url or SRT server IP in the settings menu.\n\n### Screen recorder demo\n\nFor a source code example on how to use the screen recorder streamer, check\nthe [demos/screenrecorder](demos/screenrecorder) directory. 
On first launch, you will have to set the RTMP url or SRT server IP in the settings menu.\n\n### Tests with an FFmpeg server\n\nFFmpeg has been used as an SRT server+demuxer+decoder for the tests.\n\n#### RTMP\n\nTell FFplay to listen on IP `0.0.0.0` and port `1935`:\n\n```\nffplay -listen 1 -i 'rtmp://0.0.0.0:1935/s/streamKey'\n```\n\nIn the StreamPack sample app settings, set `Endpoint` -\u003e `Type` to `Stream to a remote RTMP device`,\nthen set the server `URL` to `rtmp://serverip:1935/s/streamKey`. At this point, the StreamPack sample\napp should successfully send audio and video frames. On the FFplay side, you should be able to watch\nthis live stream.\n\n#### SRT\n\nTell FFplay to listen on IP `0.0.0.0` and port `9998`:\n\n```\nffplay -fflags nobuffer 'srt://0.0.0.0:9998?mode=listener'\n```\n\nIn the StreamPack sample app settings, set the server `IP` to your server IP and the server `Port` to `9998`.\nAt this point, the StreamPack sample app should successfully send audio and video frames. On the FFplay\nside, you should be able to watch this live stream.\n\n## Tips\n\n### RTMP or SRT\n\nRTMP and SRT are both live streaming protocols. SRT is a modern UDP-based protocol; it is\nreliable and ultra low latency. RTMP is a TCP-based protocol; it is also reliable but only low\nlatency.\nThere are already a lot of comparisons on the Internet, so here is a summary:\n\n* SRT:\n    - Ultra low latency (\u003c 1 s)\n* RTMP:\n    - Low latency (2 - 3 s)\n\nSo, the main question is: \"which protocol should I use?\"\nIt is easy: if your server has SRT support, use SRT; otherwise, use RTMP.\n\n### Get device and protocol capabilities\n\nHave you ever wondered: \"What are the supported resolutions of my cameras?\" or \"What are the supported\nsample rates of my audio codecs?\" The `Info` classes are made for this. 
Each `Streamer` comes with a\nspecific `Info` object:\n\n```kotlin\nval info = streamer.getInfo(MediaDescriptor(\"rtmps://serverip:1935/s/streamKey\"))\n```\n\nFor a static endpoint or an opened dynamic endpoint, you can get the info directly:\n\n```kotlin\nval info = streamer.info\n```\n\n### Element specific configuration\n\nIf you are looking for more settings on a streamer, such as the exposure compensation of your camera, have\na look at the `Settings` class. Each `Streamer` element (such as `IVideoSource`,\n`IAudioSource`,...)\ncomes with a public interface that gives access to specific information or configuration.\n\nExample: if the video source can be cast to the `ICameraSource` interface, you get access to\n`settings`,\nwhich allows you to get and set the current camera settings:\n\n```kotlin\n(streamer.videoSource as ICameraSource).settings\n```\n\nExample: to change the exposure compensation of your camera on a camera streamer,\nyou can do it like this:\n\n```kotlin\n(streamer.videoSource as ICameraSource).settings.exposure.compensation = value\n```\n\nMoreover, you can retrieve the exposure range and step with:\n\n```kotlin\n(streamer.videoSource as ICameraSource).settings.exposure.availableCompensationRange\n(streamer.videoSource as ICameraSource).settings.exposure.availableCompensationStep\n```\n\nSee [docs/AdvancedStreamer.md](docs/AdvancedStreamer.md#element-specific-configuration) for more\ninformation.\n\n### Android versions\n\nEven though the StreamPack SDK supports a `minSdkVersion` of 21, I strongly recommend setting the\n`minSdkVersion` of your application to a higher version (the higher, the better!) 
for better\nperformance.\n\n## Licence\n\n    Copyright 2021 Thibault B.\n\n    Licensed under the Apache License, Version 2.0 (the \"License\");\n    you may not use this file except in compliance with the License.\n    You may obtain a copy of the License at\n\n       http://www.apache.org/licenses/LICENSE-2.0\n\n    Unless required by applicable law or agreed to in writing, software\n    distributed under the License is distributed on an \"AS IS\" BASIS,\n    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n    See the License for the specific language governing permissions and\n    limitations under the License.\n","funding_links":["https://github.com/sponsors/ThibaultBee"],"categories":["HarmonyOS"],"sub_categories":["Windows Manager"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FThibaultBee%2FStreamPack","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FThibaultBee%2FStreamPack","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FThibaultBee%2FStreamPack/lists"}