{"id":16143482,"url":"https://github.com/lhns/fs2-compress","last_synced_at":"2025-03-18T17:31:24.036Z","repository":{"id":65221309,"uuid":"587677960","full_name":"lhns/fs2-compress","owner":"lhns","description":"Compression Algorithms for Fs2","archived":false,"fork":false,"pushed_at":"2024-04-13T00:56:25.000Z","size":192,"stargazers_count":31,"open_issues_count":3,"forks_count":1,"subscribers_count":2,"default_branch":"main","last_synced_at":"2024-04-14T05:39:38.138Z","etag":null,"topics":["bzip2","compression","fs2","gzip","scala","tar","zip","zstd"],"latest_commit_sha":null,"homepage":"","language":"Scala","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/lhns.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null}},"created_at":"2023-01-11T10:19:13.000Z","updated_at":"2024-04-15T09:36:02.544Z","dependencies_parsed_at":"2023-02-16T14:00:19.815Z","dependency_job_id":"c932721e-3be7-4253-b439-8f43ba24a04f","html_url":"https://github.com/lhns/fs2-compress","commit_stats":null,"previous_names":[],"tags_count":8,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lhns%2Ffs2-compress","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lhns%2Ffs2-compress/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lhns%2Ffs2-compress/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lhns%2Ffs2-compress/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/lhns","download_url":"https://codeload.github.com/l
hns/fs2-compress/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":221714786,"owners_count":16868494,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["bzip2","compression","fs2","gzip","scala","tar","zip","zstd"],"created_at":"2024-10-10T00:09:22.478Z","updated_at":"2025-03-18T17:31:24.030Z","avatar_url":"https://github.com/lhns.png","language":"Scala","readme":"# fs2-compress\n\n[![Typelevel Affiliate Project](https://img.shields.io/badge/typelevel-affiliate%20project-FFB4B5.svg)](https://typelevel.org/projects/)\n[![build](https://github.com/lhns/fs2-compress/actions/workflows/build.yml/badge.svg)](https://github.com/lhns/fs2-compress/actions/workflows/build.yml)\n[![Release Notes](https://img.shields.io/github/release/lhns/fs2-compress.svg?maxAge=3600)](https://github.com/lhns/fs2-compress/releases/latest)\n[![Maven Central](https://img.shields.io/maven-central/v/de.lhns/fs2-compress_2.13)](https://search.maven.org/artifact/de.lhns/fs2-compress_2.13)\n[![Apache License 2.0](https://img.shields.io/github/license/lhns/fs2-compress.svg?maxAge=3600)](https://www.apache.org/licenses/LICENSE-2.0)\n[![Scala Steward 
badge](https://img.shields.io/badge/Scala_Steward-helping-blue.svg?style=flat\u0026logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAA4AAAAQCAMAAAARSr4IAAAAVFBMVEUAAACHjojlOy5NWlrKzcYRKjGFjIbp293YycuLa3pYY2LSqql4f3pCUFTgSjNodYRmcXUsPD/NTTbjRS+2jomhgnzNc223cGvZS0HaSD0XLjbaSjElhIr+AAAAAXRSTlMAQObYZgAAAHlJREFUCNdNyosOwyAIhWHAQS1Vt7a77/3fcxxdmv0xwmckutAR1nkm4ggbyEcg/wWmlGLDAA3oL50xi6fk5ffZ3E2E3QfZDCcCN2YtbEWZt+Drc6u6rlqv7Uk0LdKqqr5rk2UCRXOk0vmQKGfc94nOJyQjouF9H/wCc9gECEYfONoAAAAASUVORK5CYII=)](https://scala-steward.org)\n\nIntegrations for several compression algorithms with [Fs2](https://github.com/typelevel/fs2).\n\n## Usage\n\n### build.sbt\n\n```sbt\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-gzip\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-zip\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-zip4j\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-tar\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-bzip2\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-zstd\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-brotli\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-brotli4j\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-lz4\" % \"2.3.0\"\nlibraryDependencies += \"de.lhns\" %% \"fs2-compress-snappy\" % \"2.3.0\"\n```\n\n## Concepts\n\nThis library introduces the following abstractions in order to work with several different compression algorithms and\narchive methods.\n\n### Compression\n\n#### Compressor\n\nThe `Compressor` typeclass abstracts the compression of a stream of bytes.\n```scala\ntrait Compressor[F[_]] {\n  def compress: Pipe[F, Byte, Byte]\n}\n```\nPassing a stream of bytes through the `Compressor.compress` pipe will result in a compressed stream of bytes. 
:tada:\n\n#### Decompressor\n\nThe `Decompressor` typeclass abstracts the decompression of a stream of bytes.\n```scala\ntrait Decompressor[F[_]] {\n  def decompress: Pipe[F, Byte, Byte]\n}\n```\nPassing a stream of bytes through the `Decompressor.decompress` pipe will result in a decompressed stream of bytes. :tada:\n\n### Archives\n\nThe library also provides abstractions for working with archive formats. An archive is a collection of files and directories\nwhich may or may not also include compression depending on the archive format.\n\n#### ArchiveEntry\n\nAn `ArchiveEntry` represents a file or directory in an archive. It has the following signature:\n```scala\ncase class ArchiveEntry[+Size[A] \u003c: Option[A], Underlying](name: String, uncompressedSize: Size[Long], underlying: Underlying, ...)\n```\nThe `Size` type parameter is used to encode whether the size of the entry is known or not. For some archive formats the size\nof an entry must be known in advance, and as such the relevant `Archiver` will require that the `Size` type parameter is `Some`.\n\n#### Archiver\n\nThe `Archiver` typeclass abstracts the creation of an archive from a stream of `ArchiveEntry` paired with the relevant data.\n```scala\ntrait Archiver[F[_], Size[A] \u003c: Option[A]] {\n  def archive: Pipe[F, (ArchiveEntry[Size, Any], Stream[F, Byte]), Byte]\n}\n```\n\n#### Unarchiver\n\nThe `Unarchiver` typeclass abstracts the extraction of an archive into a stream of `ArchiveEntry` paired with the relevant data.\n```scala\ntrait Unarchiver[F[_], Size[A] \u003c: Option[A], Underlying] {\n  def unarchive: Pipe[F, Byte, (ArchiveEntry[Size, Underlying], Stream[F, Byte])]\n}\n```\n\n## Examples\n\nThe following examples do not check that the paths used are valid. For real-world applications you will probably want to\nadd some checks to that effect.\n\n### Compression\n\nCompression can be abstracted over using the `Compressor` typeclass. Adapt the following examples based on which compression algorithm you want to use.\n```scala\nimport cats.effect.Async\nimport de.lhns.fs2.compress._\nimport fs2.io.file.{Files, Path}\n\n// implicit def bzip2[F[_]: Async]: Compressor[F] = Bzip2Compressor.make()\n// implicit def lz4[F[_]: Async]: Compressor[F] = Lz4Compressor.make()\n// implicit def zstd[F[_]: Async]: Compressor[F] = ZstdCompressor.make()\nimplicit def gzip[F[_]: Async]: Compressor[F] = GzipCompressor.make()\n\ndef compressFile[F[_]: Async](toCompress: Path, writeTo: Path)(implicit compressor: Compressor[F]): F[Unit] =\n  Files[F]\n    .readAll(toCompress)\n    .through(compressor.compress)\n    .through(Files[F].writeAll(writeTo))\n    .compile\n    .drain\n```\n\n### Decompression\n\nSimilarly, decompression can be abstracted over using the `Decompressor` typeclass. Adapt the following examples based on which compression algorithm was used to write the source file.\n```scala\nimport cats.effect.Async\nimport de.lhns.fs2.compress._\nimport fs2.io.file.{Files, Path}\n\n// implicit def brotli[F[_]: Async]: Decompressor[F] = BrotliDecompressor.make()\n// implicit def bzip2[F[_]: Async]: Decompressor[F] = Bzip2Decompressor.make()\n// implicit def lz4[F[_]: Async]: Decompressor[F] = Lz4Decompressor.make()\n// implicit def zstd[F[_]: Async]: Decompressor[F] = ZstdDecompressor.make()\nimplicit def gzip[F[_]: Async]: Decompressor[F] = GzipDecompressor.make()\n\ndef decompressFile[F[_]: Async](toDecompress: Path, writeTo: Path)(implicit decompressor: Decompressor[F]): F[Unit] =\n  Files[F]\n    .readAll(toDecompress)\n    .through(decompressor.decompress)\n    .through(Files[F].writeAll(writeTo))\n    .compile\n    .drain\n```\n\n### Archiving\n\nThe library supports both `.zip` and `.tar` archives, with support for `.zip` through both the native Java implementation and the [zip4j](https://github.com/srikanth-lingala/zip4j) library.\n\n```scala\nimport cats.effect.Async\nimport de.lhns.fs2.compress._\nimport fs2.io.file.{Files, Path}\n\n// implicit def tar[F[_]: Async]: Archiver[F, Some] = TarArchiver.make()\n// implicit def zip4j[F[_]: Async]: Archiver[F, Some] = Zip4JArchiver.make()\nimplicit def zip[F[_]: Async]: Archiver[F, Option] = ZipArchiver.makeDeflated()\n\ndef archiveDirectory[F[_]: Async](directory: Path, writeTo: Path)(implicit archiver: Archiver[F, Option]): F[Unit] =\n  Files[F]\n    .list(directory)\n    .evalMap { path =\u003e\n      Files[F]\n        .size(path)\n        .map { size =\u003e\n          // Name the entry based on the relative path between the source directory and the file\n          val name = directory.relativize(path).toString\n          ArchiveEntry[Some, Unit](name, uncompressedSize = Some(size)) -\u003e Files[F].readAll(path)\n        }\n    }\n    .through(archiver.archive)\n    .through(Files[F].writeAll(writeTo))\n    .compile\n    .drain\n```\nNote that `.tar` doesn't compress the archive, so to create a `.tar.gz` file you will have to combine the archiver with\nthe `GzipCompressor`.\n\n```scala\nimport cats.effect.Async\nimport de.lhns.fs2.compress._\nimport fs2.io.file.{Files, Path}\n\nimplicit def gzip[F[_]: Async]: Compressor[F] = GzipCompressor.make()\nimplicit def tar[F[_]: Async]: Archiver[F, Some] = TarArchiver.make()\n\ndef tarAndGzip[F[_]: Async](directory: Path, writeTo: Path)(implicit archiver: Archiver[F, Some], compressor: Compressor[F]): F[Unit] =\n  Files[F]\n    .list(directory)\n    .evalMap { path =\u003e\n      Files[F]\n        .size(path)\n        .map { size =\u003e\n          // Name the entry based on the relative path between the source directory and the file\n          val name = directory.relativize(path).toString\n          ArchiveEntry[Some, Unit](name, uncompressedSize = Some(size)) -\u003e Files[F].readAll(path)\n        }\n    }\n    .through(archiver.archive)\n    .through(compressor.compress)\n    .through(Files[F].writeAll(writeTo))\n    .compile\n    .drain\n```\n\n### Unarchiving\n\nTo unarchive we use the `Unarchiver` typeclass matching our archive format.\n\n```scala\nimport cats.effect.Async\nimport de.lhns.fs2.compress._\nimport fs2.io.file.{Files, Path}\n\n// implicit def tar[F[_]: Async]: Unarchiver[F, Option] = TarUnarchiver.make()\n// implicit def zip4j[F[_]: Async]: Unarchiver[F, Option] = Zip4JUnarchiver.make()\nimplicit def zip[F[_]: Async]: Unarchiver[F, Option] = ZipUnarchiver.make()\n\ndef unArchive[F[_]: Async](archive: Path, writeTo: Path)(implicit unarchiver: Unarchiver[F, Option]): F[Unit] =\n  Files[F]\n    .readAll(archive)\n    .through(unarchiver.unarchive)\n    .flatMap { case (entry, data) =\u003e\n      data.through(Files[F].writeAll(writeTo.resolve(entry.name)))\n    }\n    .compile\n    .drain\n```\nOnce again, if you have a `.tar.gz` file you will have to combine the `Unarchiver` with the `GzipDecompressor`.\n\n```scala\nimport cats.effect.Async\nimport de.lhns.fs2.compress._\nimport fs2.io.file.{Files, Path}\n\nimplicit def gzip[F[_]: Async]: Decompressor[F] = GzipDecompressor.make()\nimplicit def tar[F[_]: Async]: Unarchiver[F, Option] = TarUnarchiver.make()\n\ndef unArchive[F[_]: Async](archive: Path, writeTo: Path)(implicit unarchiver: Unarchiver[F, Option], decompressor: Decompressor[F]): F[Unit] =\n  Files[F]\n    .readAll(archive)\n    .through(decompressor.decompress)\n    .through(unarchiver.unarchive)\n    .flatMap { case (entry, data) =\u003e\n      data.through(Files[F].writeAll(writeTo.resolve(entry.name)))\n    }\n    .compile\n    .drain\n```\n\n## License\n\nThis project uses the Apache 2.0 License. See the file called LICENSE.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flhns%2Ffs2-compress","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Flhns%2Ffs2-compress","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flhns%2Ffs2-compress/lists"}