{"id":37022664,"url":"https://github.com/exini/dicom-streams","last_synced_at":"2026-01-14T02:42:36.516Z","repository":{"id":34463032,"uuid":"177833646","full_name":"exini/dicom-streams","owner":"exini","description":"A streaming and non-blocking API for reading and processing DICOM data","archived":false,"fork":true,"pushed_at":"2024-11-18T20:42:38.000Z","size":3768,"stargazers_count":10,"open_issues_count":1,"forks_count":3,"subscribers_count":1,"default_branch":"develop","last_synced_at":"2024-11-18T21:49:36.381Z","etag":null,"topics":["akka-streams","backpressure","dicom"],"latest_commit_sha":null,"homepage":"","language":"Scala","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":"slicebox/dicom-streams","license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/exini.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2019-03-26T17:04:36.000Z","updated_at":"2024-09-20T18:19:54.000Z","dependencies_parsed_at":null,"dependency_job_id":null,"html_url":"https://github.com/exini/dicom-streams","commit_stats":null,"previous_names":[],"tags_count":24,"template":false,"template_full_name":null,"purl":"pkg:github/exini/dicom-streams","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exini%2Fdicom-streams","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exini%2Fdicom-streams/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exini%2Fdicom-streams/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exini%2Fdicom-streams/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/exini","download_url":"https://codeload.github.com/exini/dicom-streams/tar.gz/refs/heads/develop","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/exini%2Fdicom-streams/sbom","scorecard":{"id":387931,"data":{"date":"2025-08-11","repo":{"name":"github.com/exini/dicom-streams","commit":"b73e2830caeb471dac66790951e12023b5c4b352"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":4.2,"checks":[{"name":"Maintained","score":0,"reason":"0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0","details":null,"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Code-Review","score":3,"reason":"Found 4/12 approved changesets -- score normalized to 3","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Warn: no topLevel permission defined: .github/workflows/release.yml:1","Warn: no topLevel permission defined: .github/workflows/test.yml:1","Info: no jobLevel write permissions found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least 
privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Pinned-Dependencies","score":0,"reason":"dependency not pinned by hash detected -- score normalized to 0","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/release.yml:10: update your workflow using https://app.stepsecurity.io/secureworkflow/exini/dicom-streams/release.yml/develop?enable=pin","Warn: third-party GitHubAction not pinned by hash: .github/workflows/release.yml:13: update your workflow using https://app.stepsecurity.io/secureworkflow/exini/dicom-streams/release.yml/develop?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/test.yml:17: update your workflow using https://app.stepsecurity.io/secureworkflow/exini/dicom-streams/test.yml/develop?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/test.yml:23: update your workflow using https://app.stepsecurity.io/secureworkflow/exini/dicom-streams/test.yml/develop?enable=pin","Info:   0 out of   3 GitHub-owned GitHubAction dependencies pinned","Info:   0 out of   1 third-party GitHubAction dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"Vulnerabilities","score":10,"reason":"0 existing vulnerabilities detected","details":null,"documentation":{"short":"Determines if the project has open, known unfixed vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses 
fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"License","score":10,"reason":"license file detected","details":["Info: project has a license file: LICENSE:0","Info: FSF or OSI recognized license: Apache License 2.0: LICENSE:0"],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":5,"reason":"branch protection is not maximal on development and all release branches","details":["Info: 'allow deletion' disabled on branch 'develop'","Info: 'allow deletion' disabled on branch 'master'","Info: 'force pushes' disabled on branch 'develop'","Info: 'force pushes' disabled on branch 'master'","Warn: 'branch protection settings apply to administrators' is disabled on branch 'develop'","Warn: 'branch protection settings apply to administrators' is disabled on branch 'master'","Warn: 'stale review dismissal' is disabled on branch 'develop'","Warn: 'stale review dismissal' is disabled on branch 'master'","Warn: required approving review count is 1 on branch 'develop'","Warn: required approving review count is 1 on branch 'master'","Warn: codeowners review is not required on branch 'develop'","Warn: codeowners review is not required on branch 'master'","Warn: 'last push approval' is disabled on branch 'develop'","Warn: 'last push approval' is disabled on branch 'master'","Warn: no status checks found to merge onto branch 'develop'","Warn: no status checks found to merge onto branch 'master'","Info: PRs are required in order to make changes on branch 'develop'","Info: PRs are required in order to make changes on branch 'master'"],"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 24 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code 
analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}}]},"last_synced_at":"2025-08-18T17:07:17.049Z","repository_id":34463032,"created_at":"2025-08-18T17:07:17.049Z","updated_at":"2025-08-18T17:07:17.049Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":28408712,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-01-14T01:52:23.358Z","status":"online","status_checked_at":"2026-01-14T02:00:06.678Z","response_time":107,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["akka-streams","backpressure","dicom"],"created_at":"2026-01-14T02:42:35.842Z","updated_at":"2026-01-14T02:42:36.502Z","avatar_url":"https://github.com/exini.png","language":"Scala","funding_links":[],"categories":[],"sub_categories":[],"readme":"# dicom-streams\n\nService | Status | Description\n------- | ------ | -----------\nGitter            | [![Join the chat at https://gitter.im/exini/dicom-streams](https://badges.gitter.im/exini/dicom-streams.svg)](https://gitter.im/exini/dicom-streams?utm_source=badge\u0026utm_medium=badge\u0026utm_campaign=pr-badge\u0026utm_content=badge) | Chatroom\n\nThe purpose of this project is to create a streaming API for reading and processing DICOM data using [pekko-streams](https://pekko.apache.org/docs/pekko/current/stream/index.html). \n\nAdvantages of streaming DICOM data include better control over resource allocation such as memory via strict bounds on\nDICOM data chunk size and network utilization using back-pressure as specified in the\n[Reactive Streams](http://www.reactive-streams.org/) protocol.\n\nThe library is split in two projects. The `data` project defines data structures and common functionality with minimal\ndependencies and offers synchronous parsing of (possibly chunked) binary data. The `streams` project provides \nfunctionality for streaming DICOM data and pulls in [Pekko](https://pekko.apache.org) as a required dependency.\n\nThe logic of parsing and handling DICOM data is inspired by [dcm4che](https://github.com/dcm4che/dcm4che)\nwhich provides a far more complete (albeit blocking and synchronous) implementation of the DICOM standard.\n\n### Setup\n\nThe dicom-streams library is deployed to Sonatype. You need to include the Sonatype resolvers to find the package.\n\n```scala\nresolvers ++= Seq(Resolver.sonatypeRepo(\"releases\"), Resolver.sonatypeRepo(\"snapshots\"))\n```\n\nThe library is included by\n```scala\nlibraryDependencies += \"com.exini\" %% \"dicom-streams\" % \"x.y.z\"\n```\nIf you want to use the basic data package without the streams functionality use\n```scala\nlibraryDependencies += \"com.exini\" %% \"dicom-data\" % \"x.y.z\"\n```\n\n### Data Model\n\nStreaming binary DICOM data may originate from many different sources such as files, a HTTP POST request, or a read from\na database. Pekko Streams provide a multitude of connectors for streaming binary data. Streaming data arrives in chunks. 
The `Element` interface provides a set of higher-level data classes, each roughly corresponding to one row in a textual dump of a DICOM file. Here, chunks are aggregated into complete data elements. There are representations for standard tag-value elements, sequence and item start elements, sequence and item delimitation elements, fragments start elements, etc. A `DicomPart` stream is transformed into an `Element` stream via the `elementFlow` flow.

A flow of `Element`s can be materialized into a representation of a dataset called an `Elements` using the `elementSink` sink. For processing of large sets of data, one should strive for a fully streaming DICOM pipeline; however, in some cases it is convenient to work with a plain dataset, and `Elements` serves this purpose. Internally, the sink aggregates `Element`s into `ElementSet`s, each with an associated tag number (value elements, sequences and fragments). `Elements` implements a straightforward data hierarchy:
* An `Elements` holds a list of `ElementSet`s (`ValueElement`, `Sequence` and `Fragments`)
* A `ValueElement` is a standard attribute with tag number and binary value
* A `Sequence` holds a list of `Item`s
  * An `Item` contains zero or one `Elements` (note the recursion)
* A `Fragments` holds a list of `Fragment`s
  * A `Fragment` holds a binary value

The following diagram shows an overview of the data model at the `DicomPart`, `Element` and `ElementSet` levels.

![Data model](README/data-model.png)

As seen, a standard attribute, represented by the `ValueElement` class, is composed of one `HeaderPart` followed by zero, one or more `ValueChunk`s of data. Likewise, encapsulated data such as a JPEG image is composed of one `FragmentsPart` followed by, for each fragment, one `ItemPart` followed by `ValueChunk`s of data, and ends with a `SequenceDelimitationPart`.

### Examples

The following example reads a DICOM file from disk, validates that it is a DICOM file, discards all private elements and writes it to a new file.

```scala
FileIO.fromPath(Paths.get("source-file.dcm"))
  .via(parseFlow)
  .via(tagFilter(tagPath => !tagPath.toList.map(_.tag).exists(isPrivate))) // no private elements anywhere on tag path
  .map(_.bytes)
  .runWith(FileIO.toPath(Paths.get("target-file.dcm")))
```
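Tying back to the Data Model section, here is a minimal sketch of folding a whole file into an `Elements` dataset via `elementFlow` and `elementSink`. The library import paths and the `getString` accessor on `Elements` are assumptions about the API rather than documented usage.

```scala
import java.nio.file.Paths
import scala.concurrent.Future

import org.apache.pekko.actor.ActorSystem
import org.apache.pekko.stream.scaladsl.FileIO

// Assumed package paths for the library imports:
import com.exini.dicom.data.{ Elements, Tag }
import com.exini.dicom.streams.ElementFlows.elementFlow
import com.exini.dicom.streams.ElementSink.elementSink
import com.exini.dicom.streams.ParseFlow.parseFlow

implicit val system: ActorSystem = ActorSystem()
import system.dispatcher // execution context for the foreach callback

// Parse the file, aggregate DicomParts into Elements, and fold the
// Element stream into a plain Elements dataset.
val futureElements: Future[Elements] =
  FileIO.fromPath(Paths.get("source-file.dcm"))
    .via(parseFlow)
    .via(elementFlow)
    .runWith(elementSink)

// getString is an assumed accessor returning the string value, if any,
// stored under the given tag.
futureElements.foreach(elements => println(elements.getString(Tag.PatientName)))
```

Note that materializing an `Elements` holds the whole dataset in memory; for large files, prefer keeping the pipeline streaming end to end as in the examples above.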
Care should be taken when modifying DICOM data so that the resulting data is still valid. For instance, group length tags may need to be removed or updated after modifying elements. Here is an example that modifies the `PatientName` and `SOPInstanceUID` attributes. To ensure the resulting data is valid, group length tags in the dataset are removed and the group length tag of the meta information is updated.

```scala
val updatedSOPInstanceUID = padToEvenLength(createUID().utf8Bytes, VR.UI)

FileIO.fromPath(Paths.get("source-file.dcm"))
  .via(parseFlow)
  .via(groupLengthDiscardFilter) // discard group length elements in dataset
  .via(modifyFlow(
    Seq(
      TagModification.endsWith(TagPath.fromTag(Tag.PatientName), _ => padToEvenLength("John Doe".utf8Bytes, VR.PN)),
      TagModification.endsWith(TagPath.fromTag(Tag.MediaStorageSOPInstanceUID), _ => updatedSOPInstanceUID)
    ),
    Seq(
      TagInsertion(TagPath.fromTag(Tag.SOPInstanceUID), _ => updatedSOPInstanceUID)
    )
  ))
  .via(fmiGroupLengthFlow) // update group length in meta information, if present
  .map(_.bytes)
  .runWith(FileIO.toPath(Paths.get("target-file.dcm")))
```

### Custom Processing

New non-trivial DICOM flows can be built using a modular system of capabilities that are mixed in, as appropriate, with a core class implementing a common base interface. The base interface for DICOM flows is `DicomFlow`. It has an associated logic class, `DicomLogic`, which should be extended alongside it and is where the state and logic of the flow reside. The `DicomLogic` base class defines a series of events, one for each type of `DicomPart` that is produced when parsing DICOM data with `DicomParseFlow`. The core events are:
```scala
  def onPreamble(part: PreamblePart): List[DicomPart]
  def onHeader(part: HeaderPart): List[DicomPart]
  def onValueChunk(part: ValueChunk): List[DicomPart]
  def onSequence(part: SequencePart): List[DicomPart]
  def onSequenceDelimitation(part: SequenceDelimitationPart): List[DicomPart]
  def onFragments(part: FragmentsPart): List[DicomPart]
  def onItem(part: ItemPart): List[DicomPart]
  def onItemDelimitation(part: ItemDelimitationPart): List[DicomPart]
  def onDeflatedChunk(part: DeflatedChunk): List[DicomPart]
  def onUnknown(part: UnknownPart): List[DicomPart]
  def onPart(part: DicomPart): List[DicomPart]
```
Default behavior for these events is implemented in core classes. The most natural behavior is to simply pass parts on down the stream, e.g.
```scala
  def onPreamble(part: PreamblePart): List[DicomPart] = part :: Nil
  def onHeader(part: HeaderPart): List[DicomPart] = part :: Nil
  ...
```
This behavior is implemented in the `IdentityFlow` core class. Another option is to defer handling to the `onPart` method, which is implemented in the `DeferToPartFlow` core class. This is appropriate for flows that define a common behavior for all part types.
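As a warm-up, here is a minimal sketch of a diagnostic flow that logs each header while leaving the stream untouched. It assumes an `IdentityLogic` counterpart to `IdentityFlow`, mirroring the `DeferToPartFlow`/`DeferToPartLogic` pairing in the example below, and assumes `HeaderPart` exposes `tag` and `length` fields.
```scala
  def logHeadersFlow() = new IdentityFlow {
    override def createLogic(attr: Attributes): GraphStageLogic = new IdentityLogic {
      // Print each header's tag and value length, then pass the part on unchanged
      override def onHeader(part: HeaderPart): List[DicomPart] = {
        println(f"header tag ${part.tag}%08X, length ${part.length}")
        part :: Nil
      }
    }
  }
```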
To give an example of a custom flow, here is the implementation of a filter that removes nested sequences from a dataset. We define a nested sequence as one with `depth > 1`, given that the root dataset has `depth = 0`.
```scala
  def nestedSequencesFilter() = new DeferToPartFlow[DicomPart] with TagPathTracking[DicomPart] {
    override def createLogic(attr: Attributes): GraphStageLogic = new DeferToPartLogic with TagPathTrackingLogic {
      override def onPart(part: DicomPart): List[DicomPart] = if (tagPath.depth > 1) Nil else part :: Nil
    }
  }
```
In this example, we chose `DeferToPartFlow` as the core class and mixed in the `TagPathTracking` capability, which gives access at all times to a `tagPath: TagPath` variable that is automatically updated as the flow progresses.

### Releasing

The plugin [sbt-ci-release](https://github.com/olafurpg/sbt-ci-release) is used to manage releasing to Sonatype. The library is automatically released on each push to the master branch. The version is derived from the tag closest to the latest commit on master. Given this, the following procedure should be used:
1. Merge develop into master via a new branch
1. Tag the new release as `vx.y.z`, e.g. using `git tag v3.2.1`, following semantic versioning recommendations
1. Push tags to origin with `git push --tags`
1. Push the release commit to origin with `git push`; this triggers the release using the tagged version
1. From the GitHub project page, create a new release using the existing release tag, describing your changes
1. Merge master into develop via a new branch

### License

This project is released under the [Apache License, version 2.0](./LICENSE).