{"id":28250252,"url":"https://github.com/spring-attic/spring-cloud-dataflow","last_synced_at":"2025-06-14T01:31:16.887Z","repository":{"id":35210526,"uuid":"39469487","full_name":"spring-attic/spring-cloud-dataflow","owner":"spring-attic","description":"A microservices-based Streaming and Batch data processing in Cloud Foundry and Kubernetes","archived":true,"fork":false,"pushed_at":"2025-04-30T15:43:28.000Z","size":71614,"stargazers_count":1133,"open_issues_count":303,"forks_count":590,"subscribers_count":94,"default_branch":"main","last_synced_at":"2025-06-10T07:05:24.099Z","etag":null,"topics":["batch-processing","cloud-native","datapipelines","microservices-architecture","orchestration","predictive-analytics","stream-processing"],"latest_commit_sha":null,"homepage":"https://dataflow.spring.io","language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/spring-attic.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.adoc","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2015-07-21T20:56:11.000Z","updated_at":"2025-06-05T05:56:03.000Z","dependencies_parsed_at":"2023-02-18T08:15:56.265Z","dependency_job_id":"673e80b5-de32-4acd-8576-dde802a1b70d","html_url":"https://github.com/spring-attic/spring-cloud-dataflow","commit_stats":{"total_commits":4787,"total_committers":139,"mean_commits":34.43884892086331,"dds":0.8140797994568624,"last_synced_commit":"d6fc1df014e84e5f33e660e2bdafcdcb99574255"},"previous_names":["spring-attic/spring-cloud-dataflow","spring-cloud/spring-cloud-dataflow"],"tags_count":136,"template":false,"template_full_name":n
ull,"purl":"pkg:github/spring-attic/spring-cloud-dataflow","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/spring-attic%2Fspring-cloud-dataflow","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/spring-attic%2Fspring-cloud-dataflow/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/spring-attic%2Fspring-cloud-dataflow/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/spring-attic%2Fspring-cloud-dataflow/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/spring-attic","download_url":"https://codeload.github.com/spring-attic/spring-cloud-dataflow/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/spring-attic%2Fspring-cloud-dataflow/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":259745050,"owners_count":22905034,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["batch-processing","cloud-native","datapipelines","microservices-architecture","orchestration","predictive-analytics","stream-processing"],"created_at":"2025-05-19T14:11:21.220Z","updated_at":"2025-06-14T01:31:16.881Z","avatar_url":"https://github.com/spring-attic.png","language":"Java","readme":"\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://dataflow.spring.io/\"\u003e\n    \u003cimg alt=\"Spring Data Flow Dashboard\" title=\"Spring Data Flow\" src=\"https://i.imgur.com/hpeKaRk.png\" width=\"450\" /\u003e\n  \u003c/a\u003e\n\u003c/p\u003e\n\n# Spring Cloud Data Flow is no longer 
maintained as an open-source project by Broadcom, Inc.\n\n## For information about extended support or commercial options for Spring Cloud Data Flow, please read the official blog post [here](https://spring.io/blog/2025/04/21/spring-cloud-data-flow-commercial).\n\n\n\n*Spring Cloud Data Flow* is a microservices-based toolkit for building streaming and batch data processing pipelines in\nCloud Foundry and Kubernetes.\n\nData processing pipelines consist of Spring Boot apps, built using the [Spring Cloud Stream](https://github.com/spring-cloud/spring-cloud-stream)\nor [Spring Cloud Task](https://github.com/spring-cloud/spring-cloud-task) microservice frameworks. \n\nThis makes Spring Cloud Data Flow ideal for a range of data processing use cases, from import/export to event streaming\nand predictive analytics.\n\n----\n\n## Components\n\n**Architecture**: The Spring Cloud Data Flow Server is a Spring Boot application that provides RESTful API and REST clients\n(Shell, Dashboard, Java DSL).\nA single Spring Cloud Data Flow installation can support orchestrating the deployment of streams and tasks to Local,\nCloud Foundry, and Kubernetes.\n\nFamiliarize yourself with the Spring Cloud Data Flow [architecture](https://dataflow.spring.io/docs/concepts/architecture/)\nand [feature capabilities](https://dataflow.spring.io/features/).\n\n**Deployer SPI**: A Service Provider Interface (SPI) is defined in the [Spring Cloud Deployer](https://github.com/spring-cloud/spring-cloud-deployer)\nproject. 
The Deployer SPI provides an abstraction layer for deploying the apps for a given streaming or batch data pipeline\nand managing the application lifecycle.\n\nSpring Cloud Deployer Implementations:\n\n* [Local](https://github.com/spring-cloud/spring-cloud-deployer-local)\n* [Cloud Foundry](https://github.com/spring-cloud/spring-cloud-deployer-cloudfoundry)\n* [Kubernetes](https://github.com/spring-cloud/spring-cloud-deployer-kubernetes)\n\n**Domain Model**: The Spring Cloud Data Flow [domain module](https://github.com/spring-cloud/spring-cloud-dataflow/tree/master/spring-cloud-dataflow-core)\nincludes the concept of a *stream* that is a composition of Spring Cloud Stream applications in a linear data pipeline\nfrom a *source* to a *sink*, optionally including *processor* application(s) in between. The domain also includes the\nconcept of a *task*, which may be any process that does not run indefinitely, including [Spring Batch](https://github.com/spring-projects/spring-batch)\njobs.\n\n**Application Registry**: The [App Registry](https://github.com/spring-cloud/spring-cloud-dataflow/tree/master/spring-cloud-dataflow-registry)\nmaintains the metadata of the catalog of reusable applications.\nFor example, if relying on Maven coordinates, an application URI would be of the format:\n`maven://\u003cgroupId\u003e:\u003cartifactId\u003e:\u003cversion\u003e`.\n\n**Shell/CLI**: The [Shell](https://github.com/spring-cloud/spring-cloud-dataflow/tree/master/spring-cloud-dataflow-shell)\nconnects to the Spring Cloud Data Flow Server's REST API and supports a DSL that simplifies the process of defining a\nstream or task and managing its lifecycle.\n\n----\n\n## Building\n\nClone the repo and type \n\n    $ ./mvnw -s .settings.xml clean install \n\nLooking for more information? 
Follow this [link](https://github.com/spring-cloud/spring-cloud-dataflow/blob/master/spring-cloud-dataflow-docs/src/main/asciidoc/appendix-building.adoc).\n\n### Building on Windows\n\nWhen using Git on Windows to check out the project, it is important to handle line-endings correctly during checkouts.\nBy default Git will change the line-endings during checkout to `CRLF`. This is, however, not desired for _Spring Cloud Data Flow_\nas this may lead to test failures under Windows.\n\nTherefore, please ensure that you set Git property `core.autocrlf` to `false`, e.g. using: `$ git config core.autocrlf false`.\nFor more information please refer to the [Git documentation, Formatting and Whitespace](https://git-scm.com/book/en/v2/Customizing-Git-Git-Configuration).\n\n----\n\n## Running Locally w/ Oracle \nBy default, the Dataflow server jar does not include the Oracle database driver dependency.\nIf you want to use Oracle for development/testing when running locally, you can specify the `local-dev-oracle` Maven profile when building.\nThe following command will include the Oracle driver dependency in the jar:\n```\n$ ./mvnw -s .settings.xml clean package -Plocal-dev-oracle\n```\nYou can follow the steps in the [Oracle on Mac ARM64](https://github.com/spring-cloud/spring-cloud-dataflow/wiki/Oracle-on-Mac-ARM64#run-container-in-docker) Wiki to run Oracle XE locally in Docker with Dataflow pointing at it.\n\n\u003e **NOTE:** If you are not running Mac ARM64 just skip the steps related to Homebrew and Colima \n\n----\n\n## Running Locally w/ Microsoft SQL Server\nBy default, the Dataflow server jar does not include the MSSQL database driver dependency.\nIf you want to use MSSQL for development/testing when running locally, you can specify the `local-dev-mssql` Maven profile when building.\nThe following command will include the MSSQL driver dependency in the jar:\n```\n$ ./mvnw -s .settings.xml clean package -Plocal-dev-mssql\n```\nYou can follow the steps in the [MSSQL on 
Mac ARM64](https://github.com/spring-cloud/spring-cloud-dataflow/wiki/MSSQL-on-Mac-ARM64#running-dataflow-locally-against-mssql) Wiki to run MSSQL locally in Docker with Dataflow pointing at it.\n\n\u003e **NOTE:** If you are not running Mac ARM64 just skip the steps related to Homebrew and Colima\n\n----\n\n## Running Locally w/ IBM DB2\nBy default, the Dataflow server jar does not include the DB2 database driver dependency.\nIf you want to use DB2 for development/testing when running locally, you can specify the `local-dev-db2` Maven profile when building.\nThe following command will include the DB2 driver dependency in the jar:\n```\n$ ./mvnw -s .settings.xml clean package -Plocal-dev-db2\n```\nYou can follow the steps in the [DB2 on Mac ARM64](https://github.com/spring-cloud/spring-cloud-dataflow/wiki/DB2-on-Mac-ARM64#running-dataflow-locally-against-db2) Wiki to run DB2 locally in Docker with Dataflow pointing at it.\n\n\u003e **NOTE:** If you are not running Mac ARM64 just skip the steps related to Homebrew and Colima\n\n----\n\n## Contributing\n\nWe welcome contributions! See the [CONTRIBUTING](./CONTRIBUTING.adoc) guide for details.\n\n----\n\n## Code formatting guidelines\n\n* The directory ./src/eclipse has two files for use with code formatting, `eclipse-code-formatter.xml` for the majority of the code formatting rules and `eclipse.importorder` to order the import statements.\n\n* In Eclipse, you import these files by navigating `Windows -\u003e Preferences` and then the menu items `Preferences \u003e Java \u003e Code Style \u003e Formatter` and `Preferences \u003e Java \u003e Code Style \u003e Organize Imports` respectively.\n\n* In `IntelliJ`, install the plugin `Eclipse Code Formatter`.  You can find it by searching the \"Browse Repositories\" under the plugin option within `IntelliJ` (once installed, you will need to restart IntelliJ for it to take effect).\nThen navigate to `IntelliJ IDEA \u003e Preferences` and select the Eclipse Code Formatter.  
Select the `eclipse-code-formatter.xml` file for the field `Eclipse Java Formatter config file` and the file `eclipse.importorder` for the field `Import order`.\nEnable the `Eclipse code formatter` by clicking `Use the Eclipse code formatter`, then click the *OK* button.\n** NOTE: If you configure the `Eclipse Code Formatter` from `File \u003e Other Settings \u003e Default Settings` it will set this policy across all of your IntelliJ projects.\n\n## License\n\nSpring Cloud Data Flow is Open Source software released under the [Apache 2.0 license](https://www.apache.org/licenses/LICENSE-2.0.html).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fspring-attic%2Fspring-cloud-dataflow","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fspring-attic%2Fspring-cloud-dataflow","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fspring-attic%2Fspring-cloud-dataflow/lists"}