This repository was archived by the owner on May 14, 2025. It is now read-only.

A microservices-based toolkit for streaming and batch data processing in Cloud Foundry and Kubernetes


spring-attic/spring-cloud-dataflow


Spring Data Flow Dashboard

Spring Cloud Data Flow is no longer maintained as an open-source project by Broadcom, Inc.

For information about extended support or commercial options for Spring Cloud Data Flow, please read the official blog post here.

Spring Cloud Data Flow is a microservices-based toolkit for building streaming and batch data processing pipelines in Cloud Foundry and Kubernetes.

Data processing pipelines consist of Spring Boot apps, built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks.

This makes Spring Cloud Data Flow ideal for a range of data processing use cases, from import/export to event streaming and predictive analytics.


Components

Architecture: The Spring Cloud Data Flow Server is a Spring Boot application that provides a RESTful API and REST clients (Shell, Dashboard, Java DSL). A single Spring Cloud Data Flow installation can support orchestrating the deployment of streams and tasks to Local, Cloud Foundry, and Kubernetes.
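As an illustrative session fragment (assuming a server running locally on its default port 9393; /about is the server's version and info endpoint):

```
$ curl http://localhost:9393/about
```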

Familiarize yourself with the Spring Cloud Data Flow architecture and feature capabilities.

Deployer SPI: A Service Provider Interface (SPI) is defined in the Spring Cloud Deployer project. The Deployer SPI provides an abstraction layer for deploying the apps for a given streaming or batch data pipeline and managing the application lifecycle.

Spring Cloud Deployer Implementations: implementations of the SPI exist for the Local, Cloud Foundry, and Kubernetes platforms.

Domain Model: The Spring Cloud Data Flow domain module includes the concept of a stream, which is a composition of Spring Cloud Stream applications in a linear data pipeline from a source to a sink, optionally including processor application(s) in between. The domain also includes the concept of a task, which may be any process that does not run indefinitely, including Spring Batch jobs.

Application Registry: The App Registry maintains the metadata of the catalog of reusable applications. For example, if relying on Maven coordinates, an application URI would be of the format: maven://&lt;groupId&gt;:&lt;artifactId&gt;:&lt;version&gt;.
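For example, a Maven-coordinate URI can be assembled as follows (the group, artifact, and version shown are hypothetical placeholders, not a published app):

```shell
# Assemble a maven:// application URI from hypothetical coordinates.
GROUP_ID="org.example.stream.app"
ARTIFACT_ID="http-source-rabbit"
VERSION="1.0.0"
APP_URI="maven://${GROUP_ID}:${ARTIFACT_ID}:${VERSION}"
echo "${APP_URI}"
```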

Shell/CLI: The Shell connects to the Spring Cloud Data Flow Server's REST API and supports a DSL that simplifies the process of defining a stream or task and managing its lifecycle.
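As a sketch of the Shell DSL (the stream and task names are hypothetical; http, log, and timestamp refer to pre-built applications that must already be registered with the App Registry):

```
dataflow:>stream create --name httptest --definition "http | log" --deploy
dataflow:>task create --name mytask --definition "timestamp"
```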


Building

Clone the repo and type

$ ./mvnw -s .settings.xml clean install

Looking for more information? Follow this link.

Building on Windows

When using Git on Windows to check out the project, it is important to handle line endings correctly during checkouts. By default, Git will change the line endings during checkout to CRLF. This is, however, not desired for Spring Cloud Data Flow, as it may lead to test failures on Windows.

Therefore, please ensure that you set the Git property core.autocrlf to false, e.g. using: $ git config core.autocrlf false. For more information, please refer to the Git documentation, Formatting and Whitespace.
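A minimal demonstration of the setting, run in a throwaway repository (in practice, run the config command inside your Spring Cloud Data Flow clone; assumes git is on the PATH):

```shell
# Create a scratch repository and set core.autocrlf locally for it.
tmp="$(mktemp -d)"
cd "$tmp"
git init -q demo
cd demo
git config core.autocrlf false
setting="$(git config core.autocrlf)"
echo "${setting}"
```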


Running Locally w/ Oracle

By default, the Dataflow server jar does not include the Oracle database driver dependency. If you want to use Oracle for development/testing when running locally, you can specify the local-dev-oracle Maven profile when building. The following command will include the Oracle driver dependency in the jar:

$ ./mvnw -s .settings.xml clean package -Plocal-dev-oracle

You can follow the steps in the Oracle on Mac ARM64 Wiki to run Oracle XE locally in Docker with Dataflow pointing at it.

NOTE: If you are not running Mac ARM64, skip the steps related to Homebrew and Colima.
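After building with the profile, the server still has to be pointed at the database. A minimal sketch of the relevant settings, using standard Spring Boot datasource properties (the host, service name, and credentials below are hypothetical):

```
spring.datasource.url=jdbc:oracle:thin:@//localhost:1521/XEPDB1
spring.datasource.username=dataflow
spring.datasource.password=changeme
spring.datasource.driver-class-name=oracle.jdbc.OracleDriver
```

The same pattern applies when building with the local-dev-mssql or local-dev-db2 profiles described below, substituting the corresponding JDBC URL and driver class.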


Running Locally w/ Microsoft SQL Server

By default, the Dataflow server jar does not include the MSSQL database driver dependency. If you want to use MSSQL for development/testing when running locally, you can specify the local-dev-mssql Maven profile when building. The following command will include the MSSQL driver dependency in the jar:

$ ./mvnw -s .settings.xml clean package -Plocal-dev-mssql

You can follow the steps in the MSSQL on Mac ARM64 Wiki to run MSSQL locally in Docker with Dataflow pointing at it.

NOTE: If you are not running Mac ARM64, skip the steps related to Homebrew and Colima.


Running Locally w/ IBM DB2

By default, the Dataflow server jar does not include the DB2 database driver dependency. If you want to use DB2 for development/testing when running locally, you can specify the local-dev-db2 Maven profile when building. The following command will include the DB2 driver dependency in the jar:

$ ./mvnw -s .settings.xml clean package -Plocal-dev-db2

You can follow the steps in the DB2 on Mac ARM64 Wiki to run DB2 locally in Docker with Dataflow pointing at it.

NOTE: If you are not running Mac ARM64, skip the steps related to Homebrew and Colima.


Contributing

We welcome contributions! See the CONTRIBUTING guide for details.


Code formatting guidelines

  • The directory ./src/eclipse has two files for use with code formatting: eclipse-code-formatter.xml for the majority of the code formatting rules and eclipse.importorder to order the import statements.

  • In Eclipse, you import these files by navigating Window -> Preferences and then the menu items Preferences > Java > Code Style > Formatter and Preferences > Java > Code Style > Organize Imports, respectively.

  • In IntelliJ, install the plugin Eclipse Code Formatter. You can find it by searching "Browse Repositories" under the plugin option within IntelliJ (once installed, you will need to restart IntelliJ for it to take effect). Then navigate to IntelliJ IDEA > Preferences and select the Eclipse Code Formatter. Select the eclipse-code-formatter.xml file for the field Eclipse Java Formatter config file and the file eclipse.importorder for the field Import order. Enable the Eclipse code formatter by clicking Use the Eclipse code formatter, then click the OK button. NOTE: If you configure the Eclipse Code Formatter from File > Other Settings > Default Settings, it will set this policy across all of your IntelliJ projects.

License

Spring Cloud Data Flow is Open Source software released under the Apache 2.0 license.
