Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/davidarchanjo/spring-boot-kafka
Sample Spring Boot project demonstrating communication between microservices through events via Kafka.
docker dockercompose java kafka spring-boot
Last synced: 3 days ago
- Host: GitHub
- URL: https://github.com/davidarchanjo/spring-boot-kafka
- Owner: davidarchanjo
- Created: 2021-07-07T08:35:52.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2021-08-22T23:11:03.000Z (about 3 years ago)
- Last Synced: 2024-10-12T12:24:00.885Z (about 1 month ago)
- Topics: docker, dockercompose, java, kafka, spring-boot
- Language: Java
- Homepage:
- Size: 871 KB
- Stars: 1
- Watchers: 1
- Forks: 1
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
README
# Docker Compose + Kafka + Spring Boot
![banner](./assets/banner.jpg)

## Overview
This project aims to be a reference for those just starting to work with Java and Spring Boot who need to build communication between microservices through events via Kafka. To speed up the setup of the development environment, a docker-compose file is provided to provision the necessary infrastructure, i.e. Kafka and Zookeeper. **Be aware that the proposed infrastructure is suitable for development, not for production operation.**
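For orientation, a minimal docker-compose for Kafka and Zookeeper typically looks like the sketch below; the image tags, ports, and settings are illustrative assumptions, and the file shipped with the repository is authoritative:

````yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Advertise on localhost so applications running on the host can connect
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      # Single-broker setup, so internal topics cannot be replicated
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
````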
## Prerequisites
- Maven 3+
- Java 8+
- Docker 19.03+

## Preparing Environment
From the project's folder, execute:
- `docker-compose up` to initialize Kafka and Zookeeper
- `mvn package` to build the applications

## Booting Applications
- Initializing the `producer`
````bash
$ cd producer
$ mvn spring-boot:run
````
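Under the hood, the producer's endpoint presumably resembles the minimal sketch below; the class name, topic name, and payload handling are illustrative assumptions, not necessarily the project's actual code:

````java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical sketch: accepts an order as JSON and forwards it to a Kafka topic.
@RestController
@RequestMapping("/orders")
public class OrderController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping
    public ResponseEntity<Void> create(@RequestBody String order) {
        // "orders" is an assumed topic name
        kafkaTemplate.send("orders", order);
        return ResponseEntity.accepted().build();
    }
}
````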
**Note:** The `producer` will be accepting requests at `http://localhost:8080/orders`

- Initializing the `consumer`
````bash
$ cd consumer
$ mvn spring-boot:run
````
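The consumer side is presumably little more than a listener along these lines; the class, topic, and group names are again assumptions for illustration:

````java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Hypothetical sketch: logs each order event received from the assumed "orders" topic.
@Component
public class OrderListener {

    @KafkaListener(topics = "orders", groupId = "consumer")
    public void onOrder(String order) {
        System.out.println("Order: " + order);
    }
}
````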
**Note:** The `consumer` has no endpoint; it just connects to Kafka to listen to the stream of events.

## Testing
With both `consumer` and `producer` applications up and running, let's test their integration through Kafka:
````bash
$ curl -d '{"idorder": "12345", "customer": "Foo Bar", "value": 54321}' \
  -H "Content-Type: application/json" \
  -X POST http://127.0.0.1:8080/orders
````

If the above request works, we should see a log in the `producer` application's console that looks like the following:
````
2021-07-07 06:23:26.433 INFO 2336 --- [ad | producer-1] b.c.d.producer.config.KafkaConfig : ACK from ProducerListener message: {"idorder": "12345", "customer": "Foo Bar", "value": 54321} offset: 0
````
and something like in the `consumer` application's console:
````
2021-07-07 06:23:26.490 INFO 3996 --- [ntainer#0-0-C-1] b.c.d.consumer.kafka.KafkaConsumer : Order: {"idorder": "12345", "customer": "Foo Bar", "value": 54321}
````

## Cleaning Up
After playing around, we need to clean up. From the project's folder, do:
- Stop the containers:
````bash
docker-compose down
````
- Delete all containers:
````bash
docker rm -f $(docker ps -a -q)
````