Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/rafaelsouzaribeiro/golang-broker
golang module broker
apache-kafka broker golang producer-consumer
Last synced: about 2 months ago
- Host: GitHub
- URL: https://github.com/rafaelsouzaribeiro/golang-broker
- Owner: rafaelsouzaribeiro
- Created: 2024-03-15T15:30:57.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2024-05-31T22:32:47.000Z (8 months ago)
- Last Synced: 2024-05-31T23:37:52.307Z (8 months ago)
- Topics: apache-kafka, broker, golang, producer-consumer
- Language: Go
- Homepage:
- Size: 40 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
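Assuming the Go module path matches the repository URL, the library can presumably be added to a project with:
`go get github.com/rafaelsouzaribeiro/golang-broker`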
How to use AWS SNS and SQS?
Install the AWS CLI and run the LocalStack Docker container:
`sudo docker pull localstack/localstack`
`sudo docker container run -it -d -p 4566:4566 localstack/localstack start`

How can I set up the local development environment?
`aws configure`
AWS Access Key ID [None]: fakeAccessKeyId
AWS Secret Access Key [None]: fakeSecretAccessKey
Default region name [us-east-1]: us-east-1
Default output format [None]: json
`aws configure --profile localstack`
AWS Access Key ID [None]: new_profile_name
AWS Secret Access Key [None]: new_profile_password
Default region name [None]: us-east-1
Default output format [None]: json
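With the extra profile configured, later commands can presumably be pointed at it via `--profile localstack`, for example as a quick check that LocalStack is reachable (the profile name is the one created above):

`aws --profile localstack --endpoint-url=http://localhost:4566 sns list-topics`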
How to create a topic and a queue?
Create topic:
`aws --endpoint-url=http://localhost:4566 sns create-topic --name my-topic`
Create queue:
`aws --endpoint-url=http://localhost:4566 sqs create-queue --queue-name my-queue --region us-east-1`
Get the QueueArn:
`aws --endpoint-url=http://localhost:4566 sqs get-queue-attributes --queue-url http://localhost:4566/000000000000/my-queue --attribute-names QueueArn --region us-east-1`
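The subscribe command below expects the topic and queue ARNs in `$TOPIC_ARN` and `$QUEUE_ARN`. One way to capture them is with the CLI's `--query` flag (a sketch; the account ID shown is LocalStack's default):

`TOPIC_ARN=$(aws --endpoint-url=http://localhost:4566 sns create-topic --name my-topic --query 'TopicArn' --output text)`

`QUEUE_ARN=$(aws --endpoint-url=http://localhost:4566 sqs get-queue-attributes --queue-url http://localhost:4566/000000000000/my-queue --attribute-names QueueArn --query 'Attributes.QueueArn' --output text --region us-east-1)`

Re-running `create-topic` with the same name is idempotent and simply returns the ARN of the existing topic.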
Subscribe the SNS topic to the SQS queue endpoint:
`aws --endpoint-url=http://localhost:4566 sns subscribe --topic-arn $TOPIC_ARN --protocol sqs --notification-endpoint $QUEUE_ARN --region us-east-1`

To use SQS and SNS, follow the code below:
SQS:
```go
configs := payload.SNSSQSMessage{
    Endpoint: aws.String("http://localhost:4566"),
    Region:   aws.String("us-east-1"),
    QueueURL: "http://localhost:4566/000000000000/my-queue",
}

messageChan := make(chan payload.SNSSQSMessage)
factory := factory.ISQSSNSBroker(&configs)

// Receive messages from the SQS queue in a separate goroutine.
go factory.Receive(messageChan)

for message := range messageChan {
    fmt.Printf("Received message: %s Message Id: %s Topic: %s Time: %s\n",
        message.Message, message.MessageId, message.TopicArn, message.Timestamp)
}
```
SNS:
```go
configs := payload.SNSSQSMessage{
    Endpoint: aws.String("http://localhost:4566"),
    Region:   aws.String("us-east-1"),
    Message:  "Message Test",
    TopicArn: "arn:aws:sns:us-east-1:000000000000:my-topic",
}

var wg sync.WaitGroup
wg.Add(1)

factory := factory.ISQSSNSBroker(&configs)

// Publish the message to the SNS topic and wait for the send to finish.
go func() {
    factory.Send()
    wg.Done()
}()

wg.Wait()
```
To use Apache Kafka, follow the code below:
Consumer:
```go
func main() {
    data := payload.Message{
        Topics:    &[]string{"contact-adm-insert", "testar"},
        Topic:     "contact-adm-insert",
        GroupID:   "contacts",
        Partition: 0,
        Offset:    -1,
    }

    canal := make(chan payload.Message)
    broker := factory.NewBroker(factory.Kafka, "springboot:9092")

    // Consume the configured topics and listen on the chosen partition.
    go broker.Consumer(&data, canal)
    go broker.ListenPartition(&data, canal)

    for msgs := range canal {
        printMessage(&msgs)
    }
}

func printMessage(msgs *payload.Message) {
    fmt.Printf("topic: %s, Message: %s, Partition: %d, Key: %s, time: %s\n",
        msgs.Topic, msgs.Value, msgs.Partition, msgs.Key, msgs.Time.Format("2006-01-02 15:04:05"))
    println("Headers:")
    for _, header := range *msgs.Headers {
        fmt.Printf("Key: %s, Value: %s\n", header.Key, header.Value)
    }
}
```
Producer:
```go
func main() {
    var wg sync.WaitGroup
    wg.Add(1)

    go func() {
        Producer()
        wg.Done()
    }()

    wg.Wait()
}

func Producer() {
    message := payload.Message{
        Value: []byte("Testar"),
        Topic: "contact-adm-insert",
        Headers: &[]payload.Header{
            {
                Key:   "your-header-key1",
                Value: "your-header-value1",
            },
            {
                Key:   "your-header-key2",
                Value: "your-header-value2",
            },
        },
    }

    pro := factory.NewBroker(factory.Kafka, "springboot:9092")
    pro.SendMessage(&message)
}
```
To use Redis, follow the code below:
Consumer:
```go
func main() {
    data := payload.Message{
        Topics: &[]string{"contact-adm-insert", "testar"},
    }

    canal := make(chan payload.Message)
    broker := factory.NewBroker(factory.Redis, "springboot:6379")

    // Consume messages from the configured topics.
    go broker.Consumer(&data, canal)

    for msgs := range canal {
        printMessage(&msgs)
    }
}

func printMessage(msgs *payload.Message) {
    fmt.Printf("topic: %s, Message: %s\n", msgs.Topic, msgs.Value)
    println("Headers:")
    for _, header := range *msgs.Headers {
        fmt.Printf("Key: %s, Value: %s\n", header.Key, header.Value)
    }
}
```
Producer:
```go
func main() {
    var wg sync.WaitGroup
    wg.Add(1)

    go func() {
        Producer()
        wg.Done()
    }()

    wg.Wait()
}

func Producer() {
    message := payload.Message{
        Value: []byte("testar"),
        Topic: "contact-adm-insert",
        Headers: &[]payload.Header{
            {
                Key:   "your-header-key1",
                Value: "your-header-value1",
            },
            {
                Key:   "your-header-key2",
                Value: "your-header-value2",
            },
        },
    }

    pro := factory.NewBroker(factory.Redis, "springboot:6379")
    pro.SendMessage(&message)
}
```