Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/enova/scout
An SQS listener for Rails apps and Sidekiq
- Host: GitHub
- URL: https://github.com/enova/scout
- Owner: enova
- License: mit
- Created: 2016-12-05T17:48:05.000Z (about 8 years ago)
- Default Branch: main
- Last Pushed: 2024-09-07T18:04:37.000Z (4 months ago)
- Last Synced: 2024-09-07T19:25:54.354Z (4 months ago)
- Language: Go
- Size: 39.1 KB
- Stars: 14
- Watchers: 22
- Forks: 9
- Open Issues: 3
Metadata Files:
- Readme: README.md
- Changelog: Changelog.md
- License: LICENSE.txt
README
# Scout
![ci-status](https://github.com/enova/scout/workflows/CI/badge.svg)

Scout is a daemon that listens to a set of SNS topics and enqueues anything it finds as Sidekiq jobs. It is meant to extract SQS processing from the Rails apps that increasingly need to do it.

## Usage
```
NAME:
   scout - SQS Listener
          Poll SQS queues specified in a config and enqueue Sidekiq jobs with the queue items.
          It gracefully stops when sent SIGTERM.

USAGE:
   scout [global options] command [command options] [arguments...]

VERSION:
   v1.6.0

COMMANDS:
   help, h  Shows a list of commands or help for one command

GLOBAL OPTIONS:
   --config FILE, -c FILE       Load config from FILE, required
   --freq N, -f N               Poll SQS every N milliseconds (default: 100)
   --log-level value, -l value  Sets log level. Accepts one of: debug, info, warn, error
   --json, -j                   Log in json format
   --help, -h                   show help
   --version, -v                print the version
```

## Configuration
The configuration requires three distinct sets of information: how to connect to Redis to enqueue jobs, credentials to talk to AWS and read from SQS, and a mapping from SNS topics to Sidekiq worker classes in the application. The structure looks like this:

```yaml
redis:
  host: "localhost:9000"
  queue: "background"
  namespace: "test" # optional key
  password: "someoptionalpassword" # optional key
aws:
  access_key: "super"
  secret_key: "secret"
  region: "us-best"
queue:
  name: "myapp_queue"
topics:
  foo-topic: "FooWorker"
  bar-topic: "BazWorker"
```

None of this information is actually an example of anything other than the structure of the file, so if you copy-paste it you'll probably be disappointed.

### Environment Variables
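These knobs are ordinary process environment variables (enumerated below), so they can be exported wherever the daemon is launched. A sketch with illustrative values:

```shell
# Fetch up to 10 messages per receive and long-poll for 20 seconds.
# The variable names are Scout's; the values here are only examples.
export SCOUT_SQS_MAX_NUMBER_OF_MESSAGES=10
export SCOUT_SQS_WAIT_TIME_SECONDS=20

# Then start the daemon as usual, e.g.:
# scout --config config.yml
```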
A few optional settings can also be configured by environment variable:
* `SCOUT_SQS_MAX_NUMBER_OF_MESSAGES` - Max number of SQS messages to fetch at once
* `SCOUT_SQS_WAIT_TIME_SECONDS` - Max seconds to wait for an SQS message per poll
* `SCOUT_SQS_VISIBILITY_TIMEOUT` - How long to hide an SQS message after receiving it

## Versioning
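Pinning with go modules amounts to a `require` directive in your `go.mod`; the version below is the tag shown in the usage output above (adjust to the release you want):

```
require github.com/enova/scout v1.6.0
```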
Scout uses tagged commits that are compatible with go modules. The first module-aware version of Scout is `v1.5.0`. We recommend that you also use go modules to guard against unexpected updates.

For legacy systems not using go modules, you can import via gopkg.in to pin to version 1. The import path is `gopkg.in/enova/scout.v1`.

## Development
Scout uses go modules to manage its dependencies, so you should clone it to a location outside your `GOPATH`. From there, all the standard toolchain commands do what they say on the box.

### Testing
The normal test suite can be run as expected with `go test`. There are also two tagged files with expensive integration tests that require external services. They can be run as follows:

```
[FG-386] scout > go test -run=TestSQS -v -tags=sqsint
=== RUN TestSQS_Init
--- PASS: TestSQS_Init (3.84s)
=== RUN TestSQS_FetchDelete
--- PASS: TestSQS_FetchDelete (3.58s)
PASS
ok github.com/enova/scout 7.422s
[FG-386] scout > go test -run=TestWorker -v -tags=redisint
=== RUN TestWorker_Init
--- PASS: TestWorker_Init (0.00s)
=== RUN TestWorker_Push
--- PASS: TestWorker_Push (0.00s)
PASS
ok github.com/enova/scout 0.013s
```

The tests themselves (found in `sqs_client_test.go` and `worker_client_test.go`) explain what is required to run them. In particular, the SQS integration tests require that you provide AWS credentials.