https://github.com/lauraschlimmer/logpusher
Command-line tool to constantly import any kind of logs into a database
- Host: GitHub
- URL: https://github.com/lauraschlimmer/logpusher
- Owner: lauraschlimmer
- Created: 2016-10-04T15:20:52.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2017-11-30T13:37:21.000Z (over 7 years ago)
- Last Synced: 2025-01-29T16:12:47.139Z (4 months ago)
- Topics: eventql, logfile, logfile-import, logfile-upload, mongodb, postgresql, sqlite
- Language: Ruby
- Homepage:
- Size: 33.2 KB
- Stars: 2
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# logpusher
A command-line tool to import logfiles into a database. New loglines appended to the file are imported continuously.

To specify which parts of each logline should be extracted and imported into the target table, simply define a regex with named capturing groups. Each group in the regex represents a column in the target table.

### Logfile and REGEX
```
1475597897 lnd09 /GET / 80
1475597905 lnd09 /GET /img/home.png 80
1475597936 lnd07 /POST /account/new 80
1475597953 lnd03 /GET /about 80
```

To store the information of the example logfile in the columns `time`, `server_name`, `http_method` and `path`, the regex could be defined as:
$ regex="(?
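The regex definition above is cut off in this copy of the README (the `<` of the named groups was likely eaten by HTML rendering), so the author's exact pattern is unknown. As a hedged reconstruction only, a regex with one named capturing group per target column that matches the sample loglines above could look like this — the field patterns are assumptions, not the original:

```ruby
# Hypothetical reconstruction of the truncated regex: one named
# capturing group per target column. The field patterns are guesses
# based on the sample loglines, not the README's original pattern.
LOG_REGEX = /(?<time>\d+)\s+(?<server_name>\S+)\s+\/(?<http_method>\w+)\s+(?<path>\S+)/

line = "1475597905 lnd09 /GET /img/home.png 80"
if (m = LOG_REGEX.match(line))
  # Each named group becomes a column value in the target table.
  row = m.names.map { |name| [name, m[name]] }.to_h
  p row  # {"time"=>"1475597905", "server_name"=>"lnd09", "http_method"=>"GET", "path"=>"/img/home.png"}
end
```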
### Usage
```
$ logpusher [OPTIONS]
    -f, --file         Set the path of the logfile to import
    -r, --regex        Set the regex
    -s, --storage      Set the storage engine
    -c, --connections  Set the number of concurrent connections
        --batch_size   Set the batch size
    -d, --database     Select a database
    -t, --table        Select a destination table
    -h, --host         Set the hostname of the storage engine
    -p, --port         Set the port of the storage engine
    -u, --user         Set the username of the storage engine
    -q, --quiet        Run quietly
    -?, --help         Display this help text and exit
```

Note that some options may vary depending on the storage engine/database system being used.
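The `--batch_size` option implies that parsed rows are buffered and written to the database in groups rather than one row at a time. A minimal sketch of that batching idea in Ruby — a hypothetical helper, not logpusher's actual code — could look like:

```ruby
# Minimal batching sketch: buffer rows and flush them in groups,
# as the --batch_size option suggests. The writer callback stands
# in for a real database insert; this is not logpusher's own code.
class BatchWriter
  def initialize(batch_size, &writer)
    @batch_size = batch_size
    @writer = writer
    @buffer = []
  end

  def push(row)
    @buffer << row
    flush if @buffer.size >= @batch_size
  end

  def flush
    return if @buffer.empty?
    @writer.call(@buffer)  # e.g. one multi-row INSERT per batch
    @buffer = []
  end
end

batches = []
w = BatchWriter.new(2) { |rows| batches << rows }
[{ "time" => "1" }, { "time" => "2" }, { "time" => "3" }].each { |r| w.push(r) }
w.flush          # write any remaining rows
p batches.size   # 2 (a full batch of two rows, then a final batch of one)
```

A real implementation would pair this with the `--connections` option, handing full batches to a pool of database connections.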
### Database support
#### SQLite
Example: Import `time` and `server_name` from the logfile into the table `access_logs`:

```
$ logpusher -s sqlite -f logs.access_logs -r $regex -d "dev.db" -t access_logs
```
#### MongoDB
Example: Connect to MongoDB on 127.0.0.1:27017 and import the logfile into the collection `access_logs`:

```
$ logpusher -s mongo -f logs.access_logs -r $regex -d dev -t access_logs -h localhost -p 27017
```
#### EventQL
Example: Connect to the EventQL server on localhost:10001 and import the logfile into the table `access_logs`:

```
$ logpusher -s eventql -f logs.access_logs -r $regex -d dev -t access_logs -h localhost -p 10001
```
#### PostgreSQL
Example: Connect to the PostgreSQL server on localhost:5432 as user dev and import the logfile into the table `access_logs`:

```
$ logpusher -s postgresql -f logs.access_logs -r $regex -d dev -t access_logs -h localhost -p 5432 -u dev
```
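Stepping back to the tool's core behavior: "new loglines in the file are constantly added" implies a tail-style read loop that remembers its offset and only consumes lines appended since the last pass. A minimal sketch of that idea in Ruby — an illustration under assumptions, not logpusher's implementation:

```ruby
# Tail-style read: resume from a saved byte offset and return only
# the lines appended since the last pass. A sketch of the
# "constantly import" behavior, not logpusher's own code.
def read_new_lines(path, offset)
  lines = []
  File.open(path, "r") do |f|
    f.seek(offset)
    f.each_line { |line| lines << line.chomp }
    offset = f.pos  # remember where we stopped for the next pass
  end
  [lines, offset]
end

# Usage sketch: poll the file, feeding each new line through the
# regex and into the database writer.
#
#   offset = 0
#   loop do
#     lines, offset = read_new_lines("logs.access_logs", offset)
#     lines.each { |line| import(line) }  # import is hypothetical
#     sleep 1
#   end
```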