Run MySQL and PostgreSQL queries and store the results in CSV
- Host: GitHub
- URL: https://github.com/gabfl/sql2csv
- Owner: gabfl
- License: MIT
- Created: 2018-11-16T14:58:44.000Z (almost 6 years ago)
- Default Branch: main
- Last Pushed: 2024-02-18T18:09:40.000Z (9 months ago)
- Last Synced: 2024-10-14T01:27:02.691Z (25 days ago)
- Topics: csv-converter, mysql, postgresql
- Language: Python
- Size: 54.7 KB
- Stars: 32
- Watchers: 1
- Forks: 9
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# sql2csv
[![Pypi](https://img.shields.io/pypi/v/sql2csv.svg)](https://pypi.org/project/sql2csv)
[![Build Status](https://github.com/gabfl/sql2csv/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/gabfl/sql2csv/actions)
[![codecov](https://codecov.io/gh/gabfl/sql2csv/branch/main/graph/badge.svg)](https://codecov.io/gh/gabfl/sql2csv)
[![MIT licensed](https://img.shields.io/badge/license-MIT-green.svg)](https://raw.githubusercontent.com/gabfl/sql2csv/main/LICENSE)

Run MySQL and PostgreSQL queries and store the results in CSV.
## Why sql2csv
`sql2csv` is a small utility to run MySQL and PostgreSQL queries and store the output in a CSV file.
In some environments, such as MySQL or Aurora on AWS RDS, exporting query results to CSV is not available through the native tools. `sql2csv` is a simple module that provides this feature.
## Installation
```bash
pip3 install sql2csv

# Basic usage
mysql [...] -e "SELECT * FROM table" | sql2csv
# or
psql [...] -c "SELECT * FROM table" | sql2csv
```

## Example
### From stdin
For simple queries you can pipe the result directly from `mysql` or `psql` to `sql2csv`.
For more complex queries, it is recommended to use the CLI (see below) to ensure a properly formatted CSV (for instance when values contain delimiters or line breaks).
```bash
mysql -u root -p"secret" my_db -e "SELECT * FROM some_mysql_table;" | sql2csv

id,some_int,some_str,some_date
1,12,hello world,2018-12-01 12:23:12
2,15,hello,2018-12-05 12:18:12
3,18,world,2018-12-08 12:17:12
```

```bash
psql -U postgres my_db -c "SELECT * FROM some_pg_table" | sql2csv

id,some_int,some_str,some_date
1,12,hello world,2018-12-01 12:23:12
2,15,hello,2018-12-05 12:18:12
3,18,world,2018-12-08 12:17:12
```

### Using `sql2csv` CLI
#### Output to stdout
```bash
$ sql2csv --engine mysql \
--database my_db --user root --password "secret" \
--query "SELECT * FROM some_mysql_table"1,12,hello world,2018-12-01 12:23:12
2,15,hello,2018-12-05 12:18:12
3,18,world,2018-12-08 12:17:12
```

#### Output saved in a file
```bash
$ sql2csv --engine mysql \
--database my_db --user root --password "secret" \
--query "SELECT * FROM some_mysql_table" \
--headers \
--out file --destination_file export.csv

# * Exporting rows...
# ...done
# * The result has been exported to export.csv.

$ cat export.csv
id,some_int,some_str,some_date
1,12,hello world,2018-12-01 12:23:12
2,15,hello,2018-12-05 12:18:12
3,18,world,2018-12-08 12:17:12
```

## Usage
```bash
usage: sql2csv [-h] [-e {mysql,postgresql}] [-H HOST] [-P PORT] -u USER
[-p PASSWORD] -d DATABASE -q QUERY [-o {stdout,file}]
[-f DESTINATION_FILE] [-D DELIMITER] [-Q QUOTECHAR] [-t]

optional arguments:
-h, --help show this help message and exit
-e {mysql,postgresql}, --engine {mysql,postgresql}
Database engine
-H HOST, --host HOST Database host
-P PORT, --port PORT Database port
-u USER, --user USER Database user
-p PASSWORD, --password PASSWORD
Database password
-d DATABASE, --database DATABASE
Database name
-q QUERY, --query QUERY
SQL query
-o {stdout,file}, --out {stdout,file}
CSV destination
-f DESTINATION_FILE, --destination_file DESTINATION_FILE
CSV destination file
-D DELIMITER, --delimiter DELIMITER
CSV delimiter
-Q QUOTECHAR, --quotechar QUOTECHAR
CSV quote character
-t, --headers Include headers
```
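As an illustration of how these options combine, below is a sketch of a PostgreSQL export with headers and a custom delimiter and quote character; the host, credentials, database, and table name are placeholders:

```bash
# Hypothetical example: export a PostgreSQL query to a semicolon-delimited,
# double-quoted CSV file with a header row. Connection details are placeholders.
sql2csv --engine postgresql \
  --host db.example.com --port 5432 \
  --user postgres --password "secret" \
  --database my_db \
  --query "SELECT * FROM some_pg_table" \
  --delimiter ";" --quotechar '"' \
  --headers \
  --out file --destination_file export.csv
```

The short options (`-e`, `-H`, `-P`, `-u`, `-p`, `-d`, `-q`, `-o`, `-f`, `-D`, `-Q`, `-t`) are interchangeable with the long forms shown above.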