dblueai/dblue-stores
Dblue Stores is an abstraction and a collection of clients for interacting with cloud storage services.
- Host: GitHub
- URL: https://github.com/dblueai/dblue-stores
- Owner: dblueai
- License: MIT
- Created: 2019-12-30T13:36:55.000Z (about 5 years ago)
- Default Branch: master
- Last Pushed: 2022-12-08T07:01:56.000Z (about 2 years ago)
- Last Synced: 2024-10-04T09:07:45.082Z (3 months ago)
- Topics: dblue, gcs, s3, sftp
- Language: Python
- Homepage:
- Size: 136 KB
- Stars: 1
- Watchers: 2
- Forks: 1
- Open Issues: 11
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Dblue Stores
Dblue Stores is an abstraction and a collection of clients for interacting with cloud storage services.
## Install
```bash
pip install -U dblue-stores
```

N.B. To keep the library lightweight, this module does not include the cloud storage clients' dependencies by default; install the appropriate extra for each backend you want to use with `dblue-stores`.

### Install S3
```bash
pip install -U dblue-stores[s3]
```

### Install GCS
```bash
pip install -U dblue-stores[gcs]
```

### Install Azure Storage
```bash
pip install -U dblue-stores[azure]
```

### Install SFTP
```bash
pip install -U dblue-stores[sftp]
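# Extras can also be combined in a single install (standard pip extras syntax):
pip install -U "dblue-stores[s3,gcs,azure,sftp]"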
```

## Stores
This module includes clients and a stores abstraction that can be used to interact with AWS S3, Azure Storage, Google Cloud Storage, and SFTP.
## S3
### Normal instantiation
```python
from dblue_stores.stores.s3 import S3Store

s3_store = S3Store(
endpoint_url=...,
access_key=...,
secret_key=...,
session_token=...,
region=...
)
```

### Using env vars
```bash
export AWS_ENDPOINT_URL=...
export AWS_ACCESS_KEY=...
export AWS_SECRET_KEY=...
export AWS_SECURITY_TOKEN=...
export AWS_REGION=...
```

And then you can instantiate the store:
```python
from dblue_stores.stores.s3 import S3Store

# Credentials are picked up from the AWS_* environment variables above
s3_store = S3Store()
```

### Using a client
```python
from dblue_stores.stores.s3 import S3Store

# `client` is a pre-built client instance for the backend, constructed by you
s3_store = S3Store(client=client)
```

### Important methods
```python
s3_store.list(bucket_name, prefix='', delimiter='', page_size=None, max_items=None, keys=True, prefixes=True)
s3_store.list_prefixes(bucket_name, prefix='', delimiter='', page_size=None, max_items=None)
s3_store.list_keys(bucket_name, prefix='', delimiter='', page_size=None, max_items=None)
s3_store.check_key(key, bucket_name=None)
s3_store.get_key(key, bucket_name=None)
s3_store.read_key(key, bucket_name=None)
s3_store.upload_bytes(bytes_data, key, bucket_name=None, overwrite=False, encrypt=False, acl=None)
s3_store.upload_string(string_data, key, bucket_name=None, overwrite=False, encrypt=False, acl=None, encoding='utf-8')
s3_store.upload_file(filename, key, bucket_name=None, overwrite=False, encrypt=False, acl=None, use_basename=True)
s3_store.download_file(key, local_path, bucket_name=None, use_basename=True)
s3_store.upload_dir(dirname, key, bucket_name=None, overwrite=False, encrypt=False, acl=None, use_basename=True)
s3_store.download_dir(key, local_path, bucket_name=None, use_basename=True)
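
# A hypothetical end-to-end sketch: upload a local file, list the prefix,
# then download it back (bucket and key names here are made up):
s3_store.upload_file('/tmp/report.csv', 'reports/report.csv', bucket_name='my-bucket', overwrite=True)
print(s3_store.list_keys('my-bucket', prefix='reports/'))
s3_store.download_file('reports/report.csv', '/tmp/report.csv', bucket_name='my-bucket')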
```

## GCS
### Normal instantiation
```python
from dblue_stores.stores.gcs import GCSStore

gcs_store = GCSStore(
project_id=...,
credentials=...,
key_path=...,
keyfile_dict=...,
scopes=...
)
```

### Using a client
```python
from dblue_stores.stores.gcs import GCSStore

# `client` is a pre-built Google Cloud Storage client instance
gcs_store = GCSStore(client=client)
```

### Important methods
```python
gcs_store.list(key, bucket_name=None, path=None, delimiter='/', blobs=True, prefixes=True)
gcs_store.upload_file(filename, blob, bucket_name=None, use_basename=True)
gcs_store.download_file(blob, local_path, bucket_name=None, use_basename=True)
gcs_store.upload_dir(dirname, blob, bucket_name=None, use_basename=True)
gcs_store.download_dir(blob, local_path, bucket_name=None, use_basename=True)
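
# Hypothetical usage sketch (bucket and blob names are made up):
gcs_store.upload_file('/tmp/report.csv', 'reports/report.csv', bucket_name='my-bucket')
gcs_store.download_file('reports/report.csv', '/tmp/report.csv', bucket_name='my-bucket')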
```

## Azure Storage
### Normal instantiation
```python
from dblue_stores.stores.azure import AzureStore

az_store = AzureStore(
account_name=...,
account_key=...,
connection_string=...
)
```
### Using env vars
```bash
export AZURE_ACCOUNT_NAME=...
export AZURE_ACCOUNT_KEY=...
export AZURE_CONNECTION_STRING=...
```

And then you can instantiate the store:
```python
from dblue_stores.stores.azure import AzureStore

# Credentials are picked up from the AZURE_* environment variables above
az_store = AzureStore()
```

### Using a client
```python
from dblue_stores.stores.azure import AzureStore

# `client` is a pre-built Azure Storage client instance
az_store = AzureStore(client=client)
```

### Important methods
```python
az_store.list(key, container_name=None, path=None, delimiter='/', blobs=True, prefixes=True)
az_store.upload_file(filename, blob, container_name=None, use_basename=True)
az_store.download_file(blob, local_path, container_name=None, use_basename=True)
az_store.upload_dir(dirname, blob, container_name=None, use_basename=True)
az_store.download_dir(blob, local_path, container_name=None, use_basename=True)
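
# Hypothetical usage sketch (container and blob names are made up):
az_store.upload_file('/tmp/report.csv', 'reports/report.csv', container_name='my-container')
az_store.download_file('reports/report.csv', '/tmp/report.csv', container_name='my-container')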
```

## Running tests
```bash
pytest
```

## Publish
### Install twine
```bash
pip install twine
```

### Build distribution
```bash
python setup.py sdist
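# Optionally build a wheel as well (requires the `wheel` package to be installed):
# python setup.py sdist bdist_wheel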
```

### Publish to PyPI
```bash
twine upload dist/*
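# To do a trial upload against TestPyPI first (standard twine option):
# twine upload --repository testpypi dist/*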
```

## Credits
Most of the code is borrowed from https://github.com/polyaxon/polystores.