https://github.com/migangqui/cloud-storage-spring-api

Manage files from AWS S3, Google Cloud Storage, Azure Blob Storage and Dropbox in Spring

amazon aws aws-s3 azure azure-storage-blob cloud-storage dropbox google-cloud google-cloud-storage java kotlin s3 s3-bucket spring


# Cloud Storage Spring (Java/Kotlin)

![GitHub last commit](https://img.shields.io/github/last-commit/migangqui/cloud-storage-spring-api?style=for-the-badge)
![Maven Central](https://img.shields.io/maven-central/v/com.github.sevtech-dev/cloud-storage-spring-java?style=for-the-badge)

This project provides a Java and a Kotlin API to manage files on AWS S3, Google Cloud Storage, Azure Blob Storage and Dropbox within the Spring framework.

Supported providers:

* AWS S3
* Google Cloud Storage
* Azure Blob Storage
* Dropbox

To use it, follow these steps:

### Add the dependency to Maven or Gradle

If you use Java:

```xml
<dependency>
    <groupId>com.github.sevtech-dev</groupId>
    <artifactId>cloud-storage-spring-java</artifactId>
    <version>${currentVersion}</version>
</dependency>
```
```groovy
implementation 'com.github.sevtech-dev:cloud-storage-spring-java:${currentVersion}'
```

If you use Kotlin:

```xml
<dependency>
    <groupId>com.github.sevtech-dev</groupId>
    <artifactId>cloud-storage-spring-kotlin</artifactId>
    <version>${currentVersion}</version>
</dependency>
```
```groovy
implementation 'com.github.sevtech-dev:cloud-storage-spring-kotlin:${currentVersion}'
```

```${currentVersion}``` is currently ```1.1.2```.

## Configuration

To enable the desired client, set the following properties:

For AWS S3:
```yaml
aws:
  s3:
    enabled: true
    accessKey: [AMAZON_ACCESS_KEY]
    secretKey: [AMAZON_SECRET_KEY]
    bucket:
      name: yourbucketname
    region: eu-west-1
```
Use only one region, written as the plain region string. Valid values are: `us-gov-west-1`, `us-east-1`, `us-west-1`, `us-west-2`, `eu-west-1`, `eu-central-1`, `ap-south-1`, `ap-southeast-1`, `ap-southeast-2`, `ap-northeast-1`, `ap-northeast-2`, `sa-east-1`, `cn-north-1`.

* **accessKey/secretKey**: the access key and secret key used to connect to AWS; see the AWS IAM documentation on access keys to obtain them.
* **bucket.name**: the bucket name.
* **region**: AWS region where your bucket is located.

For Google Cloud Storage:
```yaml
gcp:
  storage:
    enabled: true
    bucket:
      name: yourbucketname
    keyfile: "path-to-your-keyfile"
```
* **bucket.name**: the bucket name.
* **keyfile**: the key file used to authenticate. To generate one, see the Google Cloud documentation on service account keys.

For Azure Blob Storage:
```yaml
azure:
  blob:
    storage:
      enabled: true
      connectionString: "your-connection-string"
      container:
        name: containername
```
* **connectionString**: the storage account connection string; you can copy it from the Azure portal.
* **container.name**: name of the container that holds your files.

For Dropbox:
```yaml
dropbox:
  enabled: true
  accessToken: "accessToken"
  clientIdentifier: "clientIdentifier"
```
* **accessToken**: the OAuth access token; you can generate it from the Dropbox App Console.
* **clientIdentifier**: name of your app.

## Enable async

Add the ```@EnableAsync``` annotation to your Spring application class to enable the async upload method.
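
For example, a standard Spring Boot application class with async enabled might look like this (the class name is illustrative):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableAsync;

// @EnableAsync lets Spring execute @Async methods, such as uploadFileAsync, on a background thread pool
@SpringBootApplication
@EnableAsync
public class CloudStorageApplication {

    public static void main(String[] args) {
        SpringApplication.run(CloudStorageApplication.class, args);
    }
}
```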

## File size

To control the maximum size of the files you can upload, set the following properties:
```yaml
spring:
  servlet:
    multipart:
      max-file-size: 128KB
      max-request-size: 128KB
```

## How to use

Inject ```StorageService``` as a dependency in your Spring component.

If you use more than one provider, you must refer to the beans by name: `awsS3Service` for AWS S3, `googleCloudStorageService` for Google Cloud Storage, `azureBlobStorageService` for Azure Blob Storage, and `dropboxService` for Dropbox, as shown in the sketch below.
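
For example, a component using two providers might look like this (a minimal sketch; the component name is illustrative, and the qualifier values are the bean names listed above):

```java
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;

@Service
public class FileFacade {

    private final StorageService awsS3Service;
    private final StorageService dropboxService;

    // Constructor injection; @Qualifier picks the provider-specific StorageService bean
    public FileFacade(@Qualifier("awsS3Service") StorageService awsS3Service,
                      @Qualifier("dropboxService") StorageService dropboxService) {
        this.awsS3Service = awsS3Service;
        this.dropboxService = dropboxService;
    }
}
```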

The service provides these methods:

##### Java
```java
public interface StorageService {

    UploadFileResponse uploadFile(UploadFileRequest request);

    Future<UploadFileResponse> uploadFileAsync(UploadFileRequest request);

    GetFileResponse getFile(GetFileRequest request);

    DeleteFileResponse deleteFile(DeleteFileRequest request);
}
```
##### Kotlin
```kotlin
interface StorageService {

    fun uploadFile(request: UploadFileRequest): UploadFileResponse

    fun uploadFileAsync(request: UploadFileRequest): Future<UploadFileResponse>

    fun getFile(request: GetFileRequest): GetFileResponse

    fun deleteFile(request: DeleteFileRequest): DeleteFileResponse
}
```

### Model

#### Upload

*UploadFileRequest*

* **stream (InputStream)**: content of your file.
* **folder (String)**: folder where you want to save the file. Ex: folder/subfolder1/subfolder2
* **name (String)**: name of the uploaded file. Ex: image.jpg, image
* **contentType (String)**: content type (MIME type) of the file.
* **bucketName (String, optional)**
* **accessControl**

*UploadFileResponse*

* **fileName (String)**: final name of the uploaded file.
* **status (int)**: status of the operation: 200 if it succeeded, 500 if it failed.
* **cause (String)**: cause of the failure, if any.
* **exception (Exception)**: the exception thrown, if any.
* **comment (String)**: optional comment.

#### Get

*GetFileRequest*

* **path (String)**: complete path where you want to get the file from. Ex: folder/subfolder1/subfolder2/file.jpg
* **bucketName (String, optional)**

*GetFileResponse*

* **stream (InputStream)**: content of your file.
* **status (int)**: status of the operation: 200 if it succeeded, 500 if it failed.
* **cause (String)**: cause of the failure, if any.
* **exception (Exception)**: the exception thrown, if any.

#### Delete

*DeleteFileRequest*

* **path (String)**: complete path of the file you want to delete. Ex: folder/subfolder1/subfolder2/file.jpg
* **bucketName (String, optional)**

*DeleteFileResponse*

* **result (boolean)**: result of the deletion: `true` if it succeeded, `false` otherwise.
* **status (int)**: status of the operation: 200 if it succeeded, 500 if it failed.
* **cause (String)**: cause of the failure, if any.
* **exception (Exception)**: the exception thrown, if any.
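
As an end-to-end illustration, an upload/get/delete flow could look like the sketch below. The builder calls and getters are assumptions made for readability (they mirror the fields documented above); check the actual request and response classes for the exact constructors or accessors.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class StorageExample {

    // Hypothetical flow: builder() and the getters used here are assumed, not confirmed by this README
    public void uploadGetDelete(StorageService storageService) {
        InputStream stream = new ByteArrayInputStream("hello".getBytes());

        // Upload a small text file into folder/subfolder1
        UploadFileResponse uploaded = storageService.uploadFile(UploadFileRequest.builder()
                .stream(stream)
                .folder("folder/subfolder1")
                .name("hello.txt")
                .contentType("text/plain")
                .build());

        if (uploaded.getStatus() == 200) {
            // Read the file back using its complete path
            GetFileResponse file = storageService.getFile(GetFileRequest.builder()
                    .path("folder/subfolder1/" + uploaded.getFileName())
                    .build());

            // Delete it afterwards
            DeleteFileResponse deleted = storageService.deleteFile(DeleteFileRequest.builder()
                    .path("folder/subfolder1/" + uploaded.getFileName())
                    .build());
        }
    }
}
```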

## Test in local

### AWS S3: Localstack support

This library can be tested with Localstack. You only have to set the following properties in your application.yml:

```yaml
aws:
  s3:
    localstack:
      enabled: true
      endpoint: http://localhost:4572
      region: us-east-1
```

To run Localstack easily, I have added a ```docker-compose.yml``` file to the ```localstack``` folder.
Run ```docker-compose up``` to start it.
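
The file in the repository is the reference; for orientation, a minimal S3-only Localstack compose file (image tag and environment are assumptions, not copied from the repository) looks roughly like this:

```yaml
version: "3"
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4572:4572"   # legacy S3 port, matching the endpoint configured above
    environment:
      - SERVICES=s3
```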

I highly recommend installing the AWS CLI locally; it helps you manage the buckets when running tests with Localstack.
See the official AWS documentation for installing version 2 of the CLI.

To create a local bucket you must run this command `aws2 --endpoint-url=http://localhost:4572 s3 mb s3://mytestbucket`

To check out if the bucket has been created run this command `aws2 --endpoint-url=http://localhost:4572 s3 ls`

When you create a bucket, add `yourbucketname.localhost` to your local hosts file, mapped to `127.0.0.1`.

The full list of AWS CLI S3 command options is available in the AWS CLI reference documentation.

## Planned features
* Support for Alibaba Cloud Object Storage Service, Oracle Object Storage, Google Drive...
* File permissions management.

## License

This project is licensed under the MIT License - see the LICENSE.md file for details.