Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mfkimbell/github-actions-custom-actions
This project utilizes various composite, JavaScript, and Docker actions through GitHub Actions workflows. These test and deploy a web application to S3.
- Host: GitHub
- URL: https://github.com/mfkimbell/github-actions-custom-actions
- Owner: mfkimbell
- Created: 2023-10-27T18:17:55.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2023-12-01T02:45:45.000Z (about 1 year ago)
- Last Synced: 2023-12-01T03:27:04.244Z (about 1 year ago)
- Topics: aws-s3, composite-action, docker-action, javascript-action
- Language: JavaScript
- Homepage:
- Size: 1.25 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
- Readme: README.md
README
# github-actions-custom-actions
### **Tools Used:**
* `Composite Actions` For simple combinations of different steps, with inputs and outputs for dynamic behavior
* `JavaScript Actions` For custom actions that run their logic from a .js file
* `Docker Actions` For custom actions that need scripts run in a non-JavaScript language and for more control over the environment
* `AWS-S3` For static website hosting
### Custom Composite Action:
The first action I created is a composite action. It checks whether caching needs to be performed, then downloads, installs, and caches dependencies with "actions/cache@v3". The install step only runs if we miss the cache or an input explicitly tells the action not to use the cache:
```if: steps.cache.outputs.cache-hit != 'true' || inputs.caching != 'true'```
```echo "cache='${{ inputs.caching }}'" >> $GITHUB_OUTPUT```
This line adds an output that is used later in our "deploy.yml":
```
- name: Load & cache dependencies
  id: cache-deps
  uses: ./.github/actions/cached-deps
  with:
    caching: "false" # disables cache for this step
- name: Output information
  run: echo "Cache used? ${{ steps.cache-deps.outputs.used-cache }}"
```
The install step is skipped only when there is a cache hit and no step has disabled caching with the input `caching: "false"`. By default, if the input is not specified, we attempt to use cached dependencies whenever they are available. The action exposes a "used-cache" output by referencing the "install" step's id; that step output is set by the `echo ... >> $GITHUB_OUTPUT` line shown earlier, and "deploy.yml" prints whether the cache was used in the step's log. The snippet above is where this custom action is used in the main workflow.
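For reference, here is a minimal sketch of what a composite action along these lines can look like (the cache path, key, and install command are assumptions, not copied from the repo):
```
name: 'Get & Cache Dependencies'
description: 'Installs dependencies, optionally restoring them from a cache.'
inputs:
  caching:
    description: 'Whether to cache dependencies or not.'
    required: false
    default: 'true'
outputs:
  used-cache:
    description: 'Whether the cache was used.'
    value: ${{ steps.install.outputs.cache }}
runs:
  using: 'composite'
  steps:
    - name: Cache dependencies
      if: inputs.caching == 'true'
      id: cache
      uses: actions/cache@v3
      with:
        path: node_modules # assumed cache path
        key: deps-node-modules-${{ hashFiles('**/package-lock.json') }}
    - name: Install dependencies
      id: install
      if: steps.cache.outputs.cache-hit != 'true' || inputs.caching != 'true'
      run: |
        npm ci
        echo "cache='${{ inputs.caching }}'" >> $GITHUB_OUTPUT
      shell: bash
```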
Next, we run a test step. It gathers the required files and then uses a testing library invoked through npm to run a test.jsx file, which checks for the presence of a button as well as the button's popup component.
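In "deploy.yml", the test job plausibly looks something like the sketch below; the job name, action versions, and npm script are assumptions:
```
test:
  runs-on: ubuntu-latest
  steps:
    - name: Get code
      uses: actions/checkout@v3
    - name: Load & cache dependencies
      uses: ./.github/actions/cached-deps
    - name: Run tests
      run: npm run test
```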
### Custom JavaScript Action:
For our website deployment, I use a custom JavaScript action. I define input parameters: "bucket" and "dist-folder" are required, while "bucket-region" defaults to 'us-east-1'. Any input marked "required: true" must be passed in from "deploy.yml". These inputs are used during the execution of the "main.js" file.
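A sketch of the action metadata, assuming a Node 16 runtime (the descriptions are illustrative, not taken from the repo):
```
name: 'Deploy website'
description: 'Deploy a static website to AWS S3.'
inputs:
  bucket:
    description: 'The S3 bucket name.'
    required: true
  bucket-region:
    description: 'The region of the S3 bucket.'
    required: false
    default: 'us-east-1'
  dist-folder:
    description: 'The folder containing the deployable files.'
    required: true
runs:
  using: 'node16'
  main: 'main.js'
```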
"main.js" Uses built-in Github commands through the "require()" function passing in Github custom actions "@actions/core" and "@actions/exec". We can use that "core" object to connect to AWS and pass in our inputs, and upload our static webpage to AWS.
We add an "output" to action.yml
```
outputs:
website-url:
description: 'The URL of the deployed website.'
```
and then set that output in "main.js":
```
const websiteUrl = `http://${bucket}.s3-website-${bucketRegion}.amazonaws.com`;
core.setOutput('website-url', websiteUrl); // exposes website-url to later workflow steps
```
and that results in our URL being added to our Actions logs.
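For example, a later step in "deploy.yml" can read that output and print it; here the step id "deploy" refers to the step that calls the custom action (the id is an assumption):
```
- name: Output deployed URL
  run: echo "Live website: ${{ steps.deploy.outputs.website-url }}"
```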
### Custom Docker Action:
The key difference is that this action runs in a Docker image built from a Dockerfile, which lets us execute in a language of our choice; I chose Python. All our Dockerfile does is copy requirements.txt and deployment.py, install the dependencies, and then call the Python script, which performs the exact same work as main.js but in Python.
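In the action metadata, the `runs` block switches from a Node runtime to Docker; roughly, and assuming the inputs stay the same as in the JavaScript action:
```
runs:
  using: 'docker'
  image: 'Dockerfile'
```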
And here's the Python file:
We change the deploy step to use our new Docker action:
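Roughly, the swapped step might look like this; the action path, bucket placeholder, and dist folder are assumptions:
```
- name: Deploy site
  id: deploy
  uses: ./.github/actions/deploy-s3-docker # path assumed
  with:
    bucket: <your-bucket-name>
    dist-folder: ./dist
```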
Finally, we can see the result of our change: the job is now running on a Docker image.
And we get the same result as before.