# Genomics Workflows on AWS
:warning: This site and related code are no longer actively maintained as of 2023-07-31. :warning:
All code and assets presented here remain publicly available for historical reference purposes only.
For more up-to-date solutions for running genomics workflows on AWS, check out:
- [Amazon Omics](https://aws.amazon.com/omics/) - a fully managed service for storing, processing, and querying genomic, transcriptomic, and other omics data into insights. [Omics Workflows](https://docs.aws.amazon.com/omics/latest/dev/workflows.html) provides fully managed execution of pre-packaged [Ready2Run](https://docs.aws.amazon.com/omics/latest/dev/service-workflows.html) workflows or private workflows you create using WDL or Nextflow.
- [Amazon Genomics CLI](https://aws.amazon.com/genomics-cli/) - an open source tool that automates deploying and running workflow engines in AWS. AGC uses the same architectural patterns described here (i.e. operating workflow engines with AWS Batch). It provides support for running WDL, Nextflow, Snakemake, and CWL based workflows.

---
This repository is the source code for [Genomics Workflows on AWS](). It contains markdown documents that are used to build the site as well as source code (CloudFormation templates, scripts, etc) that can be used to deploy AWS infrastructure for running genomics workflows.
If you want to get the latest version of these solutions up and running quickly, it is recommended that you deploy stacks using the launch buttons available via the [hosted guide]().
If you want to customize these solutions, you can create your own distribution using the instructions below.
## Creating your own distribution
Clone the repo
```bash
git clone https://github.com/aws-samples/aws-genomics-workflows.git
```

Create an S3 bucket in your AWS account to use for the distribution deployment:
```bash
aws s3 mb s3://<dist-bucket-name>
```

Create and deploy a distribution from source:
```bash
cd aws-genomics-workflows
bash _scripts/deploy.sh --deploy-region <region> --asset-profile <profile> --asset-bucket s3://<dist-bucket-name> test
```

This will create a `dist` folder in the root of the project with subfolders `dist/artifacts` and `dist/templates` that will be uploaded to the S3 bucket you created above.
Use the `--asset-profile` option to specify an AWS profile to use for the deployment.
**Note**: the region set for `--deploy-region` should match the region the `<dist-bucket-name>` bucket was created in.
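One way to sanity-check that requirement is to compare the bucket's region with your intended deploy region before running the script. A minimal sketch with hypothetical values (in practice `BUCKET_REGION` would come from `aws s3api get-bucket-location`, which reports `None` for buckets in us-east-1):

```bash
# Hypothetical values; BUCKET_REGION would normally come from:
#   aws s3api get-bucket-location --bucket <dist-bucket-name> --query LocationConstraint --output text
DEPLOY_REGION=us-east-1
BUCKET_REGION=us-east-1

if [ "$DEPLOY_REGION" = "$BUCKET_REGION" ]; then
    echo "regions match"
else
    echo "regions differ: deploy=$DEPLOY_REGION bucket=$BUCKET_REGION" >&2
    exit 1
fi
```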
You can now use your deployed distribution to launch stacks using the AWS CLI. For example, to launch the GWFCore stack:
```bash
TEMPLATE_ROOT_URL=https://<dist-bucket-name>.s3-<region>.amazonaws.com/test/templates

aws cloudformation create-stack \
    --region <region> \
    --stack-name <stack-name> \
    --template-url $TEMPLATE_ROOT_URL/gwfcore/gwfcore-root.template.yaml \
    --capabilities CAPABILITY_IAM CAPABILITY_AUTO_EXPAND \
    --parameters \
        ParameterKey=VpcId,ParameterValue=<vpc-id> \
        ParameterKey=SubnetIds,ParameterValue=\"<subnet-id-1>,<subnet-id-2>,...\" \
        ParameterKey=ArtifactBucketName,ParameterValue=<dist-bucket-name> \
        ParameterKey=TemplateRootUrl,ParameterValue=$TEMPLATE_ROOT_URL \
        ParameterKey=S3BucketName,ParameterValue=<gwfcore-bucket-name> \
        ParameterKey=ExistingBucket,ParameterValue=false
```
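To make the URL mechanics concrete, here is a small sketch showing how the root template URL is assembled; the bucket name and region below are hypothetical, so substitute your own:

```bash
# Hypothetical values -- substitute your own distribution bucket and region
DIST_BUCKET=my-genomics-dist
REGION=us-east-1

TEMPLATE_ROOT_URL="https://${DIST_BUCKET}.s3-${REGION}.amazonaws.com/test/templates"

# Full URL of the GWFCore root template passed to --template-url
echo "${TEMPLATE_ROOT_URL}/gwfcore/gwfcore-root.template.yaml"
```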
## Shared File System Support
Amazon EFS is supported out of the box for `GWFCore` and `Nextflow`. You have two options to use EFS.
1. **Create a new EFS File System:** Be sure to have `CreateEFS` set to `Yes` and also include the total number of subnets.
2. **Use an Existing EFS File System:** Be sure to specify the EFS ID in the `ExistingEFS` parameter. This file system should be accessible from every subnet you specify.

Following successful deployment of `GWFCore`, when creating your Nextflow resources, set `MountEFS` to `Yes`.
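As a sketch, the two options translate into `create-stack` parameter fragments along these lines. `CreateEFS` and `ExistingEFS` are named above, but `NumberOfSubnets` and the file system ID are assumptions for illustration, so check the template for the exact parameter names:

```bash
# Option 1: create a new EFS file system (NumberOfSubnets is an assumed parameter name)
        ParameterKey=CreateEFS,ParameterValue=Yes \
        ParameterKey=NumberOfSubnets,ParameterValue=2 \

# Option 2: use an existing EFS file system (the fs-... ID is a placeholder)
        ParameterKey=CreateEFS,ParameterValue=No \
        ParameterKey=ExistingEFS,ParameterValue=fs-0123456789abcdef0
```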
## Building the documentation
The documentation is built using mkdocs.
Install dependencies:
```bash
$ conda env create --file environment.yaml
```

This will create a `conda` environment called `mkdocs`.
Build the docs:
```bash
$ conda activate mkdocs
$ mkdocs build
```

## License Summary
This library is licensed under the MIT-0 License. See the LICENSE file.