Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/localstack-samples/sample-mnist-digit-recognition-sagemaker
Demo to run the MNIST handwritten digit model on a locally running SageMaker endpoint
- Host: GitHub
- URL: https://github.com/localstack-samples/sample-mnist-digit-recognition-sagemaker
- Owner: localstack-samples
- License: apache-2.0
- Created: 2023-04-25T15:18:55.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-08-06T04:57:13.000Z (5 months ago)
- Last Synced: 2024-08-06T07:15:47.296Z (5 months ago)
- Topics: aws, developer-hub, lambda, localstack, localstack-developer-hub, machine-learning, s3, s3-website, sagemaker, serverless
- Language: JavaScript
- Homepage:
- Size: 1.26 MB
- Stars: 4
- Watchers: 14
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
README
# MNIST handwritten digit recognition model running on a local SageMaker endpoint
| Key          | Value                                                                                         |
|--------------|-----------------------------------------------------------------------------------------------|
| Environment  |                                                                                               |
| Services     | S3, SageMaker, Lambda                                                                         |
| Integrations | AWS SDK                                                                                       |
| Categories   | Serverless, S3 website, Lambda function URLs, SageMaker, Machine Learning, JavaScript, Python |
| Level        | Intermediate                                                                                  |

## Introduction
This is a sample application that demonstrates how to use SageMaker on LocalStack.
A simple web frontend allows users to draw a digit and submit it to a locally running SageMaker endpoint.
The endpoint returns a prediction of the digit, which is then displayed in the web frontend.
Request handling is performed by a Lambda function, accessible via a function URL, that uses the SageMaker SDK to invoke the endpoint.

Here's a short summary of the AWS service features we use:
* S3 website
* Lambda function URLs
* SageMaker endpoint

Here's the web application in action:
https://user-images.githubusercontent.com/39307517/234888629-4bd9deb8-ecdd-46a6-91d6-908b9f2a443c.mov
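As a rough sketch of the flow described above (not the actual handler code from this repo), a Lambda handler could normalize the 28x28 canvas pixels and call the endpoint with `boto3`. The endpoint name `mnist-model`, the request body layout, and the `"inputs"` key are assumptions for illustration:

```python
import json


def build_payload(pixels):
    """Normalize flat 28x28 grayscale values (0-255) to floats in [0, 1]
    and wrap them in a JSON body. The shape and the "inputs" key are
    assumptions, not taken from this repo."""
    if len(pixels) != 28 * 28:
        raise ValueError("expected a flat 28x28 image")
    scaled = [p / 255.0 for p in pixels]
    return json.dumps({"inputs": [scaled]})


def handler(event, context):
    # boto3 is available in the Lambda runtime; import it lazily here.
    import boto3

    runtime = boto3.client("sagemaker-runtime")
    body = build_payload(json.loads(event["body"])["pixels"])
    response = runtime.invoke_endpoint(
        EndpointName="mnist-model",  # assumed endpoint name
        ContentType="application/json",
        Body=body,
    )
    prediction = json.loads(response["Body"].read())
    return {"statusCode": 200, "body": json.dumps(prediction)}
```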
## Architecture overview
![Architecture Diagram](/assets/architecture-diagram.png?raw=True "Architecture Diagram")
## Prerequisites
### Dev environment
Create a virtualenv and install all the development dependencies there:
```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

If you'd like to perform training locally, you'll need to install the ML dev dependencies as well:
```bash
pip install -r ml/requirements.txt
```

You'll also need Node.js/npm installed to build the web application. Please install them according to the official guidelines, e.g. via nvm: https://github.com/nvm-sh/nvm
### Download pytorch container image
As our inference container, we use the PyTorch inference container from AWS ECR:

```bash
aws ecr get-login-password --region eu-central-1 | docker login --username AWS --password-stdin 763104351884.dkr.ecr.eu-central-1.amazonaws.com
docker pull 763104351884.dkr.ecr.eu-central-1.amazonaws.com/pytorch-inference:1.10.2-cpu-py38-ubuntu20.04-sagemaker
```

### LocalStack
Start LocalStack Pro with your Auth Token:
```bash
PERSISTENCE=1 LOCALSTACK_AUTH_TOKEN=... localstack start
```

## Instructions
First, we install the dependencies for the Web application in the `web` directory:
```bash
(cd web; npm install)
```

You can then create the AWS infrastructure on LocalStack by running the `deploy/deploy_app.py` script (make sure to have the virtual environment activated):
```bash
source .venv/bin/activate
python deploy/deploy_app.py
```

This script will create the SageMaker endpoint with the model, which it first uploads to a bucket.
The script will also create a Lambda function that will be used to invoke the endpoint.
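For illustration, the endpoint setup could look roughly like the sketch below, with `boto3` pointed at LocalStack. The resource names, the model artifact location, the role ARN, and the instance/serverless sizing are assumptions, not taken from the actual deploy script:

```python
def make_endpoint_config(config_name, model_name, serverless=False):
    """Build the request dict for create_endpoint_config; variant name
    and sizes are illustrative assumptions."""
    variant = {"VariantName": "AllTraffic", "ModelName": model_name}
    if serverless:
        # A serverless endpoint trades a fixed instance for on-demand capacity.
        variant["ServerlessConfig"] = {"MemorySizeInMB": 2048, "MaxConcurrency": 1}
    else:
        variant["InstanceType"] = "ml.m5.large"
        variant["InitialInstanceCount"] = 1
    return {"EndpointConfigName": config_name, "ProductionVariants": [variant]}


def deploy_endpoint(serverless=False):
    import boto3

    # Point the SDK at LocalStack instead of real AWS.
    sm = boto3.client("sagemaker", endpoint_url="http://localhost:4566")
    sm.create_model(
        ModelName="mnist-model",  # assumed name
        PrimaryContainer={
            "Image": "763104351884.dkr.ecr.eu-central-1.amazonaws.com/pytorch-inference:1.10.2-cpu-py38-ubuntu20.04-sagemaker",
            "ModelDataUrl": "s3://mnist-models/model.tar.gz",  # assumed bucket/key
        },
        ExecutionRoleArn="arn:aws:iam::000000000000:role/sagemaker-role",  # assumed
    )
    sm.create_endpoint_config(**make_endpoint_config("mnist-config", "mnist-model", serverless))
    sm.create_endpoint(EndpointName="mnist-model", EndpointConfigName="mnist-config")
```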
Finally, the script will build the web application and then create an S3 website to host it.

### Using the application
Once deployed, visit http://mnist-website.s3-website.localhost.localstack.cloud:4566
Draw a digit on the canvas and click the `Predict` button.
After a few moments the resulting prediction should be displayed in the box to the right.
![Demo Picture](/assets/demo-pic.png?raw=True "Demo Picture")
## Serverless SageMaker Endpoint
To switch to a serverless SageMaker endpoint, run the deployment script with the additional `-s` or `--serverless` flag:
```bash
python deploy/deploy_app.py --serverless
```

## License
The code of this sample application is published under the Apache 2.0 license (see `LICENSE`).
## Contributing
We appreciate your interest in contributing to our project and are always looking for new ways to improve the developer experience. We welcome feedback, bug reports, and even feature ideas from the community.
Please refer to the [contributing file](CONTRIBUTING.md) for more details on how to get started.