{"id":15354651,"url":"https://github.com/kikuomax/learn-aws-lambda","last_synced_at":"2025-12-26T06:50:32.446Z","repository":{"id":85897025,"uuid":"162933826","full_name":"kikuomax/learn-aws-lambda","owner":"kikuomax","description":"Learn how to use AWS Lambda","archived":false,"fork":false,"pushed_at":"2019-02-10T01:17:09.000Z","size":75,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":3,"default_branch":"master","last_synced_at":"2025-01-19T10:21:50.158Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/kikuomax.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-12-24T00:35:16.000Z","updated_at":"2019-02-10T01:17:07.000Z","dependencies_parsed_at":"2023-03-04T11:45:30.837Z","dependency_job_id":null,"html_url":"https://github.com/kikuomax/learn-aws-lambda","commit_stats":{"total_commits":15,"total_committers":1,"mean_commits":15.0,"dds":0.0,"last_synced_commit":"aa43f32933c0361dcf0676417e6e2586f210701d"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/kikuomax%2Flearn-aws-lambda","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/kikuomax%2Flearn-aws-lambda/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/kikuomax%2Flearn-aws-lambda/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/kikuomax%2Flearn-aws-lambda/manifests","owner_url":
"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/kikuomax","download_url":"https://codeload.github.com/kikuomax/learn-aws-lambda/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":243259300,"owners_count":20262448,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-10-01T12:20:18.118Z","updated_at":"2025-12-26T06:50:32.395Z","avatar_url":"https://github.com/kikuomax.png","language":"Python","readme":"# Getting Started with AWS Lambda\n\nEnglish/[日本語](README_ja.md)\n\n**Table of Contents**\n\n\u003c!-- TOC depthFrom:1 depthTo:6 withLinks:1 updateOnSave:1 orderedList:0 indent:ICAgIA== --\u003e\n\n- [Getting Started with AWS Lambda](#getting-started-with-aws-lambda)\n    - [Introduction](#introduction)\n    - [Creating a Lambda function](#creating-a-lambda-function)\n        - [Creating a dedicated role](#creating-a-dedicated-role)\n        - [Deploying initial code](#deploying-initial-code)\n    - [Triggering the Lambda function from S3](#triggering-the-lambda-function-from-s3)\n        - [Creating a dedicated S3 bucket](#creating-a-dedicated-s3-bucket)\n        - [Adding a permission for S3 to the Lambda function](#adding-a-permission-for-s3-to-the-lambda-function)\n        - [Adding a trigger to the S3 bucket](#adding-a-trigger-to-the-s3-bucket)\n        - [Putting an object into the bucket](#putting-an-object-into-the-bucket)\n    - [Obtaining the contents of a given S3 object from the Lambda function](#obtaining-the-contents-of-a-given-s3-object-from-the-lambda-function)\n        - 
[Allowing the Lambda function to get S3 objects from the bucket](#allowing-the-lambda-function-to-get-s3-objects-from-the-bucket)\n        - [Updating the Lambda function](#updating-the-lambda-function)\n        - [Testing if the Lambda function works](#testing-if-the-lambda-function-works)\n    - [Processing a text with Amazon Comprehend](#processing-a-text-with-amazon-comprehend)\n        - [Allowing the Lambda function to run Amazon Comprehend](#allowing-the-lambda-function-to-run-amazon-comprehend)\n        - [Updating the Lambda function](#updating-the-lambda-function)\n        - [Changing logging level of the Lambda function](#changing-logging-level-of-the-lambda-function)\n    - [Saving analysis results as an S3 object](#saving-analysis-results-as-an-s3-object)\n        - [Allowing the Lambda function to PUT an S3 object](#allowing-the-lambda-function-to-put-an-s3-object)\n        - [Updating the Lambda function](#updating-the-lambda-function)\n        - [Testing if the Lambda function works](#testing-if-the-lambda-function-works)\n        - [Retrieving the latest logs through CLI](#retrieving-the-latest-logs-through-cli)\n    - [Generating documentation with Sphinx](#generating-documentation-with-sphinx)\n    - [Describing a serverless application with AWS SAM](#describing-a-serverless-application-with-aws-sam)\n        - [Describing an AWS SAM template](#describing-an-aws-sam-template)\n        - [Starting a Docker service](#starting-a-docker-service)\n        - [Building a serverless application with AWS SAM](#building-a-serverless-application-with-aws-sam)\n            - [Specifying the region](#specifying-the-region)\n        - [Packaging a serverless application with AWS SAM](#packaging-a-serverless-application-with-aws-sam)\n            - [Specifying a profile](#specifying-a-profile)\n        - [Deploying a serverless application with AWS SAM](#deploying-a-serverless-application-with-aws-sam)\n        - [Avoiding circular dependency in an AWS SAM 
template](#avoiding-circular-dependency-in-an-aws-sam-template)\n            - [Directly referencing an S3 bucket by ARN](#directly-referencing-an-s3-bucket-by-arn)\n            - [Using `Events` property of a Lambda function](#using-events-property-of-a-lambda-function)\n        - [Validating an AWS SAM template](#validating-an-aws-sam-template)\n    - [Appendix](#appendix)\n        - [How was the table of contents in this document generated?](#how-was-the-table-of-contents-in-this-document-generated)\n\n\u003c!-- /TOC --\u003e\n\n## Introduction\n\nThis repository is just a note for myself, but I hope someone might find it useful.\n\nTo learn [AWS Lambda](https://docs.aws.amazon.com/lambda/index.html#lang/en_us), I designed a function that\n1. Is triggered when an object is put in a specific S3 location (`my-bucket/inbox`)\n2. Processes the contents of the object with [Amazon Comprehend](https://docs.aws.amazon.com/comprehend/index.html#lang/en_us)\n3. Saves the results of Amazon Comprehend processing as an object in another S3 location (`my-bucket/comprehend`)\n\nMy Lambda function is written in Python 3.7.\nI primarily work in the Tokyo region (`ap-northeast-1`), so region-specific resources are located in the Tokyo region unless otherwise noted.\n\n**You need a credential with sufficient permissions to complete the following AWS manipulations.**\nI set up a profile for myself but omit it from the following examples.\n\nThe following examples use the [AWS CLI](https://aws.amazon.com/cli/) to configure components for the sake of reproducibility; however, if you have just started learning AWS Lambda, I recommend first trying the AWS Lambda Console, as I did.\n\n## Creating a Lambda function\n\nFirst, let's deploy a silly function.\n\n### Creating a dedicated role\n\nBefore creating a Lambda function, create a role `comprehend-s3` dedicated to it.\n\n```bash\naws iam create-role --role-name comprehend-s3 --description 'Executes Lambda function comprehend-s3' 
--assume-role-policy-document file://iam/policy/lambda-assume-role-policy.json\n```\n\nThe ARN of the role will be similar to `arn:aws:iam::123456789012:role/comprehend-s3`.\n\nThe role should have at least the predefined policy `AWSLambdaBasicExecutionRole` or equivalent attached, which allows a Lambda function to write logs to [CloudWatch](https://aws.amazon.com/cloudwatch/).\n\n```bash\naws iam attach-role-policy --role-name comprehend-s3 --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole\n```\n\n### Deploying initial code\n\nZip the [initial code](scripts/lambda_function_1.py).\n\n```bash\ncd scripts\nzip lambda_function_1.zip lambda_function_1.py\ncd ..\n```\n\nCreate a Lambda function `comprehend-s3` with the following command.\n\n```bash\naws lambda create-function --function-name comprehend-s3 --runtime python3.7 --role arn:aws:iam::123456789012:role/comprehend-s3 --handler lambda_function_1.lambda_handler --description \"Comprehends S3\" --zip-file fileb://scripts/lambda_function_1.zip\n```\n\nThe ARN of the function will be `arn:aws:lambda:ap-northeast-1:123456789012:function:comprehend-s3`.\n\n## Triggering the Lambda function from S3\n\nLet's invoke the Lambda function when an S3 object is PUT into a specific location.\n\n### Creating a dedicated S3 bucket\n\nCreate a dedicated S3 bucket `my-bucket`.\nNote that `my-bucket` is too generic a name; you will have to choose a more specific one.\n\n```bash\naws s3api create-bucket --bucket my-bucket --region ap-northeast-1 --create-bucket-configuration LocationConstraint=ap-northeast-1\n```\n\nBlock public access to the bucket, because public access is unnecessary in this use case.\n\n```bash\naws s3api put-public-access-block --bucket my-bucket --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true\n```\n\n### Adding a permission for S3 to the Lambda function\n\nBefore enabling the notification from the S3 bucket to the Lambda 
function, add an appropriate permission to the function.\n\n```bash\naws lambda add-permission --function-name comprehend-s3 --principal s3.amazonaws.com --statement-id something_unique --action \"lambda:InvokeFunction\" --source-arn arn:aws:s3:::my-bucket --source-account 123456789012\n```\n\nDo not forget to replace `--statement-id something_unique` and `--source-account 123456789012` appropriately.\n\n### Adding a trigger to the S3 bucket\n\nEnable the notification that is triggered when an object is `PUT` into a path like `inbox/*.txt`.\nThe following is the [configuration](s3/notification-config.json),\n\n```json\n{\n  \"LambdaFunctionConfigurations\": [\n    {\n      \"Id\": \"TextPutIntoMyBucket\",\n      \"LambdaFunctionArn\": \"arn:aws:lambda:ap-northeast-1:123456789012:function:comprehend-s3\",\n      \"Events\": [\n        \"s3:ObjectCreated:Put\"\n      ],\n      \"Filter\": {\n        \"Key\": {\n          \"FilterRules\": [\n            {\n              \"Name\": \"Prefix\",\n              \"Value\": \"inbox/\"\n            },\n            {\n              \"Name\": \"Suffix\",\n              \"Value\": \".txt\"\n            }\n          ]\n        }\n      }\n    }\n  ]\n}\n```\n\n**Note that you have to replace `LambdaFunctionArn` with the ARN of your Lambda function.**\n\n```bash\naws s3api put-bucket-notification-configuration --bucket my-bucket --notification-configuration file://s3/notification-config.json\n```\n\n### Putting an object into the bucket\n\nTest if the Lambda function is invoked when an object is put into the bucket.\n\n```bash\naws s3 cp test/test.txt s3://my-bucket/inbox/test.txt\n```\n\nYou can examine logs in CloudWatch Logs.\nIts log group name should be `/aws/lambda/comprehend-s3`.\nLogs are retained forever by default, so I changed the retention period to a week to save storage.\n\n```bash\naws logs put-retention-policy --log-group-name /aws/lambda/comprehend-s3 --retention-in-days 7\n```\n\n## Obtaining the contents of a 
given S3 object from the Lambda function\n\nLet's add a contents retrieval feature to the Lambda function.\n\n### Allowing the Lambda function to get S3 objects from the bucket\n\nDefine a policy that allows retrieval of S3 objects from `my-bucket`.\n\n```bash\naws iam create-policy --path /learn-aws-lambda/ --policy-name S3GetObject_my-bucket_inbox --policy-document file://iam/policy/S3GetObject_my-bucket_inbox.json --description \"Allows getting an object from s3://my-bucket/inbox\"\n```\n\nThe following is the [policy document](iam/policy/S3GetObject_my-bucket_inbox.json),\n\n```json\n{\n  \"Version\": \"2012-10-17\",\n  \"Statement\": [\n    {\n      \"Effect\": \"Allow\",\n      \"Action\": [\n        \"s3:GetObject\"\n      ],\n      \"Resource\": [\n        \"arn:aws:s3:::my-bucket/inbox/*\"\n      ]\n    }\n  ]\n}\n```\n\nThe ARN of the new policy will be similar to `arn:aws:iam::123456789012:policy/learn-aws-lambda/S3GetObject_my-bucket_inbox`.\n\nAttach the policy to the role `comprehend-s3`.\n\n```bash\naws iam attach-role-policy --role-name comprehend-s3 --policy-arn arn:aws:iam::123456789012:policy/learn-aws-lambda/S3GetObject_my-bucket_inbox\n```\n\n### Updating the Lambda function\n\nWe are going to update the Lambda function with new code, [`lambda_function_2.py`](scripts/lambda_function_2.py).\nBefore updating the function, zip the code.\n\n```bash\ncd scripts\nzip lambda_function_2.zip lambda_function_2.py\ncd ..\n```\n\nThen update the function with the new zip file.\n\n```bash\naws lambda update-function-code --function-name comprehend-s3 --zip-file fileb://scripts/lambda_function_2.zip\n```\n\nThe handler function also has to be changed.\n\n```bash\naws lambda update-function-configuration --function-name comprehend-s3 --handler lambda_function_2.lambda_handler\n```\n\nBy the way, I found that no stack trace is left in CloudWatch Logs when the Lambda function fails.\nThis was very inconvenient, so I wrapped the main function with a 
`try-except` clause to log the stack trace of any exception raised from it.\n\n```python\ndef lambda_handler(event, context):\n    global LOGGER\n    try:\n        return main(event, context)\n    except Exception as e:\n        # prints the stack trace of the exception\n        traceback.print_exc()\n        LOGGER.error(e)\n        raise e\n```\n\n### Testing if the Lambda function works\n\nCopy the test text into the S3 bucket again.\n\n```bash\naws s3 cp test/test.txt s3://my-bucket/inbox/test.txt\n```\n\nCheck CloudWatch Logs to see if the function is invoked.\n\n## Processing a text with Amazon Comprehend\n\nLet's add an Amazon Comprehend analysis feature to the Lambda function.\n\n### Allowing the Lambda function to run Amazon Comprehend\n\nCreate a policy that allows detection with Amazon Comprehend ([policy document](iam/policy/ComprehendDetectAny.json)).\n\n```bash\naws iam create-policy --policy-name ComprehendDetectAny --path /learn-aws-lambda/ --policy-document file://iam/policy/ComprehendDetectAny.json --description \"Allows detection with Amazon Comprehend\"\n```\n\nThe ARN of the policy will be similar to `arn:aws:iam::123456789012:policy/learn-aws-lambda/ComprehendDetectAny`.\n\nThen attach the policy to the role `comprehend-s3`.\n\n```bash\naws iam attach-role-policy --role-name comprehend-s3 --policy-arn arn:aws:iam::123456789012:policy/learn-aws-lambda/ComprehendDetectAny\n```\n\n### Updating the Lambda function\n\nZip the new code [lambda_function_3.py](scripts/lambda_function_3.py).\n\n```bash\ncd scripts\nzip lambda_function_3.zip lambda_function_3.py\ncd ..\n```\n\nUpdate the code of the Lambda function.\n\n```bash\naws lambda update-function-code --function-name comprehend-s3 --zip-file fileb://scripts/lambda_function_3.zip\n```\n\nDo not forget to replace the handler function with `lambda_function_3.lambda_handler`.\n\n```bash\naws lambda update-function-configuration --function-name comprehend-s3 --handler 
lambda_function_3.lambda_handler\n```\n\nMy Lambda function is running in the Tokyo region and any [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) client is associated with that region by default, but unfortunately the Tokyo region does not host Amazon Comprehend as of January 4, 2019.\nSo I configured my Amazon Comprehend client for the Ohio region (`us-east-2`).\nAnyway, the region does not matter as long as it supports Amazon Comprehend.\n\n```python\nCOMPREHEND_REGION = 'us-east-2'\n    # specifies the region where Amazon Comprehend is hosted\n    # because not all regions provide Amazon Comprehend\n```\n```python\ncomprehend = boto3.client('comprehend', region_name=COMPREHEND_REGION)\n```\n\nWell, update the S3 object and check CloudWatch Logs to see if the Lambda function works.\nI refrain from repeating the trivial command.\n\n### Changing logging level of the Lambda function\n\nThe [third script](scripts/lambda_function_3.py) interprets the environment variable `COMPREHEND_S3_LOGGING_LEVEL` as the logging level of the function, which is `DEBUG` by default.\nFor instance, you can change the level to `INFO` with the following command,\n\n```bash\naws lambda update-function-configuration --function-name comprehend-s3 --environment 'Variables={COMPREHEND_S3_LOGGING_LEVEL=INFO}'\n```\n\nIf you want to delete environment variables from the Lambda function, run the following command,\n\n```bash\naws lambda update-function-configuration --function-name comprehend-s3 --environment 'Variables={}'\n```\n\n## Saving analysis results as an S3 object\n\nLet's add a result-saving feature to the Lambda function.\n\n### Allowing the Lambda function to PUT an S3 object\n\nCreate a policy that allows putting an S3 object.\n\n```bash\naws iam create-policy --path /learn-aws-lambda/ --policy-name 
S3PutObject_my-bucket_comprehend --policy-document file://iam/policy/S3PutObject_my-bucket_comprehend.json --description \"Allows putting an S3 object into s3://my-bucket/comprehend\"\n```\n\nThe ARN of the policy will be similar to `arn:aws:iam::123456789012:policy/learn-aws-lambda/S3PutObject_my-bucket_comprehend`.\n\nAttach the policy to the role `comprehend-s3`.\n\n```bash\naws iam attach-role-policy --role-name comprehend-s3 --policy-arn arn:aws:iam::123456789012:policy/learn-aws-lambda/S3PutObject_my-bucket_comprehend\n```\n\n### Updating the Lambda function\n\nZip the new code [lambda_function_4.py](scripts/lambda_function_4.py).\n\n```bash\ncd scripts\nzip lambda_function_4.zip lambda_function_4.py\ncd ..\n```\n\nUpdate the code of the Lambda function.\n\n```bash\naws lambda update-function-code --function-name comprehend-s3 --zip-file fileb://scripts/lambda_function_4.zip\n```\n\nDo not forget to replace the handler with `lambda_function_4.lambda_handler`.\n\n```bash\naws lambda update-function-configuration --function-name comprehend-s3 --handler lambda_function_4.lambda_handler\n```\n\n### Testing if the Lambda function works\n\nWell, update the S3 object and check CloudWatch Logs to see if the Lambda function is invoked.\n\nYou will find a JSON file at `s3://my-bucket/comprehend/test.json`.\nTest if it matches the [reference](test/test-ref.json).\n\n```bash\naws s3 cp s3://my-bucket/comprehend/test.json test/test.json\ndiff test/test.json test/test-ref.json\n```\n\n### Retrieving the latest logs through CLI\n\nLet's check the latest logs through CLI.\nThe following are brief steps to get logs,\n\n1. Obtain the latest log stream with `aws logs describe-log-streams`.\n2. 
Obtain the last n lines of the log stream identified in step 1 with `aws logs get-log-events`.\n\nHere is a one-liner for bash,\n\n```bash\naws logs get-log-events --log-group-name /aws/lambda/comprehend-s3 --log-stream-name `aws --query 'logStreams[0].logStreamName' logs describe-log-streams --log-group-name /aws/lambda/comprehend-s3 --descending --order-by LastEventTime --max-items 1 | tr -d '\"'` --limit 12 --no-start-from-head --query 'events[].message'\n```\n\nYou will get results similar to the following by running the command shown above,\n\n```\n[\n    \"START RequestId: a7f14ae4-10a4-11e9-ab9d-478d60f1dab7 Version: $LATEST\\n\",\n    \"[INFO]\\t2019-01-05T04:45:58.535Z\\ta7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\trequest ID: a7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\n\",\n    \"\\n[INFO]\\t2019-01-05T04:45:58.535Z\\ta7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\tobtaining: s3://my-bucket/inbox/test.txt\\n\",\n    \"\\n[INFO]\\t2019-01-05T04:45:58.557Z\\ta7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\tdetecting dominant language\\n\",\n    \"\\n[INFO]\\t2019-01-05T04:45:58.743Z\\ta7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\tdetecting entities\\n\",\n    \"\\n[INFO]\\t2019-01-05T04:45:58.984Z\\ta7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\tdetecting key phrases\\n\",\n    \"\\n[INFO]\\t2019-01-05T04:45:59.200Z\\ta7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\tdetecting sentiment\\n\",\n    \"\\n[INFO]\\t2019-01-05T04:45:59.405Z\\ta7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\tdetecting syntax\\n\",\n    \"\\n[INFO]\\t2019-01-05T04:45:59.598Z\\ta7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\tsaving: s3://my-bucket/comprehend/test.json\\n\",\n    \"END RequestId: a7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\n\",\n    \"REPORT RequestId: a7f14ae4-10a4-11e9-ab9d-478d60f1dab7\\tDuration: 1162.23 ms\\tBilled Duration: 1200 ms \\tMemory Size: 128 MB\\tMax Memory Used: 34 MB\\t\\n\",\n    \"\\n\"\n]\n```\n\n## Generating documentation with Sphinx\n\n**This section has really nothing to do with AWS.**\n\nYou may have noticed that 
functions in the [fourth script](scripts/lambda_function_4.py) have [docstrings](https://www.python.org/dev/peps/pep-0257/).\nYou can generate documentation with [Sphinx](http://www.sphinx-doc.org/en/master/) by running `make` in the [`docs` directory](docs).\n\nTake the following steps,\n\n1. Install Sphinx.\n\n    ```bash\n    pip install -U Sphinx\n    ```\n\n2. Move down to the `docs` directory.\n\n    ```bash\n    cd docs\n    ```\n\n3. Run `make html`.\n\n    ```bash\n    make html\n    ```\n\n4. You will find the `html` directory in the `build` directory.\n\n## Describing a serverless application with AWS SAM\n\nBy using [AWS Serverless Application Model (AWS SAM)](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/what-is-sam.html), which is an extension of [AWS CloudFormation](https://aws.amazon.com/cloudformation/), we can integrate the resource allocation and configuration steps described above into a single AWS SAM template file.\nIf you are new to AWS SAM, I recommend taking [this tutorial](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-quick-start.html).\n\nThe basic steps are,\n1. [Describe](#describing-an-aws-sam-template)\n2. [Build](#building-a-serverless-application-with-aws-sam)\n3. [Package](#packaging-a-serverless-application-with-aws-sam)\n4. 
[Deploy](#deploying-a-serverless-application-with-aws-sam)\n\n### Describing an AWS SAM template\n\nThe AWS SAM template and source code of our serverless application are in the directory `sam`.\n\n- `sam`\n    - [`template.yaml`](sam/template.yaml): AWS SAM template\n    - `src`\n        - [`lambda_function_4.py`](sam/src/lambda_function_4.py): Lambda handler (identical to the last example)\n        - [`requirements.txt`](sam/src/requirements.txt): dependencies\n\n[`sam/template.yaml`](sam/template.yaml) is the AWS SAM template describing our serverless application.\n[`sam/src/requirements.txt`](sam/src/requirements.txt) is an empty file because our serverless application has no dependencies.\n\nThe following sections assume you are in the `sam` directory.\nSo move down to it.\n\n```bash\ncd sam\n```\n\n### Starting a Docker service\n\nBefore working with AWS SAM, do not forget to start a Docker service.\nOtherwise you will get an error similar to the following when you run the `sam build --use-container` command.\n\n```\n2019-01-06 22:14:12 Starting Build inside a container\n2019-01-06 22:14:12 Found credentials in shared credentials file: ~/.aws/credentials\n2019-01-06 22:14:12 Building resource 'HelloWorldFunction'\nTraceback (most recent call last):\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py\", line 600, in urlopen\n    chunked=chunked)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py\", line 354, in _make_request\n    conn.request(method, url, **httplib_request_kw)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 1229, in request\n    self._send_request(method, url, body, headers, encode_chunked)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 1275, in _send_request\n    self.endheaders(body, encode_chunked=encode_chunked)\n  File 
\"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 1224, in endheaders\n    self._send_output(message_body, encode_chunked=encode_chunked)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 1016, in _send_output\n    self.send(msg)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 956, in send\n    self.connect()\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/docker/transport/unixconn.py\", line 42, in connect\n    sock.connect(self.unix_socket)\nFileNotFoundError: [Errno 2] No such file or directory\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/requests/adapters.py\", line 449, in send\n    timeout=timeout\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py\", line 638, in urlopen\n    _stacktrace=sys.exc_info()[2])\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/urllib3/util/retry.py\", line 367, in increment\n    raise six.reraise(type(error), error, _stacktrace)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/urllib3/packages/six.py\", line 685, in reraise\n    raise value.with_traceback(tb)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py\", line 600, in urlopen\n    chunked=chunked)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py\", line 354, in _make_request\n    conn.request(method, url, **httplib_request_kw)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 1229, in request\n    self._send_request(method, url, body, headers, encode_chunked)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 1275, in _send_request\n    
self.endheaders(body, encode_chunked=encode_chunked)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 1224, in endheaders\n    self._send_output(message_body, encode_chunked=encode_chunked)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 1016, in _send_output\n    self.send(msg)\n  File \"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/http/client.py\", line 956, in send\n    self.connect()\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/docker/transport/unixconn.py\", line 42, in connect\n    sock.connect(self.unix_socket)\nurllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File \"/Users/kikuo/Library/Python/3.7/bin/sam\", line 11, in \u003cmodule\u003e\n    sys.exit(cli())\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py\", line 722, in __call__\n    return self.main(*args, **kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py\", line 697, in main\n    rv = self.invoke(ctx)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py\", line 1066, in invoke\n    return _process_result(sub_ctx.command.invoke(sub_ctx))\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py\", line 895, in invoke\n    return ctx.invoke(self.callback, **ctx.params)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py\", line 535, in invoke\n    return callback(*args, **kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/decorators.py\", line 64, in new_func\n    return ctx.invoke(f, obj, *args[1:], **kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py\", line 535, in invoke\n    
return callback(*args, **kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/commands/build/command.py\", line 94, in cli\n    skip_pull_image, parameter_overrides)  # pragma: no cover\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/commands/build/command.py\", line 132, in do_cli\n    artifacts = builder.build()\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/lib/build/app_builder.py\", line 129, in build\n    lambda_function.runtime)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/lib/build/app_builder.py\", line 201, in _build_function\n    runtime)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/lib/build/app_builder.py\", line 249, in _build_function_on_container\n    self._container_manager.run(container)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/local/docker/manager.py\", line 75, in run\n    is_image_local = self.has_image(image_name)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/local/docker/manager.py\", line 153, in has_image\n    self.docker_client.images.get(image_name)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/docker/models/images.py\", line 312, in get\n    return self.prepare_model(self.client.api.inspect_image(name))\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/docker/utils/decorators.py\", line 19, in wrapped\n    return f(self, resource_id, *args, **kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/docker/api/image.py\", line 245, in inspect_image\n    self._get(self._url(\"/images/{0}/json\", image)), True\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/docker/utils/decorators.py\", line 46, in inner\n    return f(self, *args, **kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/docker/api/client.py\", line 215, in _get\n    return 
self.get(url, **self._set_request_timeout(kwargs))\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/requests/sessions.py\", line 546, in get\n    return self.request('GET', url, **kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/requests/sessions.py\", line 533, in request\n    resp = self.send(prep, **send_kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/requests/sessions.py\", line 646, in send\n    r = adapter.send(request, **kwargs)\n  File \"/Users/kikuo/Library/Python/3.7/lib/python/site-packages/requests/adapters.py\", line 498, in send\n    raise ConnectionError(err, request=request)\nrequests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))\n```\n\nThe error message was not very intuitive, but it actually meant that no Docker service was running.\nIn my case, I just installed [Docker Desktop](https://www.docker.com/products/docker-desktop) to resolve it.\n\n### Building a serverless application with AWS SAM\n\nRun the [`sam build`](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-build.html) command.\n\n```bash\nsam build --use-container\n```\n\nIn this example, the command may not matter much, because the application has no dependencies.\n\n#### Specifying the region\n\nWhen I ran the `sam build` command, it complained that no region was specified.\nSupplying the `--region` option resolved this.\n\n```bash\nsam build --region ap-northeast-1 --use-container\n```\n\n### Packaging a serverless application with AWS SAM\n\nRun the [`sam package`](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-package.html) command.\n\n```bash\nsam package --template-file template.yaml --output-template-file packaged.yaml --s3-bucket artifacts-bucket\n```\n\n**NOTE:** You have to replace `artifacts-bucket` with the bucket where you want to store 
artifacts.

#### Specifying a profile

Because the `sam package` command needs to access an S3 bucket, you have to provide credentials with sufficient privileges.
If you want to use a profile other than the default, you can specify the `--profile` option, even though `sam package --help` does not show it.

```bash
sam package --profile your-profile --template-file template.yaml --output-template-file packaged.yaml --s3-bucket artifacts-bucket
```

### Deploying a serverless application with AWS SAM

Run the [`sam deploy`](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-deploy.html) command.

```bash
sam deploy --template-file packaged.yaml --stack-name comprehend-s3 --capabilities CAPABILITY_IAM
```

Now you will find a new stack named `comprehend-s3` on the CloudFormation console.

### Avoiding circular dependency in an AWS SAM template

Because the Lambda function needs privileges to access the S3 bucket, and at the same time the S3 bucket needs a privilege to notify the Lambda function, they form the following circular dependency.

Lambda Function &rightarrow; S3 Bucket &rightarrow; Lambda Function &rightarrow; ...

If you have a circular dependency in your AWS SAM template, the `sam deploy` command fails with an error similar to the following,

```
Failed to create the changeset: Waiter ChangeSetCreateComplete failed: Waiter encountered a terminal failure state Status: FAILED.
Reason: Circular dependency between resources: [ComprehendS3FunctionTextUploadPermission, ComprehendS3FunctionRole, ComprehendS3Function, ComprehendS3Bucket]
```

[This article](https://aws.amazon.com/premiumsupport/knowledge-center/unable-validate-circular-dependency-cloudformation/) explains how to avoid circular dependencies.
I needed some trial and error to adapt it to this serverless application.

#### Directly referencing an S3 bucket by ARN

To break the circular dependency in an AWS SAM template, [the article introduced above](https://aws.amazon.com/premiumsupport/knowledge-center/unable-validate-circular-dependency-cloudformation/) suggests referencing an S3 bucket by its absolute ARN instead of its logical ID.
This means you need to know the name of the S3 bucket in advance.
Because CloudFormation generates a unique name for an S3 bucket by default, you have to override this behavior by giving a predictable name to the S3 bucket.

```yaml
Parameters:
  ComprehendS3BucketName:
    Description: 'Name of the S3 bucket where input texts and output results are saved'
    Type: String
    Default: 'learn-aws-lambda-comprehend-s3-bucket'

Resources:
  ComprehendS3Function:
    Type: 'AWS::Serverless::Function'
    Properties:
      ...
      Policies:
        ...
        # policy to get S3 objects in the inbox folder
        - Version: '2012-10-17'
          Statement:
            - Effect: Allow
              Action:
                - 's3:GetObject'
              Resource: !Sub 'arn:aws:s3:::${ComprehendS3BucketName}/inbox/*'
                # instead of '${ComprehendS3Bucket.Arn}/inbox/*'
                # to avoid circular dependency
        # policy to put S3 objects in the comprehend folder
        - Version: '2012-10-17'
          Statement:
            - Effect: Allow
              Action:
                - 's3:PutObject'
              Resource: !Sub 'arn:aws:s3:::${ComprehendS3BucketName}/comprehend/*'
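                # NOTE: composing the ARN from the bucket name works because
                # S3 bucket ARNs always have the fixed form
                # 'arn:aws:s3:::<bucket-name>', with no region or account ID
                # in them, so the name alone is enough to build the ARN.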
                # instead of '${ComprehendS3Bucket.Arn}/comprehend/*'
                # to avoid circular dependency
    ...

  ComprehendS3Bucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Ref ComprehendS3BucketName
        # overrides automatic name assignment
```

#### Using the `Events` property of a Lambda function

At first, I tried to add the `NotificationConfiguration` property directly to the S3 bucket.
Then I noticed that just adding `NotificationConfiguration` to the S3 bucket was not sufficient; an `AWS::Lambda::Permission` resource also had to be defined.
I found this somewhat cumbersome.

But there is a better way to address it.
An `AWS::Serverless::Function` resource can have an `Events` property, where you can describe the events that trigger the function.

```yaml
ComprehendS3Function:
  Type: 'AWS::Serverless::Function'
  Properties:
    ...
    Events:
      TextUpload:
        Type: S3
        Properties:
          Bucket: !Ref ComprehendS3Bucket
          Events: 's3:ObjectCreated:Put'
          Filter:
            S3Key:
              Rules:
                - Name: prefix
                  Value: 'inbox/'
                - Name: suffix
                  Value: '.txt'
```

If you configure the `Events` property of the Lambda function, you do not need to specify the `NotificationConfiguration` property on the S3 bucket.

**NOTE:** The `Bucket` property of an event accepts only the logical ID of an S3 bucket.
At first I specified the ARN of the S3 bucket in the `Bucket` property and got an error.

### Validating an AWS SAM template

**NOTE:** This is not an essential part of this document.

There is the [`sam validate`](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-validate.html) command.

```bash
sam validate --template template.yaml
```

When I ran `sam validate`, I got the following weird error,

```
Traceback (most recent call last):
  File "/Users/kikuo/Library/Python/3.7/bin/sam", line 11, in <module>
    sys.exit(cli())
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/decorators.py", line 64, in new_func
    return ctx.invoke(f, obj, *args[1:], **kwargs)
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/commands/validate/validate.py", line 30, in cli
    do_cli(ctx, template)  # pragma: no cover
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/commands/validate/validate.py", line 44, in do_cli
    validator.is_valid()
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samcli/commands/validate/lib/sam_template_validator.py", line 83, in is_valid
    parameter_values={})
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samtranslator/translator/translator.py", line 60, in translate
    deployment_preference_collection = DeploymentPreferenceCollection()
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samtranslator/model/preferences/deployment_preference_collection.py", line 30, in __init__
    self.codedeploy_iam_role = self._codedeploy_iam_role()
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samtranslator/model/preferences/deployment_preference_collection.py", line 89, in _codedeploy_iam_role
    ArnGenerator.generate_aws_managed_policy_arn('service-role/AWSCodeDeployRoleForLambda')
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samtranslator/translator/arn_generator.py", line 28, in generate_aws_managed_policy_arn
    return 'arn:{}:iam::aws:policy/{}'.format(ArnGenerator.get_partition_name(),
  File "/Users/kikuo/Library/Python/3.7/lib/python/site-packages/samtranslator/translator/arn_generator.py", line 49, in get_partition_name
    region_string = region.lower()
AttributeError: 'NoneType' object has no attribute 'lower'
```

As suggested [here](https://github.com/awslabs/aws-sam-cli/issues/442#issuecomment-417489857), setting the region in the `AWS_DEFAULT_REGION` environment variable resolved it.

```bash
export AWS_DEFAULT_REGION=ap-northeast-1
```

## Appendix

### How was the table of contents in this document generated?

I used a [slightly modified version](https://github.com/kikuomax/markdown-toc) of the [Atom](https://atom.io) plug-in [markdown-toc](https://github.com/nok/markdown-toc), which has the following additional features,
- [Support of underscore character](https://github.com/Sorix/markdown-toc/commit/31c9e1bb6b37d692cde2395ad8f1e9c8a555d365) introduced by [Sorix](https://github.com/Sorix)
- [Support of non-latin characters](https://github.com/Sorix/markdown-toc/commit/1d5482e3bc3dd1339190eeb9a7e6c000871df888) introduced by [Sorix](https://github.com/Sorix)
- [`indent` option](https://github.com/kikuomax/markdown-toc/commit/40172017e833552fa109817ba42bb35e24291abc)

To install the plug-in, run the following,

```bash
apm install kikuomax/markdown-toc
```