# LocalStack

![Banner](https://danieldcs.com/wp-content/uploads/2020/12/diagram-localstack-1024x485.png)

[TOC]

## Serverless

> framework
> open source

[Official site](https://www.serverless.com/)

Easy YAML + CLI development and deployment to AWS, Azure, Google Cloud, Knative, and more.

### Installing Serverless

<https://www.serverless.com/framework/docs/getting-started/>

```bash
  npm install -g serverless
```

### Starting a project with Serverless

```bash
  sls
```

### Answers

Serverless: No project detected. Do you want to create a new one? **Yes**
Serverless: What do you want to make? **AWS Node.js**
Serverless: What do you want to call this project? **nome-projeto**
Project successfully created in 'nome-projeto' folder.
Serverless: Would you like to enable this? **No**
Enter the folder and start a Node project

```bash
  cd nome-projeto
  yarn init -y
```

> serverless.yml

```yml
functions:
  hello:
    handler: handler.hello
    events:
      - http:
          method: get
          path: hello
```

Installing the dependencies

```bash
  yarn add serverless nodemon serverless-offline -D
```

> serverless.yml

```yml
plugins:
  - serverless-offline
  # The documentation recommends keeping serverless-offline last
```

Starting the endpoint

```bash
  sls offline
```

Calling the endpoint

```bash
  curl http://localhost:3000/dev/hello
```

Enabling nodemon

> package.json

```json
"scripts": {
  "start": "npx nodemon --exec npm run offline",
  "offline": "npx sls offline start --host 0.0.0.0",
  "invoke-local:sqs": "npx sls invoke local -f sqsListener --path mocks/sqs-event.json",
  "invoke-local:sqs-clear": "npx sls invoke local -f sqsClean",
  "invoke-local:s3": "npx sls invoke local -f s3Listener --path mocks/s3-insert.json"
},
```

nodemon now restarts the offline server on every change

Installing mocha

```bash
  yarn add -D serverless-mocha-plugin
```

> serverless.yml

```yml
  - serverless-mocha-plugin
```

Creating the test

```bash
  sls create test -f hello
```

> package.json

```json
"test": "npx sls invoke test --path test",
```

To run it

```bash
  yarn test
```

## Localstack

### What is Localstack?

Localstack is an open-source project launched by **Atlassian** that simulates AWS resources on your local machine.
**A large part of it is free**, such as CloudFormation, DynamoDB, EC2, Kinesis, and S3, but the nicer UI and some services, such as EMR, Docker Lambda, and Athena, require the PRO version of Localstack.

> For this example we will use it through Docker

### Docker-Compose

> example

```yml
version: '2.1'

services:
  localstack:
    container_name: 'localstack'
    image: localstack/localstack
    network_mode: bridge
    ports:
      - '4566:4566'
      - '4567:4567'
      - '4574-4576:4574-4576'
      - '8080:8080'
    environment:
      - SERVICES=apigateway:4567,lambda:4574,sns:4575,sqs:4576
      - DEFAULT_REGION=us-east-1
      - AWS_XRAY_SDK_ENABLED=true
      - LAMBDA_EXECUTOR=docker
      - LAMBDA_REMOTE_DOCKER=true
      - DOCKER_HOST=unix:///var/run/docker.sock
    volumes:
      - '${TMPDIR:-/tmp/localstack}:/tmp/localstack'
      - '/var/run/docker.sock:/var/run/docker.sock'
```

Since Docker creates its own environment, we need to specify which local ports are bound to which container ports.
For example, here we bind ports 4566 through 4620 of your machine to the same ports in the container.

Some environment variables are needed to make this work:

- `DEBUG=1` produces more logs inside the container
- `SERVICES=s3,sqs,lambda,sns` lists the services you want to start
- `DEFAULT_REGION=us-east-1` specifies the target region for your resources
- `LAMBDA_EXECUTOR=docker` tells Localstack to use a dedicated Docker container to run your Lambda functions (this seems to be the best way to reproduce real AWS infrastructure)
- `LAMBDA_REMOTE_DOCKER=true` and `LAMBDA_REMOVE_CONTAINERS=true` are additional settings for the Docker Lambda executor
- `DATA_DIR=/tmp/localstack/data` is the folder Localstack uses to persist its own data
- `DOCKER_HOST=unix:///var/run/docker.sock`

The volumes are required because Docker does not store any state by itself. Using them persists data and avoids rebuilding everything each time you launch your stack.

#### Ports and their services

API Gateway: <http://localhost:4567>

Kinesis: <http://localhost:4568>

DynamoDB: <http://localhost:4569>

DynamoDB Streams: <http://localhost:4570>

Elasticsearch: <http://localhost:4571>

S3: <http://localhost:4572>

Firehose: <http://localhost:4573>

Lambda: <http://localhost:4574>

SNS: <http://localhost:4575>

SQS: <http://localhost:4576>

Redshift: <http://localhost:4577>

ES (Elasticsearch Service): <http://localhost:4578>

SES: <http://localhost:4579>

Route53: <http://localhost:4580>

CloudFormation: <http://localhost:4581>

CloudWatch: <http://localhost:4582>

SSM: <http://localhost:4583>

### Check overall status

> <http://localhost:4566/health>

#### Links

[Official site](https://localstack.cloud/)
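The `SERVICES` value in the compose file maps service names to their legacy ports; services listed without a port go through the edge port 4566. A small hypothetical helper (`parseServices` is not part of Localstack) illustrates the format:

```javascript
// Hypothetical helper: parse a Localstack SERVICES string such as
// "apigateway:4567,lambda:4574,sns:4575,sqs:4576" into { name: port }.
// Entries without an explicit port (e.g. "s3") fall back to the edge port.
function parseServices(services, edgePort = '4566') {
  const map = {};
  for (const entry of services.split(',')) {
    const [name, port] = entry.trim().split(':');
    map[name] = port || edgePort;
  }
  return map;
}

module.exports = parseServices;
```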
[Official GitHub](https://github.com/localstack)

### The project's Docker-Compose

Creating the docker-compose file

> docker-compose.yml

```yml
version: '3'

services:
  localstack:
    container_name: 'localstack'
    image: localstack/localstack
    network_mode: bridge
    environment:
      - AWS_DEFAULT_REGION=us-east-1
      - EDGE_PORT=4566
      - SERVICES=sqs,sns,ssm,s3,apigateway,lambda
      - LAMBDA_EXECUTOR=local
      - LAMBDA_REMOTE_DOCKER=false
      - DOCKER_HOST=unix:///var/run/docker.sock
      - DEBUG=1
    ports:
      - 4566:4566
      - 4567:4567
      - 4572:4572
      - 4574:4574
      - 4575:4575
      - 4576:4576
      - 4583:4583

    volumes:
      - '${TMPDIR:-/tmp/localstack}:/tmp/localstack'
      - '/var/run/docker.sock:/var/run/docker.sock'
```

### Starting the container

To run docker-compose and create the container

```bash
  docker-compose up -d localstack
```

Finding the container ID

```bash
  docker ps
```

Tailing the container logs

```bash
  docker logs containerID -f
```

### S3 bucket shell scripts

> create-bucket.sh

```bash
# read the bucket name from the terminal
BUCKET_NAME=$1
# create the bucket
aws --endpoint-url=http://localhost:4566 s3 mb s3://$BUCKET_NAME
# list the buckets
aws --endpoint-url=http://localhost:4566 s3 ls
# usage example: 'bash scripts/s3/create-bucket.sh meu-bucket'
```

> upload-file.sh

```bash
# read the bucket name from the terminal
BUCKET_NAME=$1
# read the file path
FILE_PATH=$2

# upload the file
aws --endpoint-url=http://localhost:4566 s3 cp $FILE_PATH s3://$BUCKET_NAME
# list the bucket
aws --endpoint-url=http://localhost:4566 s3 ls
# usage example: 'bash scripts/s3/upload-file.sh meu-bucket test.txt'
```

### SQS queue shell scripts

> create-queue.sh

```bash
# read the queue name from the terminal
QUEUE_NAME=$1
# create the queue
aws --endpoint-url=http://localhost:4566 sqs create-queue --queue-name $QUEUE_NAME
# list the queues
aws --endpoint-url=http://localhost:4566 sqs list-queues
# usage example: 'bash scripts/sqs/create-queue.sh amauri'
```
> send-message.sh

```bash
# read the queue url and the message from the terminal
QUEUE_URL=$1
MESSAGE=$2
# send a message to SQS
aws --endpoint-url=http://localhost:4566 sqs send-message --queue-url $QUEUE_URL --message-body "$MESSAGE"
# receive a message from SQS
aws --endpoint-url=http://localhost:4566 sqs receive-message --queue-url $QUEUE_URL
# usage example: 'bash scripts/sqs/send-message.sh http://localhost:4566/000000000000/amauri TokenLab'
```

### Creating the Dockerfile

> Dockerfile

```dockerfile
FROM lambci/lambda:build-nodejs12.x

WORKDIR /src/

COPY package.json yarn.lock /src/

RUN yarn install

COPY . .

CMD npm start
```

> docker-compose.yml

```yml
app:
  build: .
  volumes:
    - .:/src
    - nodemodules:/src/node_modules
  restart: on-failure
  ports:
    - 3000:3000
  links:
    - localstack
  depends_on:
    - localstack
  environment:
    LOCALSTACK_HOST: localhost
    S3_PORT: 4566 # 4572
    SQS_PORT: 4566 # 4576

volumes:
  nodemodules: {}
```

### Installing the SDK

Installing aws-sdk

```bash
  yarn add aws-sdk csvtojson
```

Installing serverless-localstack

```bash
  yarn add -D serverless-localstack
```

Add serverless-localstack to serverless.yml

> serverless.yml

```yml
plugins:
  - serverless-localstack
  - serverless-mocha-plugin
  - serverless-offline
  # The documentation recommends keeping serverless-offline last

custom:
  serverless-offline:
    useChildProcesses: true
```

Building

```bash
  docker-compose up --build
```

Testing the build

```bash
  curl http://localhost:3000/dev/hello
```

> handler.js

```js
'use strict';

const AWS = require('aws-sdk');
const host = process.env.LOCALSTACK_HOST || 'localhost';
const s3Port = process.env.S3_PORT || '4566';

const s3config = {
  apiVersion: '2006-03-01',
  s3ForcePathStyle: true,
  endpoint: new AWS.Endpoint(`http://${host}:${s3Port}`),
};
const S3 = new AWS.S3(s3config);
module.exports.hello = async event => {
  return {
    statusCode: 200,
    body: JSON.stringify(
      {
        message: 'Go Serverless v1.0!',
        input: event,
      },
      null,
      2
    ),
  };
};
```
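Localstack serves every queue under the dummy AWS account `000000000000`, as in the `send-message.sh` usage example. A hypothetical helper (`queueUrl` is not part of any SDK) shows how those URLs are built:

```javascript
// Hypothetical helper: build a Localstack SQS queue URL.
// Localstack uses the dummy AWS account id 000000000000 for local queues.
function queueUrl(queueName, { host = 'localhost', port = '4566' } = {}) {
  return `http://${host}:${port}/000000000000/${queueName}`;
}

module.exports = queueUrl;
```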
#### Adding a stage to the config

> serverless.yml

```yml
custom:
  sqsArn:
    Fn::GetAtt:
      - SQSQueue
      - Arn

  localstack:
    stage:
      - local
    autostart: false

  serverless-offline:
    useChildProcesses: true

provider:
  name: aws
  runtime: nodejs12.x
  lambdaHashingVersion: 20201221
  stage: ${opt:stage, 'dev'}
  environment:
    BUCKET_NAME: amauri-oliveira-tokenlab
    SQS_QUEUE: file-queue
```

Creating a resource

> serverless.yml

```yml
resources:
  Resources:
    SQSQueue: # the name is up to you
      Type: AWS::SQS::Queue
      Properties:
        QueueName: ${self:provider.environment.SQS_QUEUE}
        # never set this to 0 or you get an infinite loop; it is how long
        # the queue waits for the lambda's response
        VisibilityTimeout: 60
```

Packaging each lambda individually

> serverless.yml

```yml
package:
  individually: true
  excludeDevDependencies: true
```

Creating more lambdas

> serverless.yml

```yml
s3Listener:
  handler: src/index.s3Listener
  events:
    - s3:
        bucket: ${self:provider.environment.BUCKET_NAME}
        event: s3:ObjectCreated:*
        rules:
          - suffix: .csv

sqsListener:
  handler: src/index.sqsListener
  events:
    - sqs:
        batchSize: 1
        arn: ${self:custom.sqsArn}

sqsClean:
  handler: src/index.sqsClean
  events:
    - http:
        method: get
        path: clean
```

> src/index.js

```js
module.exports = {
  s3Listener: require('./s3.listener'),
  sqsListener: require('./sqs.listener'),
  sqsClean: require('./sqs.clear'),
};
```
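The handlers read the `environment` block above through `process.env`. A tiny hypothetical helper (`loadConfig` is not in the repo; the fallback values are illustrative) makes the pattern explicit:

```javascript
// The lambdas read the serverless.yml `environment` block via process.env.
// The fallback values here are illustrative defaults for local runs.
function loadConfig(env = process.env) {
  return {
    bucketName: env.BUCKET_NAME || 'amauri-oliveira-tokenlab',
    queueName: env.SQS_QUEUE || 'file-queue',
  };
}

module.exports = loadConfig;
```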
> base of the lambdas

```js
class Handler {
  async main(event) {
    try {
      return {
        statusCode: 200,
        body: 'Hello',
      };
    } catch (error) {
      console.log('ERROR ', error.stack);
      return {
        statusCode: 500,
        body: 'Internal Error',
      };
    }
  }
}
const handler = new Handler();
module.exports = handler.main.bind(handler);
```

> serverless.yml

```yml
iamRoleStatements:
  - Effect: Allow
    Action:
      - sqs:SendMessage
      - sqs:GetQueueUrl
      - sqs:CreateQueue
      - sqs:ReceiveMessage
    Resource: ${self:custom.sqsArn}
  - Effect: Allow
    Action:
      - s3:*
    Resource:
      - arn:aws:s3:::${self:provider.environment.BUCKET_NAME}/*
      - arn:aws:s3:::${self:provider.environment.BUCKET_NAME}
```

Testing the lambdas

```bash
  sls invoke local -f sqsListener
```

Creating the CSV

> test.csv

```csv
name,age
Huguinho,12
Zezinho,14
Luisinho,13
```

Working with the CSV

```bash
# create the bucket
  bash scripts/s3/create-bucket.sh amauri-oliveira-tokenlab
# create the sqs queue
  bash scripts/sqs/create-queue.sh file-queue
# upload the file to s3
  bash scripts/s3/upload-file.sh amauri-oliveira-tokenlab scripts/s3/test.csv
# read the csv from s3
  yarn invoke-local:s3
# run a mocked trigger to call the queue
  yarn invoke-local:sqs
# read and remove messages from the queue
  yarn invoke-local:sqs-clear
```

#### s3.listener.js

> s3.listener.js

```js
const AWS = require('aws-sdk');
const { Writable, pipeline } = require('stream');
const csvtojson = require('csvtojson');

class Handler {
  constructor({ s3Svc, sqsSvc }) {
    this.s3Svc = s3Svc;
    this.sqsSvc = sqsSvc;
    this.queueName = process.env.SQS_QUEUE;
  }

  static getSdks() {
    const host = process.env.LOCALSTACK_HOST || 'localhost';
    const s3port = process.env.S3_PORT || '4566';
    const sqsPort = process.env.SQS_PORT || '4566';
    const isLocal = process.env.IS_LOCAL;
    const s3endpoint = new AWS.Endpoint(`http://${host}:${s3port}`);
    const s3config = {
      endpoint: s3endpoint,
      s3ForcePathStyle: true,
    };
    const sqsEndpoint = new AWS.Endpoint(`http://${host}:${sqsPort}`);
    const sqsConfig = {
      endpoint: sqsEndpoint,
    };

    if (!isLocal) {
      delete s3config.endpoint;
      delete sqsConfig.endpoint;
    }
    return {
      s3: new AWS.S3(s3config),
      sqs: new AWS.SQS(sqsConfig),
    };
  }

  async getQueueUrl() {
    const { QueueUrl } = await this.sqsSvc
      .getQueueUrl({
        QueueName: this.queueName,
      })
      .promise();

    return QueueUrl;
  }

  processDataOnDemand(queueUrl) {
    const writableStream = new Writable({
      write: (chunk, encoding, done) => {
        const item = chunk.toString();
        console.log('sending..', item, 'at', new Date().toISOString());
        this.sqsSvc.sendMessage(
          {
            QueueUrl: queueUrl,
            MessageBody: item,
          },
          done
        );
      },
    });
    return writableStream;
  }

  async pipefyStreams(...args) {
    return new Promise((resolve, reject) => {
      pipeline(...args, error => (error ? reject(error) : resolve()));
    });
  }

  async main(event) {
    const [
      {
        s3: {
          bucket: { name },
          object: { key },
        },
      },
    ] = event.Records;

    console.log('processing.', name, key);

    try {
      console.log('getting queueURL...');
      const queueUrl = await this.getQueueUrl();

      const params = {
        Bucket: name,
        Key: key,
      };

      await this.pipefyStreams(
        this.s3Svc.getObject(params).createReadStream(),
        csvtojson(),
        this.processDataOnDemand(queueUrl)
      );
      console.log('process finished...', new Date().toISOString());

      return {
        statusCode: 200,
        body: 'Process finished with success!',
      };
    } catch (error) {
      console.log('ERROR ', error.stack);
      return {
        statusCode: 500,
        body: 'Internal Error',
      };
    }
  }
}

const { s3, sqs } = Handler.getSdks();
const handler = new Handler({
  sqsSvc: sqs,
  s3Svc: s3,
});
module.exports = handler.main.bind(handler);
```

#### sqs.listener.js

> sqs.listener.js

```js
class Handler {
  async main(event) {
    const [{ body, messageId }] = event.Records;
    const item = JSON.parse(body);
    console.log(
      '***event',
      JSON.stringify(
        {
          ...item,
          messageId,
          at: new Date().toISOString(),
        },
        null,
        2
      )
    );
    try {
      return {
        statusCode: 200,
        body: 'Hello',
      };
    } catch (error) {
      console.log('***error', error.stack);
      return {
        statusCode: 500,
        body: 'Internal Error',
      };
    }
  }
}
const handler = new Handler();
module.exports = handler.main.bind(handler);
```

#### sqs.clear.js

> sqs.clear.js

```js
const AWS = require('aws-sdk');

AWS.config.update({ region: 'us-east-1' });
// point the SDK at Localstack's edge port
const sqs = new AWS.SQS({ endpoint: 'http://localhost:4566' });
const QueueUrl = 'http://localhost:4566/000000000000/file-queue';

const receiveParams = {
  QueueUrl,
  MaxNumberOfMessages: 1,
};

class Handler {
  async receive() {
    try {
      const queueData = await sqs.receiveMessage(receiveParams).promise();
      if (queueData && queueData.Messages && queueData.Messages.length > 0) {
        const [firstMessage] = queueData.Messages;
        console.log('RECEIVED: ', firstMessage);
        const deleteParams = {
          QueueUrl,
          ReceiptHandle: firstMessage.ReceiptHandle,
        };
        await sqs.deleteMessage(deleteParams).promise();
      } else {
        console.log('Empty queue');
      }
    } catch (e) {
      console.log('ERROR: ', e);
    }
  }

  async main(event) {
    try {
      await this.receive();
      return {
        statusCode: 200,
        body: 'OK',
      };
    } catch (error) {
      console.log('***error', error.stack);
      return {
        statusCode: 500,
        body: 'Internal Error',
      };
    }
  }
}
const handler = new Handler();
module.exports = handler.main.bind(handler);
```
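The `invoke-local:*` scripts feed mock event files to the handlers. Sketches of the shapes those files must have, matching the fields the handlers destructure (the concrete values are illustrative):

```javascript
// Illustrative mock events matching what the handlers destructure.
// mocks/s3-insert.json must provide Records[0].s3.bucket.name and object.key;
// mocks/sqs-event.json must provide Records[0].body (a JSON string) and messageId.
const s3InsertMock = {
  Records: [
    {
      s3: {
        bucket: { name: 'amauri-oliveira-tokenlab' },
        object: { key: 'test.csv' },
      },
    },
  ],
};

const sqsEventMock = {
  Records: [
    {
      messageId: 'local-message-id',
      body: JSON.stringify({ name: 'Huguinho', age: '12' }),
    },
  ],
};

module.exports = { s3InsertMock, sqsEventMock };
```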
Invoking the lambdas

```bash
# the lambda reads S3 and puts the messages on the queue
  yarn invoke-local:s3
# the lambda reads the queue and prints to the screen
  yarn invoke-local:sqs
```

## Other examples

### Examples with Lambda

```bash
# create the logs folder
  mkdir logs
# create a file with content and zip it
  zip function.zip index.js
# create the lambda in Localstack
  aws --endpoint-url=http://localhost:4566 lambda create-function --function-name hello-cli --zip-file fileb://function.zip  --handler index.handler --role arn:aws:iam::123456:role/irrelevant --runtime nodejs12.x | tee logs/lambda-create.log
# invoke the lambda
  aws --endpoint-url=http://localhost:4566 lambda invoke --function-name hello-cli --log-type Tail logs/lambda-exec.log
# upgrade
# zip
  zip function.zip index.js
# upload
  aws --endpoint-url=http://localhost:4566 lambda update-function-code --zip-file fileb://function.zip --function-name hello-cli --publish | tee logs/lambda-update.log
# remove the lambda
  aws --endpoint-url=http://localhost:4566 lambda delete-function --function-name hello-cli
```
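The same `create-function` call can be made from aws-sdk v2; a sketch of the parameter object, mirroring the CLI flags above (building it is pure; actually sending it requires `aws-sdk` and a running Localstack):

```javascript
// Parameter object for AWS SDK v2 `lambda.createFunction`, mirroring the
// CLI flags above. To send it for real:
//   new AWS.Lambda({ endpoint: 'http://localhost:4566' }).createFunction(params)
function createFunctionParams(zipBuffer) {
  return {
    FunctionName: 'hello-cli',
    Runtime: 'nodejs12.x',
    Handler: 'index.handler',
    Role: 'arn:aws:iam::123456:role/irrelevant', // Localstack does not validate this
    Code: { ZipFile: zipBuffer },
  };
}

module.exports = createFunctionParams;
```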
### SNS

```bash
# create an SNS topic
  aws sns create-topic --name local_sns --endpoint-url=http://localhost:4566
# subscribe the SQS queue to the SNS topic
  aws --endpoint-url=http://localhost:4566 sns subscribe --topic-arn arn:aws:sns:us-east-1:000000000000:local_sns --protocol sqs --notification-endpoint http://localhost:4566/queue/local_queue
# send message.txt to the SNS topic
  aws --endpoint-url=http://localhost:4566 sns publish --topic-arn arn:aws:sns:us-east-1:000000000000:local_sns --message file://message.txt
```

### SQS

```bash
# create the SQS queue
  aws sqs create-queue --endpoint-url=http://localhost:4566 --queue-name local_queue;
# send a message to SQS
  aws --endpoint-url=http://localhost:4566 sqs send-message --queue-url http://localhost:4566/000000000000/local_queue --message-body "Test message"
# receive a message from SQS
  aws --endpoint-url=http://localhost:4566 sqs receive-message --queue-url http://localhost:4566/000000000000/local_queue
```

### Cleaning up Docker

```bash
# shut down everything created with docker-compose
 docker-compose down
# list all containers, running or not
 docker ps -a
# remove a container by name or ID
 docker rm -f ID_or_NAME
# remove every container on your machine
 docker rm -f $(docker ps -a -q)
# list all volumes
 docker volume ls
# delete a volume by name
 docker volume rm ID_or_NAME
# remove all volumes
 docker volume rm $(docker volume ls -q)
```

### Base handler

```js
class Handler {
  async main(event) {
    try {
      return {
        statusCode: 200,
        body: 'Hello',
      };
    } catch (error) {
      console.log('ERROR ', error.stack);
      return {
        statusCode: 500,
        body: 'Internal Error',
      };
    }
  }
}
const handler = new Handler();
module.exports = handler.main.bind(handler);
```
### Recommended

[BLOG](https://danieldcs.com/simulando-aws-local-com-localstack-e-node-js/)

[VIDEO](https://www.youtube.com/watch?v=fIG8Wc_zg0w)

### Acknowledgements

Gustavo Valim

### Speaker

Amauri Oliveira