{"id":13907459,"url":"https://github.com/adhorn/logtoes","last_synced_at":"2025-07-18T05:32:04.581Z","repository":{"id":22341879,"uuid":"92207489","full_name":"adhorn/logtoes","owner":"adhorn","description":"Demo of Asynchronous pattern (worker) using Python Flask \u0026 Celery ","archived":false,"fork":false,"pushed_at":"2023-02-15T21:34:32.000Z","size":12600,"stargazers_count":48,"open_issues_count":7,"forks_count":4,"subscribers_count":6,"default_branch":"master","last_synced_at":"2024-08-07T23:51:59.939Z","etag":null,"topics":["asynchronous","aws","celery","elasticsearch","flask","flask-api","gunicorn","python","redis","worker","worker-service"],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/adhorn.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-05-23T18:41:14.000Z","updated_at":"2023-06-15T18:41:35.000Z","dependencies_parsed_at":"2023-01-12T08:30:35.696Z","dependency_job_id":null,"html_url":"https://github.com/adhorn/logtoes","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/adhorn%2Flogtoes","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/adhorn%2Flogtoes/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/adhorn%2Flogtoes/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/adhorn%2Flogtoes/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/adhorn","download_url":"https://codeload.github.com/adhorn/logtoes/tar.gz/refs/heads
/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":226353722,"owners_count":17611752,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["asynchronous","aws","celery","elasticsearch","flask","flask-api","gunicorn","python","redis","worker","worker-service"],"created_at":"2024-08-06T23:01:56.990Z","updated_at":"2024-11-25T15:32:04.421Z","avatar_url":"https://github.com/adhorn.png","language":"Python","readme":"\n\n**LogToES**:\n\nSimple demo of the asynchronous worker pattern using Flask and Celery.\nIt demonstrates the use of a Python decorator to send API logs to Elasticsearch in real time for analysis.\n\n![Logs to ES](https://github.com/adhorn/logtoes/blob/master/pics/demo0.png)\n\n\n**Why this demo?**:\n\nWhile Flask + Celery code is very common on the internet, I could not find a ready-to-use example that combined all the bells and whistles necessary to run the asynchronous pattern in production. 
This code gives you just that (hopefully).\nThis demo also uses Gunicorn to serve the Flask application.\n\n\n**Asynchronous Pattern on AWS**:\n\n![Architecture](https://github.com/adhorn/logtoes/blob/master/pics/demo1.png)\n\n![How it works (part1)](https://github.com/adhorn/logtoes/blob/master/pics/demo2.png)\n\n![How it works (part2)](https://github.com/adhorn/logtoes/blob/master/pics/demo3.png)\n\n\n**What is Flask?**:\nFlask is a fun and easy-to-use microframework for Python based on Werkzeug.\nIt is easy to set up and use, and has a large community, lots of examples, etc.:\n\n```\nfrom flask import Flask\napp = Flask(__name__)\n\n@app.route(\"/\")\ndef hello():\n    return \"Hello World!\"\n\nif __name__ == \"__main__\":\n    app.run()\n```\n\n```\n$ python hello.py\n * Running on http://localhost:5000/\n```\n\n**What is Gunicorn?**\nGunicorn 'Green Unicorn' is a Python WSGI HTTP Server for UNIX. It uses a pre-fork worker model. The Gunicorn server is broadly compatible with various web frameworks, simply implemented, light on server resources, and fairly speedy.\n\n\n**What is Celery?**\nCelery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.\nThe execution units, called tasks, are executed concurrently on one or more worker servers using multiprocessing, Eventlet, or gevent. Tasks can execute asynchronously (in the background) or synchronously (wait until ready).\nCelery is used in production systems to process millions of tasks a day.\n\n**What is Amazon Elasticsearch Service?** Amazon Elasticsearch Service makes it easy to deploy, operate, and scale Elasticsearch for log analytics, full-text search, application monitoring, and more. It is a fully managed service that delivers Elasticsearch’s easy-to-use APIs and real-time capabilities along with the availability, scalability, and security required by production workloads. 
The service offers built-in integrations with Kibana, Logstash, and AWS services including Amazon Kinesis Firehose, AWS Lambda, and Amazon CloudWatch so that you can go from raw data to actionable insights quickly.\n\n\n**Prerequisites:**\n\nYou will need Redis and Elasticsearch running. Replace the respective URLs of Redis and Elasticsearch in the default_settings.py file.\n\n**How to run the code:**\n\nCreate a virtualenv and install the requirements.\n\n```\n$ virtualenv ~/.virtualenvs/logtoes \u0026\u0026 source ~/.virtualenvs/logtoes/bin/activate\n$ pip install -r requirements.txt\n```\n\nLaunch two terminal sessions, since you need to run both Gunicorn and Celery.\n\nIn the first terminal (make sure to activate your virtualenv logtoes), launch the Celery workers:\n\n```\n$ celery -A start_celery worker -l debug -P gevent\n```\n\nIn the second terminal (make sure to activate your virtualenv logtoes), launch the Gunicorn server:\n\n```\n$ gunicorn -w 1 -b 0.0.0.0:5555 -k gevent logtoes.logtoes:app\n* Test the API on http://0.0.0.0:5555/api/echo\n```\n\nThis should respond:\n```\n{\"Application Status\": \"Surprising, but I am up and running!\"}\n```\n\n\nYou can now start a third terminal to launch Flower, a tool to visualise your tasks:\n\n```\n$ celery -A start_celery flower --port=4444\n* running on http://localhost:4444/tasks\n```\n\nConnect to the Kibana endpoint found in the Amazon Elasticsearch Service console and create an index in Settings:\n  \n ```\n Index search pattern: logtoes\n Select datetime\n ```\n\n Visualise - Enjoy :)\n\n![Kibana](https://github.com/adhorn/logtoes/blob/master/pics/demo4.png)\n\n**Note:**\n\nThis demo uses Geocity to enrich the log information with location (converting IP addresses to countries). If you want to update the data, do the following:\n\n```\ncd logtoes/geocity \u0026\u0026 { curl -O http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz ; cd -; } \ncd logtoes/geocity \u0026\u0026 { gunzip GeoLiteCity.dat.gz ; cd -; 
}\n```\n\n\n\n\n\n\n","funding_links":[],"categories":["HarmonyOS"],"sub_categories":["Windows Manager"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fadhorn%2Flogtoes","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fadhorn%2Flogtoes","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fadhorn%2Flogtoes/lists"}