{"id":13562271,"url":"https://github.com/ScriptSmith/daq","last_synced_at":"2025-04-03T18:33:10.745Z","repository":{"id":97265747,"uuid":"447377559","full_name":"ScriptSmith/daq","owner":"ScriptSmith","description":"DIY AirGradient Air Quality Monitor with AWS IoT Core and Amazon Timestream","archived":false,"fork":false,"pushed_at":"2024-07-19T12:21:41.000Z","size":576,"stargazers_count":4,"open_issues_count":0,"forks_count":1,"subscribers_count":5,"default_branch":"main","last_synced_at":"2025-04-03T10:22:16.787Z","etag":null,"topics":["air-quality","airgradient","arduino","aws","aws-iot-core","cloudformation","esp8266","grafana","iot","timestream"],"latest_commit_sha":null,"homepage":"","language":"C++","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ScriptSmith.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2022-01-12T21:35:18.000Z","updated_at":"2025-03-31T12:28:54.000Z","dependencies_parsed_at":"2024-10-27T21:45:02.662Z","dependency_job_id":"3a770d8d-d283-4113-8d78-81680d6c5e48","html_url":"https://github.com/ScriptSmith/daq","commit_stats":{"total_commits":12,"total_committers":1,"mean_commits":12.0,"dds":0.0,"last_synced_commit":"77815849efe4764b347bc811e92a3672b1c2b9dc"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ScriptSmith%2Fdaq","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ScriptSmith%2Fdaq/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ScriptSmi
th%2Fdaq/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ScriptSmith%2Fdaq/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ScriptSmith","download_url":"https://codeload.github.com/ScriptSmith/daq/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247056977,"owners_count":20876489,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["air-quality","airgradient","arduino","aws","aws-iot-core","cloudformation","esp8266","grafana","iot","timestream"],"created_at":"2024-08-01T13:01:06.542Z","updated_at":"2025-04-03T18:33:05.728Z","avatar_url":"https://github.com/ScriptSmith.png","language":"C++","readme":"# daq - DIY Air Quality\n\nDIY Air Quality Monitor by [AirGradient](https://www.airgradient.com/diy/) integrated with [AWS Iot Core](https://aws.amazon.com/iot-core/) \u0026 [Amazon Timestream](https://aws.amazon.com/timestream/), for display in [Grafana](https://grafana.com/oss/grafana/).\n\nRecords \u0026 visualises carbon dioxide, particulate matter, temperature, and humidity.\n\n![grafana dashboard](assets/grafana_dashboard.png)\n\n![pcb \u0026 components](assets/device.jpg)\n\n\n**Sections:**\n\n1. [Estimated cost](#estimated-cost)\n2. [Assembly](#assembly)\n3. [Uploading the custom client](#uploading-the-custom-client)\n4. [AWS setup](#aws-setup)\n5. [Connect client to AWS](#connect-client-to-aws)\n6. 
## Estimated cost

> [!IMPORTANT]
> **July 2024 Update**: AWS will now charge for a 100GB minimum of magnetic storage
>
> _Effective July 10, 2024, the service will have a new minimum requirement for the magnetic store usage. All accounts using the service are subject to a 100GB storage minimum, equivalent to $3/month (in US-EAST-1), of the magnetic store. Refer to the pricing page for the prevailing rates of magnetic store usage in your AWS region. If the magnetic store usage in your account exceeds 100 GB, there will be no change to your billing._
>
> You can export your data from Timestream to S3 using the new [S3 UNLOAD statement](https://docs.aws.amazon.com/timestream/latest/developerguide/export-unload-concepts.html)

*As of January 2022*

The [DIY kit](https://www.airgradient.com/diy/) can be purchased from [AirGradient's shop](https://www.airgradient.com/diyshop/) for between US$46 and US$60 + shipping.

AWS IoT Core + Amazon Timestream costs are _estimated_ to be approximately the following:

| Service           | Link                                                              | First month | With 10 years of data |
|-------------------|-------------------------------------------------------------------|-------------|-----------------------|
| AWS IoT Core      | [Pricing](https://aws.amazon.com/iot-core/pricing/)               | US$0.08     | US$0.08               |
| Amazon Timestream | [Pricing](https://aws.amazon.com/timestream/pricing/?nc=sn&loc=3) | US$0.22     | US$0.45               |

Assumptions:
- Running in `us-east-2`
- Limited querying scope and frequency
- Sending data using [Basic Ingest](https://docs.aws.amazon.com/iot/latest/developerguide/iot-basic-ingest.html)
- 24h in-memory retention

Notes:

- Costs will vary depending on usage, and prices are subject to change
- You may benefit from using [scheduled queries](https://docs.aws.amazon.com/timestream/latest/developerguide/scheduledqueries.html) depending on your usage patterns
- There is a 10MB (US$0.0001) minimum charge per query
- In the default configuration, a little more than 2MB is added to Timestream each day
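The long-run storage figure is easy to sanity-check against the July 2024 minimum. A back-of-envelope sketch, using the ~2MB/day ingest figure from the notes above (the exact daily rate is an assumption):

```python
# Rough Timestream magnetic-store estimate, based on the ~2 MB/day
# ingest rate noted above (assumed constant for the whole period).
MB_PER_DAY = 2.1   # "a little more than 2MB" per day
YEARS = 10

total_gb = MB_PER_DAY * 365 * YEARS / 1024
print(f"~{total_gb:.1f} GB of magnetic storage after {YEARS} years")

# Even after 10 years this is well under the 100 GB minimum, so the
# magnetic store is billed at the minimum (about US$3/month in us-east-1).
assert total_gb < 100
```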
## Assembly

### Hardware:
- WEMOS D1 mini (ESP8266)
- WEMOS D1 mini display (OLED 0.66 Shield)
- Plantower PMS5003
- Senseair S8
- SHT31

Follow the instructions on [https://www.airgradient.com/diy/](https://www.airgradient.com/diy/).

See also Jeff Geerling's video for tips:

[![Your home's air could be making you sick. Fight back!](https://img.youtube.com/vi/Cmr5VNALRAg/0.jpg)](https://youtu.be/Cmr5VNALRAg?t=173)

## Uploading the custom client

[AirGradient's client](https://www.airgradient.com/diy/#flashing-of-the-d1-mini-with-the-airgradient-firmware) can be uploaded using the Arduino IDE.

This custom client instead uses the [PlatformIO framework](https://docs.platformio.org/) to manage the toolchain for building and deployment.

In the `client` directory:

### Build

```
pio run
```

### Upload

```
pio run -t upload
```

The OLED screen should begin displaying information.

### Listen to client serial output

```
pio device monitor
```
## AWS Setup

Please note that Amazon Timestream is only supported in a few AWS regions. See the [pricing page](https://aws.amazon.com/timestream/pricing/?nc=sn&loc=3) for details.

All the infrastructure can be deployed using AWS CloudFormation and the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html) or the [AWS console](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/cfn-console-create-stack.html).

Configure environment variables:

```
export AWS_REGION=us-east-2 # ap-southeast-2 when :(
export STACK_NAME=daq
```

Create the [client certificates](https://docs.aws.amazon.com/iot/latest/developerguide/device-certs-create.html) used to connect to AWS IoT Core:

```
aws iot create-keys-and-certificate \
    --certificate-pem-outfile "daq.cert.pem" \
    --public-key-outfile "daq.public.key" \
    --private-key-outfile "daq.private.key" > cert_outputs.json
```

Deploy the CloudFormation stack:

```
aws cloudformation deploy \
    --template-file cloudformation.yaml \
    --capabilities CAPABILITY_IAM \
    --stack-name $STACK_NAME \
    --parameter-overrides MemoryRetentionHours=24 MagneticRetentionDays=3650
```

```
aws cloudformation describe-stacks \
    --stack-name $STACK_NAME \
    --query 'Stacks[0].Outputs' > stack_outputs.json
```

Once the stack has been successfully deployed, use the values in the two output files to configure the certificate:

```
cat cert_outputs.json stack_outputs.json
```

Mark the certificate as active:

```
aws iot update-certificate --certificate-id {CERTIFICATE_ID} --new-status ACTIVE
```

Attach the policy:

```
aws iot attach-policy --policy-name {POLICY_NAME} --target {CERTIFICATE_ARN}
```

Attach the Thing:

```
aws iot attach-thing-principal --thing-name {THING_NAME} --principal {CERTIFICATE_ARN}
```
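A small helper can pull the placeholder values out of the two output files instead of copying them by hand. `certificateId` and `certificateArn` are the AWS CLI's actual field names in the `create-keys-and-certificate` response; the `OutputKey` names (`PolicyName`, `ThingName`) are illustrative assumptions, so check your own `stack_outputs.json` for what the template exports:

```python
import json


def extract_iot_config(cert_outputs: dict, stack_outputs: list) -> dict:
    """Collect the values needed by the update-certificate / attach-* commands.

    cert_outputs:  parsed cert_outputs.json (create-keys-and-certificate response)
    stack_outputs: parsed stack_outputs.json (a list of {"OutputKey", "OutputValue"})
    """
    outputs = {o["OutputKey"]: o["OutputValue"] for o in stack_outputs}
    return {
        "CERTIFICATE_ID": cert_outputs["certificateId"],
        "CERTIFICATE_ARN": cert_outputs["certificateArn"],
        # Illustrative OutputKey names -- verify against your stack's outputs.
        "POLICY_NAME": outputs.get("PolicyName"),
        "THING_NAME": outputs.get("ThingName"),
    }
```

Load both files with `json.load` and print the resulting dict to get the substitutions for the commands above.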
## Connect client to AWS

In `client/include/config.h`, update the following:

1. Set `THING_NAME` to the name of the created Thing (e.g. `daq-Thing-L5KRHBFJORP4`)
2. Set `ROOM` to a human-readable identifier for the room the client is housed in
3. Set `ENABLE_WIFI` to `true`
4. Set `TIME_ZONE` to your timezone
5. Set `AWS_TOPIC` to the created topic rule name with the included prefix (e.g. `$aws/rules/TopicRule_0yKDHTposnt5`)
6. Set `AWS_IOT_ENDPOINT` to your AWS IoT Core endpoint (`aws iot describe-endpoint --endpoint-type iot:Data-ATS`)
7. Set `AWS_CERT_CA` to the contents of `AmazonRootCA1.pem` ([link](https://www.amazontrust.com/repository/AmazonRootCA1.pem))
8. Set `AWS_CERT_DEVICE` to the contents of the created `daq.cert.pem` file
9. Set `AWS_CERT_PRIVATE` to the contents of the created `daq.private.key` file

Then upload the changes:

```
cd client
pio run -t upload
```

Connect to the Wi-Fi network beginning with `DAQ-`, configure the network connection, and reset the device.

See [WiFiManager](https://github.com/tzapu/WiFiManager) for more details.
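For reference, each reading the client publishes is a single flat JSON object. The field names below are taken from the example serial output in this README; the helper itself is only an illustrative sketch (the real client is C++ firmware):

```python
import json


def build_payload(device_id: str, room: str, wifi_rssi: int,
                  pm2: int, co2: int, tmp: float, hmd: int) -> str:
    """Serialise one sensor reading in the shape the client publishes."""
    return json.dumps({
        "device_id": device_id,  # identifier for this device
        "room": room,            # the ROOM value from config.h
        "wifi": wifi_rssi,       # Wi-Fi signal strength in dBm
        "pm2": pm2,              # particulate matter, via the PMS5003
        "co2": co2,              # CO2 ppm, via the Senseair S8
        "tmp": tmp,              # temperature in degC, via the SHT31
        "hmd": hmd,              # relative humidity %, via the SHT31
    }, separators=(",", ":"))
```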
Once the configuration is complete, confirm that the device is sending data to AWS:

```
$ pio device monitor
*wm:[1] AutoConnect
*wm:[2] Connecting as wifi client...
*wm:[2] setSTAConfig static ip not set, skipping
*wm:[1] connectTimeout not set, ESP waitForConnectResult...
*wm:[2] Connection result: WL_CONNECTED
*wm:[1] AutoConnect: SUCCESS
Connecting to server
Connected!
{"device_id":"abb44a","room":"office1","wifi":-71,"pm2":0,"co2":733,"tmp":25.60000038,"hmd":53}
{"device_id":"abb44a","room":"office1","wifi":-72,"pm2":0,"co2":735,"tmp":25.60000038,"hmd":53}
{"device_id":"abb44a","room":"office1","wifi":-72,"pm2":0,"co2":739,"tmp":25.60000038,"hmd":53}
```

```
$ aws timestream-query query --query-string 'SELECT * FROM "{DATABASE_NAME}"."{TABLE_NAME}" order by time desc LIMIT 10'
{
    "Rows": [...],
    "ColumnInfo": [
        {
            "Name": "device_id",
            "Type": {
                "ScalarType": "VARCHAR"
            }
        },
        {
            "Name": "room",
            "Type": {
                "ScalarType": "VARCHAR"
            }
        },
        {
            "Name": "measure_name",
            "Type": {
                "ScalarType": "VARCHAR"
            }
        },
        {
            "Name": "time",
            "Type": {
                "ScalarType": "TIMESTAMP"
            }
        },
        {
            "Name": "measure_value::bigint",
            "Type": {
                "ScalarType": "BIGINT"
            }
        },
        {
            "Name": "measure_value::double",
            "Type": {
                "ScalarType": "DOUBLE"
            }
        }
    ],
    "QueryStatus": {
        "ProgressPercentage": 100.0,
        "CumulativeBytesScanned": 1870735,
        "CumulativeBytesMetered": 10000000
    }
}
```
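The CLI response pairs each entry in `ColumnInfo` with the positional values in each row's `Data` array (the `Data`/`ScalarValue` shape is the Timestream Query API's row format; the sample rows in the test are illustrative). A small sketch of flattening that into dicts:

```python
def rows_to_dicts(column_info: list, rows: list) -> list:
    """Zip Timestream ColumnInfo names with each row's scalar values.

    Timestream returns every scalar as a string; callers convert types
    themselves based on the ScalarType in ColumnInfo.
    """
    names = [c["Name"] for c in column_info]
    return [
        {name: datum.get("ScalarValue") for name, datum in zip(names, row["Data"])}
        for row in rows
    ]
```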
## Grafana setup

See the installation instructions for [Grafana](https://grafana.com/grafana/).

Add an [Amazon Timestream Data Source](https://grafana.com/grafana/plugins/grafana-timestream-datasource/):
1. In `Configuration` -> `Plugins`, install the `Amazon Timestream` plugin
2. In `Configuration` -> `Data Sources`, click `Add data source` -> `Amazon Timestream`
3. Pick your preferred method of creating AWS credentials:
   1. Create an IAM user and attach the `AmazonTimestreamReadOnlyAccess` policy
      ```
      aws iam create-user --user-name GrafanaTimestreamQuery
      aws iam attach-user-policy --user-name GrafanaTimestreamQuery --policy-arn "arn:aws:iam::aws:policy/AmazonTimestreamReadOnlyAccess"
      aws iam create-access-key --user-name GrafanaTimestreamQuery
      ```
   2. Use your root credentials and, in the `Assume Role ARN` field, enter the ARN of the `QueryRole` role from `stack_outputs.json`
   3. Use your root credentials
4. In the `Authentication Provider` field, choose either:
   1. `Access & secret key`, and enter the keys directly
   2. `AWS SDK Default` or `Credentials file`, and configure your AWS credentials through the filesystem or environment variables
5. In the `Default Region` field, enter the AWS region
6. Click `Save & test`
Import the default dashboard:

1. In `Create` -> `Import`, click `Upload JSON file` and select `dashboard.json`
2. Change the value of `Amazon Timestream` to your Amazon Timestream Data Source
3. Change the value of `Database` to the name of your database
4. Change the value of `Table` to the name of your table
5. Click `Import`

![grafana dashboard](assets/grafana_dashboard.png)

## Acknowledgements

Thanks to AirGradient for creating this awesome DIY kit, providing the example code, and the Gerber + STL files.

Thanks to Jeff Geerling for the video and commentary.