{"id":15192444,"url":"https://github.com/itzmeanjan/ette","last_synced_at":"2025-10-02T07:30:53.833Z","repository":{"id":37882114,"uuid":"306397377","full_name":"itzmeanjan/ette","owner":"itzmeanjan","description":"EVM-based Blockchain Indexer, with historical data query \u0026 real-time notification support 😎","archived":true,"fork":false,"pushed_at":"2023-01-13T12:48:00.000Z","size":47943,"stargazers_count":267,"open_issues_count":0,"forks_count":76,"subscribers_count":13,"default_branch":"main","last_synced_at":"2025-01-21T12:35:59.041Z","etag":null,"topics":["blockchain","blockchain-data","blockchain-events","blockchain-explorer","ethereum","ethereum-blockchain-analyser","evm","graphql-api","notification","realtime-notification","realtime-tracking","websocket"],"latest_commit_sha":null,"homepage":"https://itzmeanjan.github.io/ette/","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"cc0-1.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/itzmeanjan.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2020-10-22T16:31:02.000Z","updated_at":"2025-01-10T20:54:19.000Z","dependencies_parsed_at":"2023-02-09T15:45:41.347Z","dependency_job_id":null,"html_url":"https://github.com/itzmeanjan/ette","commit_stats":null,"previous_names":[],"tags_count":13,"template":false,"template_full_name":null,"purl":"pkg:github/itzmeanjan/ette","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/itzmeanjan%2Fette","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/itzmeanjan%2Fette/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/itzmeanjan%2Fette/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/host
s/GitHub/repositories/itzmeanjan%2Fette/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/itzmeanjan","download_url":"https://codeload.github.com/itzmeanjan/ette/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/itzmeanjan%2Fette/sbom","scorecard":{"id":497765,"data":{"date":"2025-08-11","repo":{"name":"github.com/itzmeanjan/ette","commit":"ac1a0942a3edd95b4ad130dd0be3553523d5be65"},"scorecard":{"version":"v5.2.1-40-gf6ed084d","commit":"f6ed084d17c9236477efd66e5b258b9d4cc7b389"},"score":2.9,"checks":[{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#packaging"}},{"name":"Dangerous-Workflow","score":10,"reason":"no dangerous workflow patterns detected","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#dangerous-workflow"}},{"name":"Token-Permissions","score":0,"reason":"detected GitHub workflow tokens with excessive permissions","details":["Warn: no topLevel permission defined: .github/workflows/go.yml:1","Info: no jobLevel write permissions found"],"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#token-permissions"}},{"name":"Maintained","score":0,"reason":"project is archived","details":["Warn: Repository is archived."],"documentation":{"short":"Determines if the project is \"actively 
maintained\".","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#maintained"}},{"name":"Code-Review","score":4,"reason":"Found 2/5 approved changesets -- score normalized to 4","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#code-review"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#cii-best-practices"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source repository.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#binary-artifacts"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#security-policy"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#fuzzing"}},{"name":"License","score":10,"reason":"license file 
detected","details":["Info: project has a license file: LICENSE:0","Info: FSF or OSI recognized license: Creative Commons Zero v1.0 Universal: LICENSE:0"],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#license"}},{"name":"Pinned-Dependencies","score":0,"reason":"dependency not pinned by hash detected -- score normalized to 0","details":["Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/go.yml:12: update your workflow using https://app.stepsecurity.io/secureworkflow/itzmeanjan/ette/go.yml/main?enable=pin","Warn: GitHub-owned GitHubAction not pinned by hash: .github/workflows/go.yml:15: update your workflow using https://app.stepsecurity.io/secureworkflow/itzmeanjan/ette/go.yml/main?enable=pin","Info:   0 out of   2 GitHub-owned GitHubAction dependencies pinned"],"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#pinned-dependencies"}},{"name":"Signed-Releases","score":-1,"reason":"no releases found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":0,"reason":"branch protection not enabled on development/release branches","details":["Warn: branch protection not enabled for branch 'main'"],"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#branch-protection"}},{"name":"Vulnerabilities","score":0,"reason":"16 existing vulnerabilities detected","details":["Warn: 
Project is vulnerable to: GHSA-4gmj-3p3h-gm8h","Warn: Project is vulnerable to: GO-2022-1098 / GHSA-2chg-86hq-7w38","Warn: Project is vulnerable to: GO-2024-2818 / GHSA-3jgf-r68h-xfqm","Warn: Project is vulnerable to: GO-2024-3189 / GHSA-27vh-h6mc-q6g8","Warn: Project is vulnerable to: GO-2023-2046 / GHSA-ppjg-v974-84cm","Warn: Project is vulnerable to: GO-2024-2819 / GHSA-4xc9-8hmq-j652","Warn: Project is vulnerable to: GHSA-rqmg-hrg4-fm69","Warn: Project is vulnerable to: GHSA-v9jh-j8px-98vq","Warn: Project is vulnerable to: GO-2024-2955 / GHSA-869c-j7wc-8jqv","Warn: Project is vulnerable to: GO-2021-0052 / GHSA-h395-qcrw-5vmq","Warn: Project is vulnerable to: GHSA-3vp4-m3rf-835h","Warn: Project is vulnerable to: GO-2023-1737 / GHSA-2c4m-59x9-fr2g","Warn: Project is vulnerable to: GO-2024-2606 / GHSA-7jwh-3vrq-q3m8","Warn: Project is vulnerable to: GO-2024-2920 / GHSA-2hmf-46v7-v6fx","Warn: Project is vulnerable to: GO-2024-2611 / GHSA-8r3f-844c-mc37","Warn: Project is vulnerable to: GO-2022-0603 / GHSA-hp87-p4gw-j4gq"],"documentation":{"short":"Determines if the project has open, known unfixed vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#vulnerabilities"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 29 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code 
analysis.","url":"https://github.com/ossf/scorecard/blob/f6ed084d17c9236477efd66e5b258b9d4cc7b389/docs/checks.md#sast"}}]},"last_synced_at":"2025-08-19T20:56:17.936Z","repository_id":37882114,"created_at":"2025-08-19T20:56:17.936Z","updated_at":"2025-08-19T20:56:17.936Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":277974403,"owners_count":25908396,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-02T02:00:08.890Z","response_time":67,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["blockchain","blockchain-data","blockchain-events","blockchain-explorer","ethereum","ethereum-blockchain-analyser","evm","graphql-api","notification","realtime-notification","realtime-tracking","websocket"],"created_at":"2024-09-27T21:23:43.455Z","updated_at":"2025-10-02T07:30:52.943Z","avatar_url":"https://github.com/itzmeanjan.png","language":"Go","readme":"\u003e **Warning** **I've stopped maintaining `ette`.**\n\n# ette\n\nEVM-based Blockchain Indexer, with historical data query \u0026 real-time notification support 😎\n\n**Deploy your `ette` instance today**\n\n![banner](sc/banner.gif)\n\n## Table of Contents\n\n- [Why did you build `ette` ?](#inspiration-)\n- [What do I need to have to use it ?](#prerequisite-)\n- [How to install it ?](#installation-)\n- [What are possible use cases of `ette` ?](#use-cases-)\n- [How do I generate `APIKey`(s) ?](#management-using-webui-)\n- 
[How to use it ?](#usage-)\n    - Historical Data\n        - Custom REST\n            - [Query historical block data](#historical-block-data--rest-api--)\n            - [Query historical transaction data](#historical-transaction-data--rest-api--)\n            - [Query historical event data](#historical-event-data--rest-api--)\n        - GraphQL ( **Recommended** )\n            - [Query historical block data](#historical-block-data--graphql-api--)\n            - [Query historical transaction data](#historical-transaction-data--graphql-api--)\n            - [Query historical event data](#historical-event-data--graphql-api--)\n    - Real-time Data\n        - [Real-time block mining notification](#real-time-notification-for-mined-blocks-)\n        - [Real-time transaction notification ( 🤩 Filters Added ) ](#real-time-notification-for-transactions-%EF%B8%8F)\n        - [Real-time log event notification ( 🤩 Filters Added ) ](#real-time-notification-for-events-)\n    - Snapshotting\n        - [Take snapshot](#take-snapshot-of-existing-data-store-%EF%B8%8F)\n        - [Restore from snapshot](#restore-data-from-snapshot-%EF%B8%8F)\n\n## Inspiration 🤔\n\nI was looking for one tool which will be able to keep itself in sync with latest happenings on EVM based blockchain i.e. index blockchain data, while exposing REST \u0026 GraphQL API for querying blockchain data with various filters. That tool will also expose real time notification functionalities over websocket, when subscribed to topics.\n\nIt's not that I was unable find any solution, but wasn't fully satisfied with those, so I decided to write `ette`, which will do following\n\n- Sync upto latest state of blockchain\n- Listen for all happenings on EVM based blockchain\n- Persist all happenings in local database\n- Expose REST \u0026 GraphQL API for querying 👇, while also setting block range/ time range for filtering results. 
Allow querying latest **X** entries for events emitted by contracts.\n    - Block data\n    - Transaction data\n    - Event data\n\n- Expose websocket based real time notification mechanism for \n    - Blocks being mined\n    - Transactions being sent from address and/ or transactions being received at address\n    - Events being emitted by contract, with indexed fields i.e. topics\n\n- All historical data query requests must carry authentication header as `APIKey`, which can be generated by users, using webUI, packed with `ette`.\n- All real-time event subscription \u0026 unsubscription requests must carry `apiKey` in their payload.\n- It has very minimalistic webUI for creating \u0026 managing `APIKey`(s).\n- It has capability to process blocks in delayed fashion, if asked to do so. **To address chain reorganization issue, this is very effective**. All you need to do, specify how many block confirmations you require before considering that block to be finalized in `.env` file. Now `ette` will do everything with block _( if real-time subscription mode is enabled, it'll publish data to clients who're interested i.e. subscribed )_ expect putting it in persistent data store. Rather block identifier to be put in waiting queue, from where it'll be eventually picked up by workers to finally persist it in DB. Only downside of using this feature is you might not get data back in response of query for certain block number, which just got mined but not finalized as per your set up i.e. `BlockConfirmations` environment variable's value. You can always skip it, default value will be **0**.\n\n- `ette` can help you in taking snapshot of whole database, it's relying on, into a single binary file, where block data is serialized into Protocol Buffer format, efficient for deserialization also i.e. 
while restoring back from snapshot.\n    - `EtteMode` = 4, attempts to take a snapshot of whole database.\n\n- Restoring from snapshoted data file, can be attempted by `ette` when `EtteMode` = 5. Make sure you've cleaned backing data store before so \u0026 recreated database. [ **Table migration to be automatically taken care of** ]\n\n- For snapshotting purposes, you can always set sink/ source data file in `SnapshotFile` in `.env`.\n\n- 👆 snapshotting feature is helpful, if you're willing migrate `ette` to different machine or setting up new instance of `ette`. If you want to avoid a lengthy whole chain data syncing, you must take snapshot from existing instance of `ette` \u0026 attempt to restore from binary snapshot file in new `ette` instance.\n\nAnd that's `ette`\n\n## Prerequisite 👍\n\n![running_ette](./sc/running-ette.png)\n\n- Make sure you've Go _( \u003e= 1.15 )_ installed\n- You need to also install \u0026 set up PostgreSQL. I found [this](https://www.digitalocean.com/community/tutorials/how-to-install-and-use-postgresql-on-ubuntu-20-04) guide helpful.\n\n\u003e Make sure you've `pgcrypto` extension enabled on PostgreSQL Database.\n\n\u003e Check existing extensions using : `\\dx`\n\n\u003e Create extension using : `create extension pgcrypto;`\n\n- Redis needs to be installed too. 
Consider following [this](https://www.digitalocean.com/community/tutorials/how-to-install-and-secure-redis-on-ubuntu-20-04) guide.\n\n\u003e Note : Redis **v6.0.6** is required\n\n\u003e Note : Setting password in Redis instance has been made optional from now on, though it's recommended.\n\n- Blockchain Node's both **HTTP \u0026 Websocket** connection URL required, because we'll be querying block, transaction, event log related data using HTTP interface \u0026 listening for block mining events in real time over Websocket.\n\n## Installation 🛠\n\n- First fork this repository \u0026 clone it, some where out side of **GOPATH**.\n\n```bash\ngit clone git@github.com:username/ette.git\n```\n\n- Now get inside `ette`\n\n```bash\ncd ette\n```\n\n- Create a `.env` file in this directory. \n\n    - Make sure PostgreSQL has md5 authentication mechanism enabled.\n    - Please enable password based authentication in Redis Server\n    - Skipping `RedisPassword` is absolutely fine, if you don't want to use any password in Redis instance. [ **Not recommended** ]\n    - Replace `Domain` with your domain name i.e. 
`ette.company.com`\n    - Set `Production` to `yes` before running it in production; otherwise you can simply skip it\n    - `ette` can be run in any of 👇 5 possible modes, which can be set by `EtteMode`\n\n    ---\n\n    EtteMode | Interpretation\n    --- | ---\n    1 | Only Historical Data Query Allowed\n    2 | Only Real-time Subscription Allowed\n    3 | Both Historical Data Query \u0026 Real-time Subscription Allowed\n    4 | Attempt to take snapshot from data in backing DB\n    5 | Attempt to restore data from snapshot file\n\n    ---\n\n    - For testing historical data query using browser based GraphQL Playground in `ette`, you can set `EtteGraphQLPlayGround` to `yes` in config file\n    - For processing block(s)/ tx(s) concurrently, it'll create `ConcurrencyFactor * #-of CPUs on machine` workers, who will pick up jobs submitted to them.\n    - If nothing is specified, it defaults to 1 \u0026 assuming you're running `ette` on machine with 4 CPUs, it'll spawn worker pool of size 4. But more number of jobs can be submitted, only 4 can be running at max.\n    - 👆 being done for controlling concurrency level, by putting more control on user's hand.\n    - If you want to persist blocks in delayed fashion, you might consider setting `BlockConfirmations` to some _number \u003e 0_.\n    - That will make `ette` think you're asking it 80 is latest block, which can be persisted in final data store, when latest mined block number is 100 \u0026 `BlockConfirmations` is set to 20.\n    - This option is **recommended** to be used, at least in production, to address _chain reorganization issue_.\n    - For range based queries `BlockRange` can be set to limit how many blocks can be queried by client in a single go. Default value 100.\n    - For time span based queries `TimeRange` can be set to put limit on max time span _( in terms of second )_, can be used by clients. Default value 3600 i.e. 
1 hour.\n    - If you're attempting to take snapshot/ restore from binary snapshot file, you can set `SnapshotFile` in `.env` file, to set sink/ source file name, respectively. Default file name `echo $(echo $(pwd)/snapshot.bin)` in i.e. from where `ette` gets invoked. Consider setting `EtteMode` correctly, depending upon what you want to attain.\n\n```\nRPCUrl=https://\u003cdomain-name\u003e\nWebsocketUrl=wss://\u003cdomain-name\u003e\nPORT=7000\nDB_USER=user\nDB_PASSWORD=password\nDB_HOST=x.x.x.x\nDB_PORT=5432\nDB_NAME=ette\nRedisConnection=tcp\nRedisAddress=x.x.x.x:6379\nRedisPassword=password\nDomain=localhost\nProduction=yes\nEtteMode=3\nEtteGraphQLPlayGround=yes\nConcurrencyFactor=5\nBlockConfirmations=200\nBlockRange=1000\nTimeRange=21600\nSnapshotFile=snapshot.bin\n```\n\n- Create another file in same directory, named `.plans.json`, whose content will look like 👇.\n\n    - This file holds subscription plans for clients, allowed by this `ette` instance.\n    - Each plan is denoted by one unique `name` \u0026 `deliveryCount`, where _`deliveryCount` denotes number of times data to be delivered to client application in 24 hours of time span._\n    - Because each request must be accompanied with `APIKey`, `ette` knows which user is requesting for resources \u0026 how many were delivered successfully in last 24 hours of time span.\n    - If one user crosses allowed request limit in 24 hours, no new request will be taken under consideration \u0026 any existing connection will stop delivering data to client.\n\n\u003e **Quick Tip :** Setting `deliveryCount` is fully upto you. 
Please consider VM specifications before doing so.\n\n```json\n{\n    \"plans\": [\n        {\n            \"name\": \"TIER 1\",\n            \"deliveryCount\": 50000\n        },\n        {\n            \"name\": \"TIER 2\",\n            \"deliveryCount\": 100000\n        },\n        {\n            \"name\": \"TIER 3\",\n            \"deliveryCount\": 500000\n        },\n        {\n            \"name\": \"TIER 4\",\n            \"deliveryCount\": 750000\n        },\n        {\n            \"name\": \"TIER 5\",\n            \"deliveryCount\": 1000000\n        }\n    ]\n}\n```\n\n- Now build `ette`\n\n```bash\nmake build\n```\n\n- If everything goes as expected, you'll find one binary named, **ette** in this directory. Run it. \n\n```bash\n./ette\n\n# or directly run `ette` using 👇, which will first build, then run\nmake run\n```\n\n- Database migration to be taken care of during application start up.\n- Syncing `ette` with latest state of blockchain takes time. Current sync state can be queried\n\n```bash\ncurl -s localhost:7000/v1/synced | jq\n```\n\n- You'll receive response of form 👇\n\n```json\n{\n  \"elapsed\": \"3m2.487237s\",\n  \"eta\": \"87h51m38s\",\n  \"processed\": 4242,\n  \"synced\": \"0.35 %\"\n}\n```\n\n- You can check how many active websocket sessions being managed by your `ette` deployment by\n\n\n```bash\ncurl -s localhost:7000/v1/stat | jq\n```\n\n---\n\n### Production deployment of `ette` using **systemd**\n\nHere's a systemd unit file which you can create in `/etc/systemd/system`.\n\n```bash\nsudo touch /etc/systemd/system/ette.service # first do it\n```\n\nNow you can paste 👇 content in unit file, given that you've cloned `ette` in **$HOME**.\n\n```bash\n[Unit]\nDescription=ette - EVM Blockchain Indexer\n\n[Service]\nUser=ubuntu\nWorkingDirectory=/home/ubuntu/ette\nExecStart=/home/ubuntu/ette/ette\nRestart=on-failure\nRestartSec=10s\n\n[Install]\nWantedBy=multi-user.target\n```\n\nTime to load systemd.\n\n```bash\nsudo systemctl 
daemon-reload\n```\n\nNow you can enable `ette`, so that it can be automatically started after system boot up.\n\n```bash\nsudo systemctl enable ette.service\n```\n\nFinally you can start `ette`.\n\n```bash\nsudo systemctl start ette.service\n```\n\nYou can also stop, running `ette` instance.\n\n```bash\nsudo systemctl stop ette.service\n```\n\nRestart an instance.\n\n```bash\nsudo systemctl restart ette.service\n```\n\nAll logs `ette` produces can be inspected using 👇\n\n```bash\nsudo journalctl -u ette.service # oldest to newest\nsudo journalctl -u ette.service --reverse # opposite of 👆\n```\n\nLatest log can be followed\n\n```bash\nsudo journalctl -u ette.service -f\n```\n\n---\n\n## Use Cases 🤯\n\n`ette` is supposed to be deployed by anyone, interested in running a historical data query \u0026 real-time notification service for EVM-based blockchain(s).\n\nAll client requests are by default rate limited _( 50k requests/ day )_. This rate limit is enforced on all `APIKey`(s) created by any single Ethereum Address. You can create multiple `APIKey`(s) from your account \u0026 accumulated requests made from those keys to be considered before dropping your requests.\n\nIf you need more requests per day, you can always asked your `ette` administrator to manually increase that from database table. _[ **Risky operation, needs to be done carefully. This is not recommended.** ]_\n\n**More features coming here, soon**\n\n## Management using webUI 🖥\n\n`ette` has one minimalistic webUI for generating \u0026 managing `APIKey`(s). It doesn't have any password based login mechanism. 
You need to have [Metamask](https://metamask.io/download.html) browser plugin installed for logging into `ette` management webUI.\n\nOnce you've started `ette` on your machine, open browser \u0026 head to [http://localhost:7000/v1/login](http://localhost:7000/v1/login).\n\nYou'll be greeted with 👇\n\n![ui](./sc/login_1.png)\n\nAssuming you've Metamask browser plugin installed, you can click `Login` \u0026 you'll be asked to sign a message of specific format, which will be validated by `ette`.\n\n![ui](./sc/login_2.png)\n\nOnce logged in, you can find out all `APIKey`(s) created by you.\n\n![ui](./sc/webUI_1.png)\n\nIf you've not any `APIKey`(s) created yet, go ahead \u0026 click `Create new app`. Again you'll be asked to sign a message of specific format.\n\n![ui](./sc/webUI_2.png)\n\nAnd you'll see new `APIKey` on your screen.\n\n![ui](./sc/webUI_3.png)\n\nYou can create any number of `APIKey`(s), but rate limiting to be applied on aggregated requests from all those `APIKey`(s).\n\n\u003e Now go ahead \u0026 use `APIKey` in header of historical data query requests/ payload of real-time notification subscription/ unsubscription request.\n\n**Double clicking on created `APIKey` toggles its enabled state, which is represented visually.** 👇\n\n![ui](./sc/webUI.gif)\n\nEnabled | Text Color\n--- | ---\nYes | Green\nNo | Red\n\n\u003e **Quick Tip:** As you can create any number of `APIKey`(s) from one Ethereum address, if you feel any of those has been exposed, disabling those ensures all requests accompanied with those `APIKey`(s) to be dropped, by `ette`\n\nRead further for usage examples.\n\n## Usage 🦾\n\n`ette` exposes REST \u0026 GraphQL API for querying historical block, transaction \u0026 event related data. 
It can also play role of real time notification engine, when subscribed to supported topics.\n\n\u003e **_All historical data query requests need to be strictly accompanied with valid `APIKey` as request header param_** 🤖\n\n### Historical Block Data ( REST API ) 🤩\n\nYou can query historical block data with various combination of query string params. 👇 is a comprehensive guide for consuming block data.\n\n**Path : `/v1/block`**\n\n**Example code snippet can be found [here](example/block.sh)**\n\nQuery Params | Method | Description\n--- | --- | ---\n`hash=0x...\u0026tx=yes` | GET | Fetch all transactions present in a block, when block hash is known\n`number=1\u0026tx=yes` | GET | Fetch all transactions present in a block, when block number is known\n`hash=0x...` | GET | Fetch block by hash\n`number=1` | GET | Fetch block by number\n`fromBlock=1\u0026toBlock=10` | GET | Fetch blocks by block number range _( max 10 at a time )_\n`fromTime=1604975929\u0026toTime=1604975988` | GET | Fetch blocks by unix timestamp range _( max 60 seconds timespan )_\n\n### Historical Transaction Data ( REST API ) 😎\n\nIt's possible to query historical transactions data with various combination of query string params, where URL path is 👇\n\n**Path : `/v1/transaction`**\n\n**Example code snippet can be found [here](example/transaction.sh)**\n\nQuery Params | Method | Description\n--- | --- | ---\n`hash=0x...` | GET | Fetch transaction by txHash\n`nonce=1\u0026fromAccount=0x...` | GET | Fetch transaction, when tx sender's address \u0026 account nonce are known\n`fromBlock=1\u0026toBlock=10\u0026deployer=0x...` | GET | Find out what contracts are created by certain account within given block number range _( max 100 blocks )_\n`fromTime=1604975929\u0026toTime=1604975988\u0026deployer=0x...` | GET | Find out what contracts are created by certain account within given timestamp range _( max 600 seconds of timespan )_\n`fromBlock=1\u0026toBlock=100\u0026fromAccount=0x...\u0026toAccount=0x...` | 
GET | Given block number range _( max 100 at a time )_ \u0026 a pair of accounts, can find out all tx performed between that pair, where `from` \u0026 `to` fields are fixed\n`fromTime=1604975929\u0026toTime=1604975988\u0026fromAccount=0x...\u0026toAccount=0x...` | GET | Given time stamp range _( max 600 seconds of timespan )_ \u0026 a pair of accounts, can find out all tx performed between that pair, where `from` \u0026 `to` fields are fixed\n`fromBlock=1\u0026toBlock=100\u0026fromAccount=0x...` | GET | Given block number range _( max 100 at a time )_ \u0026 an account, can find out all tx performed from that account\n`fromTime=1604975929\u0026toTime=1604975988\u0026fromAccount=0x...` | GET | Given time stamp range _( max 600 seconds of span )_ \u0026 an account, can find out all tx performed from that account\n`fromBlock=1\u0026toBlock=100\u0026toAccount=0x...` | GET | Given block number range _( max 100 at a time )_ \u0026 an account, can find out all tx where target was this address\n`fromTime=1604975929\u0026toTime=1604975988\u0026toAccount=0x...` | GET | Given time stamp range _( max 600 seconds of span )_ \u0026 an account, can find out all tx where target was this address\n\n### Historical Event Data ( REST API ) 🧐\n\n`ette` lets you query historical event data, emitted by smart contracts, by combination of query string params.\n\n**Path : `/v1/event`**\n\nQuery Params | Method | Description\n--- | --- | ---\n`blockHash=0x...` | GET | Given blockhash, retrieves all events emitted by tx(s) present in block\n`blockHash=0x...\u0026logIndex=1` | GET | Given blockhash and log index in block, attempts to retrieve associated event\n`blockNumber=123456\u0026logIndex=2` | GET | Given block number and log index in block, attempts to retrieve associated event\n`txHash=0x...` | GET | Given txhash, retrieves all events emitted during execution of this transaction\n`count=50\u0026contract=0x...` | GET | Returns last **x** _( \u003c=50 )_ events emitted by this 
contract\n`fromBlock=1\u0026toBlock=10\u0026contract=0x...\u0026topic0=0x...\u0026topic1=0x...\u0026topic2=0x...\u0026topic3=0x...` | GET | Finding event(s) emitted from contract within given block range \u0026 also matching topic signatures _{0, 1, 2, 3}_\n`fromBlock=1\u0026toBlock=10\u0026contract=0x...\u0026topic0=0x...\u0026topic1=0x...\u0026topic2=0x...` | GET | Finding event(s) emitted from contract within given block range \u0026 also matching topic signatures _{0, 1, 2}_\n`fromBlock=1\u0026toBlock=10\u0026contract=0x...\u0026topic0=0x...\u0026topic1=0x...` | GET | Finding event(s) emitted from contract within given block range \u0026 also matching topic signatures _{0, 1}_\n`fromBlock=1\u0026toBlock=10\u0026contract=0x...\u0026topic0=0x...` | GET | Finding event(s) emitted from contract within given block range \u0026 also matching topic signatures _{0}_\n`fromBlock=1\u0026toBlock=10\u0026contract=0x...` | GET | Finding event(s) emitted from contract within given block range\n`fromTime=1604975929\u0026toTime=1604975988\u0026contract=0x...\u0026topic0=0x...\u0026topic1=0x...\u0026topic2=0x...\u0026topic3=0x...` | GET | Finding event(s) emitted from contract within given time stamp range \u0026 also matching topic signatures _{0, 1, 2, 3}_\n`fromTime=1604975929\u0026toTime=1604975988\u0026contract=0x...\u0026topic0=0x...\u0026topic1=0x...\u0026topic2=0x...` | GET | Finding event(s) emitted from contract within given time stamp range \u0026 also matching topic signatures _{0, 1, 2}_\n`fromTime=1604975929\u0026toTime=1604975988\u0026contract=0x...\u0026topic0=0x...\u0026topic1=0x...` | GET | Finding event(s) emitted from contract within given time stamp range \u0026 also matching topic signatures _{0, 1}_\n`fromTime=1604975929\u0026toTime=1604975988\u0026contract=0x...\u0026topic0=0x...` | GET | Finding event(s) emitted from contract within given time stamp range \u0026 also matching topic signatures 
_{0}_\n`fromTime=1604975929\u0026toTime=1604975988\u0026contract=0x...` | GET | Finding event(s) emitted from contract within given time stamp range\n\n### Historical Block Data ( GraphQL API ) 🤩\n\nYou can query block data using GraphQL API.\n\n**Path: `/v1/graphql`**\n\n**Method: `POST`**\n\n```graphql\ntype Query {\n    blockByHash(hash: String!): Block!\n    blockByNumber(number: String!): Block!\n    blocksByNumberRange(from: String!, to: String!): [Block!]!\n    blocksByTimeRange(from: String!, to: String!): [Block!]!\n}\n```\n\nResponse will be of type 👇\n\n```graphql\ntype Block {\n  hash: String!\n  number: String!\n  time: String!\n  parentHash: String!\n  difficulty: String!\n  gasUsed: String!\n  gasLimit: String!\n  nonce: String!\n  miner: String!\n  size: Float!\n  txRootHash: String!\n  receiptRootHash: String!\n}\n```\n\nMethod | Parameters | Possible use case\n--- | --- | ---\n`blockByHash` | hash: String! | When you know block hash \u0026 want to get whole block data back\n`blockByNumber` | number: String! | When you know block number \u0026 want to get whole block data back\n`blocksByNumberRange` | from: String!, to: String! | When you've a block number range \u0026 want to get all blocks in that range, in a single call\n`blocksByTimeRange` | from: String!, to: String! 
| When you've unix timestamp range \u0026 want to get all blocks in that range, in a single call\n\n---\n\n### Historical Transaction Data ( GraphQL API ) 🤩\n\nYou can query transaction data from `ette`, using following GraphQL methods.\n\n**Path: `/v1/graphql`**\n\n**Method: `POST`**\n\n```graphql\ntype Query {\n    transaction(hash: String!): Transaction!\n  \n    transactionCountByBlockHash(hash: String!): Int!\n    transactionsByBlockHash(hash: String!): [Transaction!]!\n  \n    transactionCountByBlockNumber(number: String!): Int!\n    transactionsByBlockNumber(number: String!): [Transaction!]!\n  \n    transactionCountFromAccountByNumberRange(account: String!, from: String!, to: String!): Int!\n    transactionsFromAccountByNumberRange(account: String!, from: String!, to: String!): [Transaction!]!\n  \n    transactionCountFromAccountByTimeRange(account: String!, from: String!, to: String!): Int!\n    transactionsFromAccountByTimeRange(account: String!, from: String!, to: String!): [Transaction!]!\n  \n    transactionCountToAccountByNumberRange(account: String!, from: String!, to: String!): Int!\n    transactionsToAccountByNumberRange(account: String!, from: String!, to: String!): [Transaction!]!\n\n    transactionCountToAccountByTimeRange(account: String!, from: String!, to: String!): Int!\n    transactionsToAccountByTimeRange(account: String!, from: String!, to: String!): [Transaction!]!\n\n    transactionCountBetweenAccountsByNumberRange(fromAccount: String!, toAccount: String!, from: String!, to: String!): Int!\n    transactionsBetweenAccountsByNumberRange(fromAccount: String!, toAccount: String!, from: String!, to: String!): [Transaction!]!\n\n    transactionCountBetweenAccountsByTimeRange(fromAccount: String!, toAccount: String!, from: String!, to: String!): Int!\n    transactionsBetweenAccountsByTimeRange(fromAccount: String!, toAccount: String!, from: String!, to: String!): [Transaction!]!\n\n    contractsCreatedFromAccountByNumberRange(account: String!, 
from: String!, to: String!): [Transaction!]!\n    contractsCreatedFromAccountByTimeRange(account: String!, from: String!, to: String!): [Transaction!]!\n    transactionFromAccountWithNonce(account: String!, nonce: String!): Transaction!\n}\n```\n\nResponse will be of type 👇\n\n```graphql\ntype Transaction {\n  hash: String!\n  from: String!\n  to: String!\n  contract: String!\n  value: String!\n  data: String!\n  gas: String!\n  gasPrice: String!\n  cost: String!\n  nonce: String!\n  state: String!\n  blockHash: String!\n}\n```\n\nMethod | Parameters | Possible use case\n--- | --- | ---\n`transaction` | hash: String! | When you know txHash \u0026 want to get that tx data\n`transactionCountByBlockHash` | hash: String! | When you know block hash \u0026 want to get count of tx(s) packed in that block\n`transactionsByBlockHash` | hash: String! | When you know block hash \u0026 want to get all tx(s) packed in that block\n`transactionCountByBlockNumber` | number: String! | When you know block number \u0026 want to get count of tx(s) packed in that block\n`transactionsByBlockNumber` | number: String! | When you know block number \u0026 want to get all tx(s) packed in that block\n`transactionCountFromAccountByNumberRange` | account: String!, from: String!, to: String! | When you know tx sender address, block number range \u0026 want to find out how many tx(s) were sent by this address in that certain block number range\n`transactionsFromAccountByNumberRange` | account: String!, from: String!, to: String! | When you know tx sender address, block number range \u0026 want to find out all tx(s) that were sent by this address in that certain block number range\n`transactionCountFromAccountByTimeRange` | account: String!, from: String!, to: String! | When you know tx sender address, unix time stamp range \u0026 want to find out how many tx(s) were sent by this address in that certain timespan\n`transactionsFromAccountByTimeRange` | account: String!, from: String!, to: String! 
| When you know tx sender address, unix time stamp range \u0026 want to find out all tx(s) that were sent by this address in that certain timespan\n`transactionCountToAccountByNumberRange` | account: String!, from: String!, to: String! | When you know tx receiver address, block number range \u0026 want to find out how many tx(s) were sent to this address in that certain block number range\n`transactionsToAccountByNumberRange` | account: String!, from: String!, to: String! | When you know tx receiver address, block number range \u0026 want to find out all tx(s) that were sent to this address in that certain block number range\n`transactionCountToAccountByTimeRange` | account: String!, from: String!, to: String! | When you know tx receiver address, unix time stamp range \u0026 want to find out how many tx(s) were sent to this address in that certain timespan\n`transactionsToAccountByTimeRange` | account: String!, from: String!, to: String! | When you know tx receiver address, unix time stamp range \u0026 want to find out all tx(s) that were sent to this address in that certain timespan\n`transactionCountBetweenAccountsByNumberRange` | fromAccount: String!, toAccount: String!, from: String!, to: String! | When you know tx sender \u0026 receiver addresses, block number range \u0026 want to find out how many tx(s) were sent from sender to receiver in that certain block number range\n`transactionsBetweenAccountsByNumberRange` | fromAccount: String!, toAccount: String!, from: String!, to: String! | When you know tx sender \u0026 receiver addresses, block number range \u0026 want to find out all tx(s) that were sent from sender to receiver in that certain block number range\n`transactionCountBetweenAccountsByTimeRange` | fromAccount: String!, toAccount: String!, from: String!, to: String! 
| When you know tx sender \u0026 receiver addresses, unix timestamp range \u0026 want to find out how many tx(s) were sent from sender to receiver in that certain timespan\n`transactionsBetweenAccountsByTimeRange` | fromAccount: String!, toAccount: String!, from: String!, to: String! | When you know tx sender \u0026 receiver addresses, unix timestamp range \u0026 want to find out all tx(s) that were sent from sender to receiver in that certain timespan\n`contractsCreatedFromAccountByNumberRange` | account: String!, from: String!, to: String! | When you know EOA's _( externally owned account )_ address \u0026 want to find out all contracts created by that account in block number range\n`contractsCreatedFromAccountByTimeRange` | account: String!, from: String!, to: String! | When you know EOA's _( externally owned account )_ address \u0026 want to find out all contracts created by that account in certain time span\n`transactionFromAccountWithNonce` | account: String!, nonce: String! | When you have EOA's address \u0026 its nonce value, you can pinpoint that tx. 
This can be used to iterate through all tx(s) from this account, by updating nonce.\n\n---\n\n### Historical Event Data ( GraphQL API ) 🤩\n\nYou can ask `ette` for event data using GraphQL API.\n\n**Path: `/v1/graphql`**\n\n**Method: `POST`**\n\n```graphql\ntype Query {\n    eventsFromContractByNumberRange(contract: String!, from: String!, to: String!): [Event!]!\n    eventsFromContractByTimeRange(contract: String!, from: String!, to: String!): [Event!]!\n    eventsByBlockHash(hash: String!): [Event!]!\n    eventsByTxHash(hash: String!): [Event!]!\n    eventsFromContractWithTopicsByNumberRange(contract: String!, from: String!, to: String!, topics: [String!]!): [Event!]!\n    eventsFromContractWithTopicsByTimeRange(contract: String!, from: String!, to: String!, topics: [String!]!): [Event!]!\n    lastXEventsFromContract(contract: String!, x: Int!): [Event!]!\n    eventByBlockHashAndLogIndex(hash: String!, index: String!): Event!\n    eventByBlockNumberAndLogIndex(number: String!, index: String!): Event!\n}\n```\n\nResponse will be of type 👇\n\n```graphql\ntype Event {\n  origin: String!\n  index: String!\n  topics: [String!]!\n  data: String!\n  txHash: String!\n  blockHash: String!\n}\n```\n\nMethod | Parameters | Possible use case\n--- | --- | ---\n`eventsFromContractByNumberRange` | contract: String!, from: String!, to: String! | When you've one contract address, block number range \u0026 you want to find out all events emitted by that contract in given block range\n`eventsFromContractByTimeRange` | contract: String!, from: String!, to: String! | When you know contract address, unix time stamp range \u0026 you want to find out all events emitted by that contract in given timespan\n`eventsByBlockHash` | hash: String! | When you've block hash \u0026 want to find out all events emitted in tx(s) packed in that block\n`eventsByTxHash` | hash: String! 
| When you've txHash \u0026 want to find out all events emitted during execution of that tx\n`eventsFromContractWithTopicsByNumberRange` | contract: String!, from: String!, to: String!, topics: [String!]! | When you've smart contract address, block number range \u0026 an ordered list of event log's topic signature(s), you can find out all events emitted by that contract with specific signature(s) in block range\n`eventsFromContractWithTopicsByTimeRange` | contract: String!, from: String!, to: String!, topics: [String!]! | When you've smart contract address, unix time stamp range \u0026 an ordered list of event log's topic signature(s), you can find out all events emitted by that contract with specific signature(s) in given timespan\n`lastXEventsFromContract` | contract: String!, x: Int! | When you know just contract address \u0026 want to find out last **X** events emitted by that contract **[ Very useful sometimes 😅 ]**\n`eventByBlockHashAndLogIndex` | hash: String!, index: String! | When you know block hash, index of event log in block \u0026 want to get back specific event in that position\n`eventByBlockNumberAndLogIndex` | number: String!, index: String! 
| When you know block number, index of event log in block \u0026 want to get back specific event in that position\n\n---\n\n\u003e Browser-based GraphQL Playground: **/v1/graphql-playground** 👇🤩\n\n![graphql_playground](./sc/graphQL_playground.png)\n\n---\n\n### Real time notification for mined blocks ⛏\n\n![pubsub-ette](./sc/pubsub-ette.png)\n\nTo listen for blocks getting mined, connect to the `/v1/ws` endpoint using a websocket client library \u0026, once connected, send a **subscription** request with 👇 payload _( JSON encoded )_\n\n```json\n{\n    \"name\": \"block\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\nIf everything goes fine, your subscription will be confirmed with 👇 response _( JSON encoded )_\n\n```json\n{\n    \"code\": 1,\n    \"message\": \"Subscribed to `block`\"\n}\n```\n\nAfter that, as long as your machine is reachable, `ette` will keep notifying you about new blocks getting mined in 👇 form\n\n```json\n{\n  \"hash\": \"0x08f50b4795667528f6c0fdda31a0d270aae60dbe7bc4ea950ae1f71aaa01eabc\",\n  \"number\": 7015086,\n  \"time\": 1605328635,\n  \"parentHash\": \"0x5ec0faff8b48e201e366a3f6c505eb274904e034c1565da2241f1327e9bad459\",\n  \"difficulty\": \"6\",\n  \"gasUsed\": 78746,\n  \"gasLimit\": 20000000,\n  \"nonce\": 0,\n  \"miner\": \"0x0000000000000000000000000000000000000000\",\n  \"size\": 1044,\n  \"txRootHash\": \"0x088d6142b1d79803c851b1d839888b1e9f26c31e1266b4e221121f2cd8e85f86\",\n  \"receiptRootHash\": \"0xca3949d52f113935ac08bae15e0816cd0472f01590f0fe0b65584bfb3aa324a6\"\n}\n```\n\nIf you want to cancel the subscription, consider sending 👇\n\n```json\n{\n    \"name\": \"block\",\n    \"type\": \"unsubscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\nYou'll receive the 👇 response, confirming unsubscription\n\n```json\n{\n    \"code\": 1,\n    \"message\": \"Unsubscribed from `block`\"\n}\n```\n\n\u003e Sample code can be found [here](example/block.js)\n\n### Real time notification for transactions ⚡️\n\nFor 
listening to any transaction happening in the network in real time, send 👇 JSON encoded payload to `/v1/ws`\n\n```json\n{\n    \"name\": \"transaction/\u003cfrom-address\u003e/\u003cto-address\u003e\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n**Here are some examples:**\n\n- Any transaction\n\n```json\n{\n    \"name\": \"transaction/*/*\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n\u003e Sample code can be found [here](example/transaction_1.js)\n\n- Fixed `from` field **[ tx originated `from` account ]**\n\n```json\n{\n    \"name\": \"transaction/0x4774fEd3f2838f504006BE53155cA9cbDDEe9f0c/*\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n\u003e Sample code can be found [here](example/transaction_2.js)\n\n- Fixed `to` field **[ tx targeted `to` account ]**\n\n```json\n{\n    \"name\": \"transaction/*/0x4774fEd3f2838f504006BE53155cA9cbDDEe9f0c\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n\u003e Sample code can be found [here](example/transaction_3.js)\n\n- Fixed `from` \u0026 `to` fields **[ tx `from` -\u003e `to` account ]**\n\n```json\n{\n    \"name\": \"transaction/0xc9D50e0a571aDd06C7D5f1452DcE2F523FB711a1/0x4774fEd3f2838f504006BE53155cA9cbDDEe9f0c\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n\u003e Sample code can be found [here](example/transaction_4.js)\n\nIf everything goes fine, your subscription will be confirmed with 👇 response _( JSON encoded )_\n\n```json\n{\n    \"code\": 1,\n    \"message\": \"Subscribed to `transaction`\",\n    \"apiKey\": \"0x...\"\n}\n```\n\nAfter that, as long as your machine is reachable, `ette` will keep notifying you about every transaction matching your criteria, in 👇 form\n\n```json\n{\n  \"hash\": \"0x08cfda79bd68ad280c7786e5dd349ab81981c52ea5cdd8e31be0a4b54b976555\",\n  \"from\": \"0xc9D50e0a571aDd06C7D5f1452DcE2F523FB711a1\",\n  \"to\": \"0x4774fEd3f2838f504006BE53155cA9cbDDEe9f0c\",\n  
\"contract\": \"\",\n  \"value\": \"\",\n  \"data\": \"0x35086d290000000000000000000000000000000000000000000000000000000000000360\",\n  \"gas\": 200000,\n  \"gasPrice\": \"1000000000\",\n  \"cost\": \"200000000000000\",\n  \"nonce\": 19899,\n  \"state\": 1,\n  \"blockHash\": \"0xc29170d33141602a95b915c954c1068a380ef5169178eef2538beb6edb005810\"\n}\n```\n\nIf you want to cancel subscription, consider sending 👇, while replacing `\u003cfrom-address\u003e` \u0026 `\u003cto-address\u003e` with specific addresses you used when subscribing.\n\n```json\n{\n    \"name\": \"transaction/\u003cfrom-address\u003e/\u003cto-address\u003e\",\n    \"type\": \"unsubscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\nYou'll receive 👇 response, confirming unsubscription\n\n```json\n{\n    \"code\": 1,\n    \"message\": \"Unsubscribed from `transaction`\"\n}\n```\n\n### Real-time notification for events 📧\n\nFor listening to any events getting emitted by smart contracts deployed on network, you need to send 👇 JSON encoded payload to `/v1/ws` endpoint, after connecting over websocket\n\n```json\n{\n    \"name\": \"event/\u003ccontract-address\u003e/\u003ctopic-0-signature\u003e/\u003ctopic-1-signature\u003e/\u003ctopic-2-signature\u003e/\u003ctopic-3-signature\u003e\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n**Here we've some examples :**\n\n- Any event emitted by any smart contract in network\n\n```json\n{\n    \"name\": \"event/*/*/*/*/*\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n- Any event emitted by one specific smart contract\n\n```json\n{\n    \"name\": \"event/0xcb3fA413B23b12E402Cfcd8FA120f983FB70d8E8/*/*/*/*\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n- Specific event emitted by one specific smart contract\n\n```json\n{\n    \"name\": \"event/0xcb3fA413B23b12E402Cfcd8FA120f983FB70d8E8/0x2ab93f65628379309f36cb125e90d7c902454a545c4f8b8cb0794af75c24b807/*/*/*\",\n    \"type\": \"subscribe\",\n    
\"apiKey\": \"0x...\"\n}\n```\n\n- Specific event emitted by any smart contract in network\n\n```json\n{\n    \"name\": \"event/*/0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef/*/*/*\",\n    \"type\": \"subscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\n\u003e Sample code can be found [here](example/event_1.js)\n\nIf everything goes fine, your subscription will be confirmed with 👇 JSON encoded response\n\n```json\n{\n    \"code\": 1,\n    \"message\": \"Subscribed to `event`\"\n}\n```\n\nAfter that as long as your machine is reachable, `ette` will keep notifying you about every event emitted by smart contracts, to which you've subscribed to, in 👇 format\n\n```json\n{\n  \"origin\": \"0x0000000000000000000000000000000000001010\",\n  \"index\": 3,\n  \"topics\": [\n    \"0x4dfe1bbbcf077ddc3e01291eea2d5c70c2b422b415d95645b9adcfd678cb1d63\",\n    \"0x0000000000000000000000000000000000000000000000000000000000001010\",\n    \"0x0000000000000000000000004d31abd8533c00436b2145795cc4cef207c3364f\",\n    \"0x00000000000000000000000042eefcda06ead475cde3731b8eb138e88cd0bac3\"\n  ],\n  \"data\": \"0x0000000000000000000000000000000000000000000000000000454b2247e2000000000000000000000000000000000000000000000000001a96ae0b49dfc60000000000000000000000000000000000000000000000003a0df005a45c3dd5dd0000000000000000000000000000000000000000000000001a9668c02797e40000000000000000000000000000000000000000000000003a0df04aef7e85b7dd\",\n  \"txHash\": \"0xfdc5a29fdd57a53953a542f4c46b0ece5423227f26b1191e58d32973b4d81dc9\",\n  \"blockHash\": \"0x08e9ac45e4041a4309c6f5dd42b0fc78e00ca0cb8603965465206b22a63d07fb\"\n}\n```\n\nIf you want to cancel subscription, consider sending 👇, while replacing `\u003ccontract-address\u003e`, `\u003ctopic-{0,1,2,3}-signature\u003e` with specific values you used when subscribing.\n\n```json\n{\n    \"name\": 
\"event/\u003ccontract-address\u003e/\u003ctopic-0-signature\u003e/\u003ctopic-1-signature\u003e/\u003ctopic-2-signature\u003e/\u003ctopic-3-signature\u003e\",\n    \"type\": \"unsubscribe\",\n    \"apiKey\": \"0x...\"\n}\n```\n\nYou'll receive 👇 response, confirming unsubscription\n\n```json\n{\n    \"code\": 1,\n    \"message\": \"Unsubscribed from `event`\"\n}\n```\n\n\u003e Note: If graceful unsubscription not done, when `ette` finds client unreachable, it'll remove client subscription\n\n### Take snapshot of existing data store ➡️\n\nAssuming you've already a running instance of `ette` for some EVM compatible chain, you can always attempt to take snapshot of whole backing data store, so that if you need to spin up another instance of `ette`, you won't require to sync whole chain data, rather you use this binary data file, which can be used by `ette` for restoring from snapshot data.\n\nSetting `EtteMode` = 4, attempts to take snapshot of DB. \n\n![taking-snapshot](./sc/taking-snapshot.png)\n\n### Restore data from snapshot ⬅️\n\nOnce you've snapshotted binary encoded data file, you can attempt to restore from this \u0026 rebuild whole data store, with out syncing whole chain data. `EtteMode` = 5, attempts to do 👇.\n\n![restoring-from-snapshot](./sc/restoring-from-snapshot.png)\n\nOnce that's done, consider restarting `ette` in desired mode so that it can keep itself in sync with latest chain happenings.\n\n**More coming soon**\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fitzmeanjan%2Fette","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fitzmeanjan%2Fette","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fitzmeanjan%2Fette/lists"}