https://github.com/arcxteam/allora-usa-election
Run a model predicting the winning party of the 2024 USA presidential election: an Allora worker for topic 11 with initial inferences of token R (Republican) and D (Democrat)
- Host: GitHub
- URL: https://github.com/arcxteam/allora-usa-election
- Owner: arcxteam
- License: apache-2.0
- Created: 2024-10-07T10:41:59.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-12-22T17:12:30.000Z (5 months ago)
- Last Synced: 2025-04-12T19:09:07.124Z (about 2 months ago)
- Topics: ai, allora, allora-node, crypto-prediction, forecasting, polymarket, prediction-model, presidential-election, testnet-node, trained-models, usa-election-result
- Language: Python
- Homepage:
- Size: 194 KB
- Stars: 4
- Watchers: 1
- Forks: 1
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE
README

# Disclaimer: Read This First!
**`This campaign rewards users who run worker nodes providing inferences for the US presidential election party winner once a day. Every inference should be the likelihood of the Republican party winning the election.`** Source: [run-inference-political](https://app.allora.network/points/campaign/run-inference-political)

## 1. Components
- **Worker**: The node that publishes inferences to the Allora chain.
- **Inference**: A container that runs inferences, maintains the model state, and responds to internal inference requests via a Flask application (a minimal sketch follows this list). It serves the prediction model (default `SVR`) that outputs the party's win probability.
- **Updater**: A cron-like container that updates the inference node's data by fetching the latest market information from the data provider every day, keeping the model current with new market trends.
- **Topic ID**: This worker runs on `TopicId 11`.
- **TOKEN=D**: for inference `D`, Democrat.
- **TOKEN=R**: for inference `R`, Republican.
- **MODEL**: Use your own model, modify it, or check `model.py` in the `/models` folder.
- **Probability**: The prediction is a percentage between `0` and `100%`.
- **Dataset**: [polymarket.com](https://polymarket.com/elections)
- **Expected result**: One inference every `24` hours.
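For orientation only, here is a minimal hypothetical sketch of what the inference container's Flask endpoint could look like. The route shape mirrors the `curl` test in section 2 and the `"value"` response field shown there; the `model.pkl` path and the `predict_probability` method are assumptions, not the repository's actual code.

```python
# Hypothetical sketch of the inference container's Flask service.
# Route shape follows the curl example in section 2 (/inference/<token>);
# the model path and predict_probability method are assumptions.
import pickle

from flask import Flask, jsonify

app = Flask(__name__)

with open("model/model.pkl", "rb") as f:  # assumed path, see the tree below
    model = pickle.load(f)


@app.route("/inference/<token>")
def inference(token: str):
    if token not in ("R", "D"):
        return jsonify(error="token must be R or D"), 400
    # Assume the model exposes a method returning a win probability in 0-100.
    probability = model.predict_probability(token)
    return jsonify(value=f"{probability:.4f}")


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```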
### Setup Worker
The structure of the topic 11 Allora Worker Node:
```sh
./root/allora-usa-election
├── config.json
├── docker-compose.yml
├── Dockerfile
├── app.py
├── model.py
├── requirements.txt
├── worker-data
│   └── environment
├── inference-data
│   └── dataset.csv (R/D)
└── model
    └── model.pkl (R/D)
```
1. **Clone this repository**
```sh
git clone https://github.com/arcxteam/allora-usa-election.git
cd allora-usa-election
```
2. **Provide and configure the environment file (model tuning)**
Copy the example `.env.example` and set your variables:
```sh
cp .env.example .env
nano .env
```
Here are the currently accepted configuration variables (a short sketch of how they might be read follows this list):
- TOKEN= `D` or `R`
- MODEL= the model name (defaults to `SVR`, or plug in your own model)
- Save with `Ctrl+X`, then `Y` and `Enter`
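For orientation only, a minimal sketch of how an inference app might read these variables; the defaults and validation shown are assumptions.

```python
# Hypothetical sketch: read the TOKEN and MODEL settings from the environment.
# Variable names follow the list above; defaults and validation are assumptions.
import os

TOKEN = os.getenv("TOKEN", "R")    # "R" (Republican) or "D" (Democrat)
MODEL = os.getenv("MODEL", "SVR")  # default model name per this README

if TOKEN not in ("R", "D"):
    raise ValueError(f"TOKEN must be 'R' or 'D', got {TOKEN!r}")

print(f"Running inference for token {TOKEN} with model {MODEL}")
```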
3. **Modify the model tuning (optional)**
Adjust the model tuning or check the `/models` folder; a hypothetical sketch of such a `model.py` follows the command below:
```sh
nano model.py
```
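Since the default `MODEL` is `SVR` and the dataset lives in `inference-data/dataset.csv`, here is a minimal hypothetical sketch of what such a `model.py` could look like; the column names (`day`, `probability`) and hyperparameters are assumptions, not the repository's actual code.

```python
# Hypothetical sketch of a model.py along the lines described above:
# fit an SVR on daily Polymarket probabilities and persist it as model.pkl.
# File paths, column names ("day", "probability"), and hyperparameters are assumptions.
import pickle

import pandas as pd
from sklearn.svm import SVR


def train(dataset_path: str = "inference-data/dataset.csv",
          model_path: str = "model/model.pkl") -> None:
    df = pd.read_csv(dataset_path)
    X = df[["day"]].values        # e.g. days since the start of the series
    y = df["probability"].values  # daily win probability, 0-100

    model = SVR(kernel="rbf", C=10.0, epsilon=0.5)
    model.fit(X, y)

    with open(model_path, "wb") as f:
        pickle.dump(model, f)


if __name__ == "__main__":
    train()
```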
4. **Edit your config & initialize the worker**
Edit your wallet key, wallet name, RPC endpoint, interval, etc.:
```sh
nano config.json
```
Run the following commands from the root directory to initialize the worker (an optional config sanity check follows this step):
```sh
chmod +x init.config
./init.config
```
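Optionally, before starting the services, a hypothetical sanity check that `config.json` parses and targets topic 11; the `worker`/`topicId` field names are assumptions and may differ from this repository's actual config layout.

```python
# Hypothetical sanity check: confirm config.json is valid JSON before starting
# the services. The "worker"/"topicId" field names are assumptions; adjust them
# to the actual keys used in your config.json.
import json

with open("config.json") as f:
    config = json.load(f)

topics = [w.get("topicId") for w in config.get("worker", [])]
print("Configured topic IDs:", topics)
assert 11 in topics, "expected topic 11 for the US election worker"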
5. **Start the Services**
Run the following command to start the worker, inference, and updater containers:
```sh
docker compose up --build -d
```
Check that the containers are running:
```sh
docker compose logs -f --tail=100
```
To confirm that the worker successfully sends inferences to the chain, look for the following logs:
```
{"level":"debug","msg":"Send Worker Data to chain","txHash":,"time":,"message":"Success"}
```
## 2. Testing Inference Only
Send requests to the inference model. For example, request the probability for Democrat (`D`) or Republican (`R`); a Python alternative follows the `curl` example:
```sh
curl http://127.0.0.1:8000/inference/D
```
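If you prefer checking from Python, here is a small hypothetical equivalent of the `curl` call above (assumes the local port `8000` shown above and requires the `requests` package).

```python
# Hypothetical alternative to the curl check above, using the requests package.
import requests

for token in ("D", "R"):
    resp = requests.get(f"http://127.0.0.1:8000/inference/{token}", timeout=10)
    resp.raise_for_status()
    print(token, resp.json()["value"])  # expected: a probability like "xx.xxxx"
```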
Expected response (a number):
```
"value":"xx.xxxx"
```

## 3. Note: Check the logs to confirm the worker registered successfully
```sh
docker logs -f (your container id) 2>&1 | head -n 70
```
```sh
docker logs -f (your container id) 2>&1 | grep -i "Success"
```
**Result every 24 hours: the worker only gets 1 request per day for topic 11, and points depend on how your worker's prediction compares to the ground truth from Polymarket.**