https://github.com/vbalalian/littlefield
Combined web-scraping, loading, and reporting tool for Littlefield simulation, built for use with Google Cloud Run functions and Google Cloud Scheduler
- Host: GitHub
- URL: https://github.com/vbalalian/littlefield
- Owner: vbalalian
- Created: 2024-09-25T22:31:16.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2025-02-17T19:52:34.000Z (5 months ago)
- Last Synced: 2025-05-19T11:45:39.192Z (about 2 months ago)
- Topics: bigquery, cloud-functions, extraction, google-cloud-platform, littlefield-simulation-game, loading, python, reporting, sql, webscraping
- Language: Python
- Homepage:
- Size: 86.9 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 7
Metadata Files:
- Readme: README.md

# Littlefield simulation tools
#### Scrape, save, load, and report data from the Littlefield factory simulation, using Google Cloud [Functions](https://cloud.google.com/functions), [Scheduler](https://cloud.google.com/scheduler), and [BigQuery](https://cloud.google.com/bigquery).

## Options:
#### Load data to an existing BigQuery table
#### Automatically post snapshot reports and useful charts to Discord

## Examples:
#### Daily Report:

#### Demand Chart:

#### Utilization Chart:

#### Standings Chart:
## Requirements:
- Existing Google Cloud Project
- Existing Google BigQuery dataset
- Existing Discord Webhook(s)

## To Run:
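The BigQuery dataset must exist before the first run. If you still need to create one, the `bq` CLI can do it; the project and dataset names below are placeholders:

```shell
# Create the dataset the function will load into (names are placeholders)
bq mk --dataset --location=US my-project:littlefield_data
```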
#### Part 1: Set Up Cloud Function
Step 1: Create a Google Cloud Function (1st gen) with an HTTP trigger and the following runtime environment variables:
- GROUP_ID: Group/team name
- GROUP_PW: Group/team password
- CLASS_URL: Littlefield login page URL
- GCP_PROJECT_ID: Google Cloud Project ID
- BIGQUERY_DATASET_NAME: Google BigQuery Dataset name
- DISCORD_REPORT_WEBHOOK: Discord webhook URL for the quick report
- DISCORD_EXCEL_WEBHOOK: Discord webhook URL for updated Excel files
- DISCORD_DEMAND_UTIL_WEBHOOK: Discord webhook URL for demand and utilization charts
- DISCORD_STANDINGS_WEBHOOK: Discord webhook URL for the standings chart

Step 2: Set runtime to **Python 3.11+** and entry point to **main**
Step 3: Copy [main.py](main.py) and [requirements.txt](requirements.txt) to function code
Step 4: Deploy the function
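Steps 1–4 can equivalently be done from the `gcloud` CLI. A sketch, with placeholder names and values (add the remaining `DISCORD_*_WEBHOOK` variables the same way):

```shell
# Deploy a 1st-gen HTTP function from the directory containing
# main.py and requirements.txt (all values are placeholders)
gcloud functions deploy littlefield-tools \
  --no-gen2 \
  --runtime=python311 \
  --trigger-http \
  --entry-point=main \
  --source=. \
  --set-env-vars=GROUP_ID=myteam,GROUP_PW=secret,GCP_PROJECT_ID=my-project,BIGQUERY_DATASET_NAME=littlefield_data
```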
Step 5: Copy Trigger URL for use with Google Cloud Scheduler
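For orientation, a 1st-gen HTTP function's entry point receives a Flask-style request object and reads its configuration from the environment variables above. This is a minimal illustrative sketch, not the repository's actual `main.py`; the defaults and structure are assumptions:

```python
import os

# Default body flags, mirroring the Scheduler body shown in Part 2 below
# (illustrative; the real main.py may handle this differently).
DEFAULT_BODY = {
    "bigquery_factory": True,
    "bigquery_standings": True,
    "bigquery_settings": True,
    "discord_report": True,
    "discord_demand_chart": True,
    "discord_util_chart": True,
    "discord_standings_chart": True,
    "discord_excel": True,
    "avg": 5,
}

def main(request):
    """HTTP entry point: merge the request body with defaults and
    decide which tasks to run."""
    body = {**DEFAULT_BODY, **(request.get_json(silent=True) or {})}
    group_id = os.environ.get("GROUP_ID", "")  # set as a runtime env var
    tasks = [k for k, v in body.items() if k != "avg" and v]
    return {"group": group_id, "tasks": tasks, "avg": body["avg"]}
```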
#### Part 2: Set Up Cloud Scheduler
Step 1: Create a Google Cloud Scheduler job

Step 2: Set the desired job run frequency using a [cron expression](https://crontab.guru/)
Step 3: Set the Target type to **HTTP** and use the **Trigger URL** from Part 1 as the execution URL
Step 4: Paste the following into the body (`avg` sets the number of periods used for the Discord report's moving averages)
```json
{
"bigquery_factory": true,
"bigquery_standings": true,
"bigquery_settings": true,
"discord_report": true,
"discord_demand_chart": true,
"discord_util_chart": true,
"discord_standings_chart": true,
"discord_excel": true,
"avg": 5
}
```

Step 5: Adjust the parameters based on desired usage
Step 6: Create the job
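The scheduler job can also be created from the `gcloud` CLI. A sketch, with a placeholder job name, schedule, and function URL (use the Trigger URL copied in Part 1):

```shell
# Create a Scheduler job that POSTs the body to the function every hour
# (all values are placeholders)
gcloud scheduler jobs create http littlefield-refresh \
  --schedule="0 * * * *" \
  --uri="https://REGION-PROJECT.cloudfunctions.net/littlefield-tools" \
  --http-method=POST \
  --headers="Content-Type=application/json" \
  --message-body='{"bigquery_factory": true, "discord_report": true, "avg": 5}'
```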
## Success!
...hopefully. Please let me know if you have any issues.