{"id":26938833,"url":"https://github.com/aeonasoft/audioflow","last_synced_at":"2025-09-01T22:37:39.332Z","repository":{"id":285558381,"uuid":"951802767","full_name":"aeonasoft/audioflow","owner":"aeonasoft","description":"Open Source Audio News Subscription Service (Google Trends, Hacker News \u0026 more).","archived":false,"fork":false,"pushed_at":"2025-04-01T11:43:57.000Z","size":24650,"stargazers_count":12,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-07-11T16:05:10.167Z","etag":null,"topics":["airflow","bbc-news","beautifulsoup","bigquery","celery","docker","fastapi","gmail-smtp","google-ai-studio","google-trends","hacker-news","jinja2","postgresql","pydantic","redis","requests","sqlalchemy","sqlite"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/aeonasoft.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-03-20T09:12:55.000Z","updated_at":"2025-06-29T06:53:49.000Z","dependencies_parsed_at":null,"dependency_job_id":"38dbad0e-829f-4078-8877-1ff251ad5c7e","html_url":"https://github.com/aeonasoft/audioflow","commit_stats":null,"previous_names":["aeonasoft/audioflow"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/aeonasoft/audioflow","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/aeonasoft%2Faudioflow","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/aeonasoft%2Faudioflow/tags","releases_url":"https://repos.ecos
yste.ms/api/v1/hosts/GitHub/repositories/aeonasoft%2Faudioflow/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/aeonasoft%2Faudioflow/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/aeonasoft","download_url":"https://codeload.github.com/aeonasoft/audioflow/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/aeonasoft%2Faudioflow/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":268657590,"owners_count":24285532,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-08-04T02:00:09.867Z","response_time":79,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["airflow","bbc-news","beautifulsoup","bigquery","celery","docker","fastapi","gmail-smtp","google-ai-studio","google-trends","hacker-news","jinja2","postgresql","pydantic","redis","requests","sqlalchemy","sqlite"],"created_at":"2025-04-02T14:14:48.512Z","updated_at":"2025-08-04T06:16:49.215Z","avatar_url":"https://github.com/aeonasoft.png","language":"Python","readme":"\n# Audioflow\nThe AudioFlow project is a fully automated news service that transforms text-based news into multilingual audio files and delivers them via email. 
It is designed for scalability and automation, making it a powerful tool for personalized news distribution.\n\n\n![Audioflow Promotion](https://github.com/user-attachments/assets/3b9600cd-c56a-4758-aa17-bc2d4fe47433)\n\n\n### Container setup, as defined in the docker-compose file:\n\n(a) Worker: `airflow_airflow-worker_1` (Where all the magic happens :)\n- Runs DAG tasks from the Celery queue.\n- Executes the actual Python code inside DAGs.\n- Logs output for task execution.\n\n(b) Web UI \u0026 DAG Management: `airflow_airflow-webserver_1`\n- Hosts the Airflow UI (accessible at http://your-server-ip:8080).\n- Serves the DAG dashboard and shows run status.\n- Does not schedule or execute DAG tasks, but allows you to trigger them.\n\n(c) DAG Scheduling: `airflow_airflow-scheduler_1`\n- Schedules DAG runs and assigns tasks to workers.\n- Ensures that DAGs run at the correct intervals.\n- Moves DAG tasks to the Celery queue for execution.\n\n(d) Other Containers:\n- `airflow_postgres_1`: Airflow metadata database.\n- `airflow_redis_1`: Celery backend (aka: message broker).\n- `airflow_flower_1`: Celery monitoring UI (available at http://your-server-ip:5555).\n- `airflow_airflow-init_1`: Used for initializing the Airflow database.\n\n---\n\n### Audioflow's ER Diagram\n\nNote: \n- To view the diagram in VS Code, first install the extension \"Markdown Preview Mermaid Support\", then right-click and choose \"Open Preview\".\n- Alternatively, browse to https://mermaid.live/ and paste the model's syntax into the live editor.\n\n#### (A) Relationships Explained\n\n| Relationship | Verb | Description |\n|-------------|--------------|-------------|\n| **Subscriber → Email** | `subscribes_to` | A **subscriber** subscribes to receive multiple emails. **(1:M)** |\n| **Email → email_audio_association** | `includes` | An **email** includes multiple audio files via an association table. 
**(M:M via Junction Table)** |\n| **AudioFile → email_audio_association** | `is_linked_to` | An **audio file** is linked to multiple emails via an association table. **(M:M via Junction Table)** |\n| **Email → AudioFile** | `references` | An **email** references multiple audio files, forming an **M:M relationship**. |\n| **BBCArticleEnhanced → AudioFile** | `is_converted_to` | A **BBC article** is converted into **one or more audio files**. **(1:M)** |\n| **HackerNewsEnhanced → AudioFile** | `is_converted_to` | A **Hacker News article** is converted into **one or more audio files**. **(1:M)** |\n| **GoogleTrendsEnhanced → AudioFile** | `is_converted_to` | A **Google Trends topic** is converted into **one or more audio files**. **(1:M)** |\n*Note:  Subscriber {text email_address **UNIQUE**}*\n\n---\n\n#### (B) Diagram\n\n```mermaid\n\nerDiagram\n    \n    Subscriber ||--o{ Email : receives\n    Email ||--o{ email_audio_association : contains\n    AudioFile ||--o{ email_audio_association : is_attached\n    Email }|--|{ AudioFile : \"M:M\"\n\n    BBCArticleEnhanced ||--o{ AudioFile : \"1:M\"\n    HackerNewsEnhanced ||--o{ AudioFile : \"1:M\"\n    GoogleTrendsEnhanced ||--o{ AudioFile : \"1:M\"\n\n    Subscriber {\n        int id PK\n        text name\n        text lang_preference\n        text email_address\n        text create_date\n    }\n\n    Email {\n        int id PK\n        int subscriber_id FK\n        text subject\n        text body\n        text status\n        text sent_date\n        text create_date\n    }\n    \n    email_audio_association {\n        int email_id PK, FK\n        int audio_file_id PK, FK\n    }\n\n    AudioFile {\n        int id PK\n        int bbc_id FK\n        int hnews_id FK\n        int gtrends_id FK\n        text bbc_en\n        text bbc_de\n        text bbc_fr\n        text hnews_en\n        text hnews_de\n        text hnews_fr\n        text gtrends_en\n        text gtrends_de\n        text gtrends_fr\n        text audio_file_en\n  
      text audio_file_de\n        text audio_file_fr\n        text create_date\n    }\n\n    BBCArticleEnhanced {\n        int id PK\n        text link\n        text title_en\n        text description_en\n        text pub_date\n        text media_url\n        text title_de\n        text title_fr\n        text description_de\n        text description_fr\n        float sentiment_score_en\n        text create_date\n    }\n\n    HackerNewsEnhanced {\n        int id PK\n        int hn_id\n        text pub_date\n        int rank\n        text link\n        text title_en\n        text text_en\n        int score\n        int comment_count\n        text title_de\n        text title_fr\n        text description_en\n        text description_de\n        text description_fr\n        float sentiment_score_en\n        text create_date\n    }\n\n    GoogleTrendsEnhanced {\n        int id PK\n        text term_en\n        text description_en\n        text description_de\n        text description_fr\n        float sentiment_score_en\n        text link\n        text pub_date\n        text create_date\n    }\n\n```\n\n---\n\n### Credentials \u0026 Configs\n\n#### (a) Create \"keys\"-folder and \".env\"-file in root dir\n\nYou don't need any keys for the HackerNews or BBC News sources, but to access Google Trends data (via BigQuery), as well as Gemini (via Google AI Studio) and Gmail SMTP, you need to set up your GENAI_API_KEY, BIGQUERY_PROJECT_ID, BIGQUERY_CREDENTIALS_PATH \u0026 SMTP_* values. The .gitignore file already excludes this folder and file, so you can set them up and add the details exactly as below. 
May the force be with you :)\n\n#### (b) Google Trends (BIGQUERY_PROJECT_ID \u0026 BIGQUERY_CREDENTIALS_PATH)\nDownload your own unique \"foobar-63738393030-t464474-62902egh.json\"-file and add it under \"keys\"-folder.\n- Step 1: Go to Google Cloud Console and select the project you want to use for BigQuery.\n- Step 2: Create a Service Account\n    - In the left sidebar, go to IAM \u0026 Admin → Service Accounts.\n    - Click Create Service Account (at the top). \n    - Fill in:\n        - Service Account Name: (e.g., bigquery-access)\n        - Service Account ID: (automatically generated)\n        - Description: (optional)\n        - Click Create and Continue.\n- Step 3: Assign Permissions\n    - Under \"Grant this service account access to the project\", select:\n        - BigQuery Admin (for full access) or BigQuery Data Viewer (for read-only access).\n        - Click Continue.\n- Step 4: Create \u0026 Download the JSON Key\n    - Click Done (after setting permissions).\n    - Find the newly created Service Account in the list.\n    - Click on it to open the details.\n    - Go to the Keys tab. 
Click \"Add Key\" → \"Create New Key\".\n    - Select JSON format and click Create.\n    - The JSON file will be automatically downloaded to your computer.\n- Step 5: Move the File to: `/home/ubuntu/airflow/keys/foobar-63738393030-t464474-62902egh.json`\n\n#### (c) GENAI_API_KEY: Google AI Studio\nTo get an API key for Google AI Studio (Gemini API), follow these steps:\n- Step 1: Go to Google AI Studio\n    - Open Google AI Studio.\n    - Sign in with your Google Account.\n- Step 2: Generate an API Key\n    - Click on your profile icon (top-right corner).\n    - Select \"API Keys\" from the dropdown menu.\n    - Click \"Create API Key\".\n    - Copy the generated API key (you will need it for authentication).\n- Step 3: Store the API Key Securely\n    - Save it in a .env file.\n\n#### (d) Email Gmail SMTP\n- Step 1: Gmail requires \"App Passwords\" (not your normal password) for SMTP access.\n    - Enable App Passwords (Google no longer supports \"less secure apps\")\n    - Go to your Google Account Security page: [https://myaccount.google.com/security](https://myaccount.google.com/security)\n    - Scroll to \"How you sign in to Google\" and enable:\n        - For regular accounts: Enable 2-Step Verification, then generate an App Password.\n        - For workspace accounts: You may need an admin to allow SMTP access.\n    - Generate an App Password:\n        - Click App Passwords.\n        - Select Mail → Other → Name it something like \"SMTP Access\".\n        - Click Generate and copy the password.\n- Step 2: Configure Environment Variables\n    - Edit your `.env` file:\n    ```sh\n    SMTP_SERVER=\"smtp.gmail.com\"\n    SMTP_PORT=587\n    SMTP_USERNAME=\"your-email@gmail.com\"\n    SMTP_PASSWORD=\"your-app-password\"  # Use the App Password from Step 1\n    SMTP_SENDER_EMAIL=\"your-email@gmail.com\"\n    ```\n- Step 3: Test sending an email with Python\n    - `smtplib` is part of Python's standard library, so there is nothing to install.\n    - Use this Python script:\n  
  ```python\n    import smtplib\n    import os\n\n    SMTP_SERVER = os.getenv(\"SMTP_SERVER\", \"smtp.gmail.com\")\n    SMTP_PORT = int(os.getenv(\"SMTP_PORT\", 587))\n    SMTP_USERNAME = os.getenv(\"SMTP_USERNAME\")\n    SMTP_PASSWORD = os.getenv(\"SMTP_PASSWORD\")\n    SMTP_SENDER_EMAIL = os.getenv(\"SMTP_SENDER_EMAIL\")\n    TO_EMAIL = \"foobarbar@example.com\"\n\n    subject = \"Test Email from Python\"\n    body = \"Hello, this is a test email from my Python script.\"\n\n    message = f\"Subject: {subject}\\n\\n{body}\"\n\n    try:\n        with smtplib.SMTP(SMTP_SERVER, SMTP_PORT) as server:\n            server.starttls()\n            server.login(SMTP_USERNAME, SMTP_PASSWORD)\n            server.sendmail(SMTP_SENDER_EMAIL, TO_EMAIL, message)\n        print(\"Email sent successfully!\")\n    except Exception as e:\n        print(\"Error:\", e)\n    ```\n    - Replace `\"foobarbar@example.com\"` with your test email.\n    - Run the script and check your inbox.\n\n- Step 4: Additional Step: Update AIRFLOW__SMTP__SMTP_* in your docker-compose.yaml (Optional)\n```yml\nversion: \"3\"\nx-airflow-common: \u0026airflow-common\n  build: \n    context: .\n    dockerfile: Dockerfile\n  environment: \u0026airflow-common-env\n    # ...\n    AIRFLOW__SMTP__SMTP_HOST: \"smtp.gmail.com\"\n    AIRFLOW__SMTP__SMTP_PORT: 587\n    AIRFLOW__SMTP__SMTP_USER: \"foobarbar@gmail.com\"\n    AIRFLOW__SMTP__SMTP_PASSWORD: \"foobarbar\"\n    AIRFLOW__SMTP__SMTP_MAIL_FROM: \"foobarbar@gmail.com\"\n    # ...\n```\n*Note: You can also look at other services like [Mailtrap.io](https://mailtrap.io/) if you plan to onboard more subscribers.*\n\n#### (e) If you have not already, add all the details to your \".env\"-file:\n```sh\n# Environment:\nAPP_ENV=dev\n\n# Data Extraction \u0026 Context Generation:\nGENAI_API_KEY = \"....\"\nBIGQUERY_PROJECT_ID=\"....\"\nBIGQUERY_CREDENTIALS_PATH=\"....\"\n\n# Scheduling:\nAIRFLOW_UID=1000\nAIRFLOW_GID=0\n\n# 
Emailing:\nSMTP_SERVER=\"smtp.gmail.com\"\nSMTP_PORT=587\nSMTP_USERNAME=\"....\"\nSMTP_PASSWORD=\"....\"\nSMTP_SENDER_EMAIL=\"....\"\n```\n\n*Note: It helped me to first test each endpoint individually, and then via the DAGs afterwards.*\n\n---\n\n### Bash Commands\n\n#### Common Commands for Debugging: \n```sh\ndocker logs \u003ccontainer_name\u003e\ndocker logs airflow_airflow-webserver_1\n\ndocker exec -it \u003ccontainer_name\u003e /bin/bash\ndocker exec -it airflow_airflow-webserver_1 /bin/bash\n\ndocker exec -it \u003ccontainer_name\u003e bash\ndocker exec -it airflow_airflow-webserver_1 bash\ndocker exec -it airflow_airflow-worker_1 bash\ndocker run -it apache/airflow:latest-python3.9 bash\n\ndocker exec -it \u003ccontainer_name\u003e ls -l /opt/airflow/data\ndocker exec -it airflow_airflow-worker_1 ls -l /opt/airflow/data\n\necho $BIGQUERY_CREDENTIALS_PATH\necho $_PIP_ADDITIONAL_REQUIREMENTS\n```\n\n#### After updating your docker-compose.yml, run:\n```sh\nubuntu@ip-172-01-35-23:~/airflow $ docker-compose down\nubuntu@ip-172-01-35-23:~/airflow $ docker-compose up -d --build # -d flag: Run in the background.\nubuntu@ip-172-01-35-23:~/airflow $ docker-compose up --build --force-recreate \n```\n\n#### Enter main/specific container:\n```sh\nubuntu@ip-172-01-35-23:~/airflow $ docker exec -it airflow_airflow-worker_1 bash\ndefault@3abc5e24ca4b:/opt/airflow$ \ndefault@3abc5e24ca4b:/opt/airflow$ python3 run.py\n```\n\n#### Run the FastAPI app: `uvicorn app:app --reload`\n- Open your browser and navigate to either:\n  - http://localhost:8000 to access the app.\n  - http://{public_ip}:8000 (e.g. 
http://52.113.52.270:8000/)\n- Interactive API documentation is provided here: http://localhost:8000/docs (or http://localhost:8000/openapi.json).\n```sh\ndefault@3abc5e24ca4b:/opt/airflow$ python3 app/database.py\ndefault@3abc5e24ca4b:/opt/airflow$ python utils/mockdata.py\n```\n---\n\n### SQL Bash Commands\n\n#### Verify the tables exist: subscribers, emails, email_audio_association, etc...\n```sh\ndefault@3abc5e24ca4b:/opt/airflow$ sqlite3 data/dev/data_dev.db \".tables\" \ndefault@3abc5e24ca4b:/opt/airflow$ sqlite3 data/dev/data_dev.db \".schema bbc_articles_enhanced\"\n```\n\n#### List all attached databases to ensure you are querying the right one:\n```sh\ndefault@3abc5e24ca4b:/opt/airflow$ sqlite3 data/dev/data_dev.db \"PRAGMA database_list;\" # 0|main|/opt/airflow/data/dev/data_dev.db\n```\n\n#### Ensure the data dir is correctly mounted in docker-compose. You can only access this folder/file/db in VS Code if it is mounted in the docker-compose file.\n```yml\nvolumes:\n  - ./data:/opt/airflow/data\n```\n\n##### ALL TABLES\n```sh\nsqlite3 data/dev/data_dev.db \"SELECT name FROM sqlite_master WHERE type='table';\"\n# The queries below can also be run against the other databases:\n# data/dev/data_dev.db\n# data/prod/data_prod.db\n# data/mock/data_mock.db\n```\n\n##### BBC NEWS TABLE\n```sh\nsqlite3 data/dev/data_dev.db \".schema bbc_articles_enhanced\"\nsqlite3 data/dev/data_dev.db \"SELECT COUNT(*) FROM bbc_articles_enhanced;\"\nsqlite3 data/dev/data_dev.db \"SELECT link, COUNT(*) FROM bbc_articles_enhanced GROUP BY link HAVING COUNT(*) \u003e 1;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM bbc_articles_enhanced;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM bbc_articles_enhanced ORDER BY create_date DESC LIMIT 10;\"\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM bbc_articles_enhanced;\" \u003e data/bbc_articles_enhanced.csv\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM bbc_articles_enhanced LIMIT 15;\" \u003e 
data/bbc_articles_enhanced.csv\n# ⚠️ DANGER: the commands below delete data\nsqlite3 data/dev/data_dev.db \"DELETE FROM bbc_articles_enhanced;\"\nsqlite3 data/dev/data_dev.db \"DROP TABLE bbc_articles_enhanced;\"\n```\n\n##### Hackernews TABLE\n```sh\nsqlite3 data/dev/data_dev.db \".schema hnews_articles_enhanced\"\nsqlite3 data/dev/data_dev.db \"SELECT COUNT(*) FROM hnews_articles_enhanced;\"\nsqlite3 data/dev/data_dev.db \"SELECT hn_id, COUNT(*) FROM hnews_articles_enhanced GROUP BY hn_id HAVING COUNT(*) \u003e 1;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM hnews_articles_enhanced;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM hnews_articles_enhanced ORDER BY create_date DESC LIMIT 10;\"\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM hnews_articles_enhanced;\" \u003e data/hnews_articles_enhanced.csv\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM hnews_articles_enhanced LIMIT 15;\" \u003e data/hnews_articles_enhanced.csv\n# ⚠️ DANGER: the commands below delete data\nsqlite3 data/dev/data_dev.db \"DELETE FROM hnews_articles_enhanced;\"\nsqlite3 data/dev/data_dev.db \"DROP TABLE hnews_articles_enhanced;\"\n```\n\n##### GTrends TABLE\n```sh\nsqlite3 data/dev/data_dev.db \".schema gtrends_terms_enhanced\"\nsqlite3 data/dev/data_dev.db \"SELECT COUNT(*) FROM gtrends_terms_enhanced;\"\nsqlite3 data/dev/data_dev.db \"SELECT term_en, COUNT(*) FROM gtrends_terms_enhanced GROUP BY term_en HAVING COUNT(*) \u003e 1;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM gtrends_terms_enhanced;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM gtrends_terms_enhanced ORDER BY create_date DESC LIMIT 10;\"\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM gtrends_terms_enhanced;\" \u003e data/gtrends_terms_enhanced.csv\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM 
gtrends_terms_enhanced LIMIT 15;\" \u003e data/gtrends_terms_enhanced.csv\n# ⚠️ DANGER: the commands below delete data\nsqlite3 data/dev/data_dev.db \"DELETE FROM gtrends_terms_enhanced;\"\nsqlite3 data/dev/data_dev.db \"DROP TABLE gtrends_terms_enhanced;\"\n```\n\n##### AudioFiles TABLE\n```sh\nsqlite3 data/dev/data_dev.db \".schema audio_files\"\nsqlite3 data/dev/data_dev.db \"SELECT COUNT(*) FROM audio_files;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM audio_files LIMIT 10;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM audio_files ORDER BY create_date DESC LIMIT 10;\"\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM audio_files;\" \u003e data/audio_files.csv\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM audio_files LIMIT 15;\" \u003e data/audio_files.csv\n# ⚠️ DANGER: the commands below delete data\nsqlite3 data/dev/data_dev.db \"DELETE FROM audio_files;\"\nsqlite3 data/dev/data_dev.db \"DROP TABLE audio_files;\"\n```\n\n##### Emails TABLE\n```sh\nsqlite3 data/dev/data_dev.db \".schema emails\"\nsqlite3 data/dev/data_dev.db \"SELECT COUNT(*) FROM emails;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM emails;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM emails ORDER BY create_date DESC LIMIT 10;\"\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM emails;\" \u003e data/emails.csv\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM emails LIMIT 15;\" \u003e data/emails.csv\n# ⚠️ DANGER: the commands below delete data\nsqlite3 data/dev/data_dev.db \"DELETE FROM emails;\"\nsqlite3 data/dev/data_dev.db \"DROP TABLE emails;\"\n# sqlite3 data/dev/data_dev.db \"PRAGMA foreign_keys=off; DROP TABLE emails; PRAGMA foreign_keys=on;\"\n```\n\n##### Subscribers 
TABLE\n```sh\nsqlite3 data/dev/data_dev.db \".schema subscribers\"\nsqlite3 data/dev/data_dev.db \"SELECT COUNT(*) FROM subscribers;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM subscribers;\"\nsqlite3 data/dev/data_dev.db \"SELECT * FROM subscribers ORDER BY create_date DESC LIMIT 10;\"\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM subscribers;\" \u003e data/subscribers.csv\nsqlite3 -header -csv data/dev/data_dev.db \"SELECT * FROM subscribers LIMIT 10;\" \u003e data/subscribers.csv\n# ⚠️ DANGER: the commands below delete data\nsqlite3 data/dev/data_dev.db \"DELETE FROM subscribers;\"\nsqlite3 data/dev/data_dev.db \"DROP TABLE subscribers;\"\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Faeonasoft%2Faudioflow","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Faeonasoft%2Faudioflow","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Faeonasoft%2Faudioflow/lists"}