# AI Web Scraper – Quick-start Cheat Sheet

[→ View the full tutorial here](https://pgflow.dev/tutorials/ai-web-scraper/)

---

## 0. One-time setup

```bash
# Clone & enter the repo
git clone https://github.com/pgflow-dev/ai-web-scraper.git
cd ai-web-scraper

# Copy the example environment file and add your OpenAI key (required by the tasks)
cp supabase/functions/.env.example supabase/functions/.env
# Edit the .env file and add your OpenAI API key
# OPENAI_API_KEY=sk-...
```

---

## 1. Boot the local Supabase stack

```bash
npx supabase@2.22.12 start
```

---

## 2. Run all database migrations (table + flow)

```bash
npx supabase@2.22.12 migrations up --local
```

---

## 3. Serve the Edge Functions (keep this terminal open)

```bash
npx supabase@2.22.12 functions serve
```

---

## 4. Start the worker (new terminal)

```bash
curl -X POST http://127.0.0.1:54321/functions/v1/analyze_website_worker
```

The first `curl` boots the worker; it stays alive and polls for jobs.

---

## 5. Trigger a job (SQL editor or psql)

```sql
select * from pgflow.start_flow(
  flow_slug => 'analyzeWebsite',
  input     => '{"url":"https://supabase.com"}'
);
```

---

## 6. Check results

```sql
select * from websites;                 -- scraped data
select * from pgflow.runs;              -- run history
```

That’s it – scrape, summarize, tag, store!
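---

## 7. Optional: watch a run's status

`start_flow` returns a row describing the new run, and `pgflow.runs` tracks its lifecycle. The query below is a sketch for watching the latest run until it finishes — the column names (`run_id`, `status`, `output`, `started_at`) are assumptions based on pgflow's default schema, so verify them against your install (e.g. `\d pgflow.runs` in psql) before relying on them:

```sql
-- Show the most recent run and its current state.
-- Assumed columns: run_id, flow_slug, status, output, started_at
-- (check your actual pgflow.runs schema first).
select run_id, flow_slug, status, output
from pgflow.runs
order by started_at desc
limit 1;
```

Re-run the query (or wrap it in `\watch 2` in psql) until `status` reports completion; `output` should then contain the flow's final result.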