{"id":33288511,"url":"https://github.com/neverinfamous/kv-manager","last_synced_at":"2026-03-07T11:08:53.618Z","repository":{"id":322524049,"uuid":"1089850150","full_name":"neverinfamous/kv-manager","owner":"neverinfamous","description":"A modern, full-featured web application for managing Cloudflare Workers KV namespaces and keys, with enterprise-grade authentication via Cloudflare Access Zero Trust and GitHub SSO.","archived":false,"fork":false,"pushed_at":"2025-11-13T04:21:21.000Z","size":1754,"stargazers_count":2,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-11-13T06:12:39.456Z","etag":null,"topics":["cloudflare","cloudflare-kv","cloudflare-workers","developer-productivity","developer-tools","development-tools","github-sso","react","shadcn","tailwind","typescript","vite"],"latest_commit_sha":null,"homepage":"https://adamic.tech/articles/2025-11-05-kv-manager-v1-0-0","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/neverinfamous.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2025-11-04T22:37:18.000Z","updated_at":"2025-11-13T04:21:25.000Z","dependencies_parsed_at":null,"dependency_job_id":null,"html_url":"https://github.com/neverinfamous/kv-manager","commit_stats":null,"previous_names":["neverinfamous/kv-manager"],"tags_count":1,"template":false,"template_full_name":null,"purl":"pkg:github/neverinfam
ous/kv-manager","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neverinfamous%2Fkv-manager","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neverinfamous%2Fkv-manager/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neverinfamous%2Fkv-manager/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neverinfamous%2Fkv-manager/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/neverinfamous","download_url":"https://codeload.github.com/neverinfamous/kv-manager/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neverinfamous%2Fkv-manager/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":284948163,"owners_count":27089294,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-11-17T02:00:06.431Z","response_time":55,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cloudflare","cloudflare-kv","cloudflare-workers","developer-productivity","developer-tools","development-tools","github-sso","react","shadcn","tailwind","typescript","vite"],"created_at":"2025-11-17T20:01:35.035Z","updated_at":"2025-11-17T20:04:19.154Z","avatar_url":"https://github.com/neverinfamous.png","language":"TypeScript","readme":"# Cloudflare KV Manager\n\n*Last Updated: November 14, 2025*\n\nA modern, 
full-featured web application for managing Cloudflare Workers KV namespaces and keys, with enterprise-grade authentication via Cloudflare Access Zero Trust.\n\n**🎯 [Try the Live Demo](https://kv.adamic.tech/)** - See KV Manager in action\n\n**📰 [Read the v1.0.0 Release Article](https://adamic.tech/articles/2025-11-05-kv-manager-v1-0-0)** - Learn more about features, architecture, and deployment\n\n## Features\n\n### Namespace Management\n- Create, delete, and rename KV namespaces\n- Browse namespaces with key counts and metadata\n- Export entire namespaces to JSON or NDJSON format\n- Import keys from JSON or NDJSON files\n- Namespace-level audit logging\n\n### Key Operations\n- List keys with cursor-based pagination\n- Create, update, and delete individual keys\n- Full CRUD operations with dual metadata support\n- **TTL (expiration) management** - Minimum 60 seconds\n- **KV Native Metadata** - Up to 1024 bytes, stored in Cloudflare KV\n- **D1 Custom Metadata** - Unlimited size, stored in D1 database\n- Single-version backup and restore\n\n### Metadata \u0026 Tags\n- **KV Native Metadata**: Store up to 1024 bytes of JSON metadata directly in Cloudflare KV (retrieved with key value)\n- **D1 Custom Metadata**: Store unlimited JSON metadata in D1 database (searchable, no size limit)\n- **Tags (D1-Backed)**: Add unlimited tags to keys for organization and filtering\n- Search and filter by tags\n- Bulk tag operations (add/remove/replace)\n- Two separate metadata systems for different use cases\n\n### Search \u0026 Discovery\n- Cross-namespace search by key name (partial matches)\n- Filter by specific namespaces\n- Filter by tags (multiple tag support)\n- Real-time search with debouncing\n- Quick navigation to search results\n- **Note**: Search queries D1 metadata (key names, tags, custom metadata) - not KV values\n\n### Bulk Operations\n- **Bulk Delete**: Remove multiple keys at once\n- **Bulk Copy**: Copy keys between namespaces\n- **Bulk TTL Update**: Set expiration on 
multiple keys\n- **Bulk Tag**: Apply tags to multiple keys\n- Progress tracking with job IDs and event history\n- Batch processing (10,000 keys per operation)\n\n### Import/Export\n- Export namespaces in JSON or NDJSON format\n- Auto-detect format on import\n- Collision handling (skip/overwrite/fail)\n- Progress tracking for large operations\n- Download exported data as files\n\n### Job History\n- **Job History UI** - View complete history of all bulk operations\n- Timeline visualization showing job lifecycle events\n- Filter jobs by status (completed, failed, cancelled, running, queued)\n- Filter by operation type (export, import, bulk delete, bulk copy, bulk TTL, bulk tag)\n- Job cards displaying operation details, namespace, timestamps, and progress\n- Click any job to view detailed event timeline with milestones\n- \"View History\" button in progress dialog for immediate access\n- Pagination support for large job histories\n- User-specific history (only see your own jobs)\n\n### Audit Logging\n- Track all operations with user attribution\n- Filter by namespace or user\n- Filter by operation type\n- Pagination support\n- Export audit logs to CSV\n- Comprehensive operation tracking\n- **Job lifecycle event tracking** - Milestone events (started, 25%, 50%, 75%, completed/failed/cancelled) for all bulk operations\n- Event history API for job replay and debugging\n\n### User Interface\n- **Dark/Light Theme**: System, light, and dark theme support\n- **Navigation**: Switch between Namespaces, Search, Job History, and Audit Log views\n- **Responsive Design**: Works on desktop and mobile\n- **Modern UI**: Built with shadcn/ui components and Tailwind CSS\n\n## Architecture\n\n- **Frontend**: React 19.2.0 + TypeScript 5.9.3 + Vite 7.2.2 + Tailwind CSS 3.4.18 + shadcn/ui\n- **Backend**: Cloudflare Workers + KV + D1 (metadata) + Durable Objects (orchestration)\n- **Progress Tracking**: HTTP polling for job status (simple and reliable)\n- **Auth**: Cloudflare Access (Zero 
Trust)\n\n### Progress Tracking\n\nAll bulk operations (copy, delete, TTL updates, tag operations, import, export) use **HTTP polling** for progress updates:\n\n- **Async Processing**: Operations start immediately and process in background via Durable Objects\n- **Polling Updates**: Progress, current key, percentage, and errors retrieved via HTTP polling (1-second intervals)\n- **Job History**: Complete event timeline for every job with milestone tracking\n- **Progress Details**: See total keys, processed count, errors, current key being processed, and percentage completion\n- **Simple \u0026 Reliable**: No WebSocket connection issues or complexity\n\n## Docker Deployment\n\n**🐳 Quick Start with Docker**\n\nPull the latest image:\n\n```bash\ndocker pull writenotenow/kv-manager:latest\n```\n\nRun the container:\n\n```bash\ndocker run -d \\\n  -p 8787:8787 \\\n  -e ACCOUNT_ID=your_cloudflare_account_id \\\n  -e API_KEY=your_cloudflare_api_token \\\n  -e TEAM_DOMAIN=https://yourteam.cloudflareaccess.com \\\n  -e POLICY_AUD=your_cloudflare_access_aud_tag \\\n  --name kv-manager \\\n  writenotenow/kv-manager:latest\n```\n\nAccess at `http://localhost:8787`\n\n**📖 Full Docker Documentation:** See [DOCKER_README.md](./DOCKER_README.md) for complete deployment guides including:\n- Docker Compose configurations\n- Kubernetes deployments\n- Reverse proxy examples (Nginx, Traefik, Caddy)\n- Security best practices\n- Troubleshooting guide\n\n## Local Development\n\n### Prerequisites\n\n- Node.js 18+\n- npm or yarn\n- Wrangler CLI (`npm install -g wrangler`)\n\n### Setup\n\n1. **Install dependencies**:\n```bash\nnpm install\n```\n\n2. **Create environment file**:\n```bash\ncp .env.example .env\n```\n\n3. **Initialize local D1 database**:\n```bash\nnpx wrangler d1 execute kv-manager-metadata-dev --local --file=worker/schema.sql\n```\n\n4. 
**Start the development servers**:\n\nIn Terminal 1, start the frontend:\n\n```bash\nnpm run dev\n```\n\nIn Terminal 2, start the worker:\n\n```bash\nnpx wrangler dev --config wrangler.dev.toml --local\n```\n\n5. **Access the application**:\n- Frontend: http://localhost:5173\n- Worker API: http://localhost:8787\n\n### Local Development Notes\n\n- Authentication is **bypassed** for localhost requests\n- Mock data is returned when no Cloudflare credentials are provided\n- No secrets required for local development\n- CORS is configured to allow `http://localhost:5173`\n\n## Production Deployment\n\n### Prerequisites\n\n- Cloudflare account\n- Domain (optional, can use workers.dev)\n- Cloudflare Access configured for your domain\n\n### Setup\n\n1. **Create production configuration**:\n```bash\ncp wrangler.toml.example wrangler.toml\n```\n\n2. **Create D1 database**:\n\n```bash\nwrangler d1 create kv-manager-metadata\n```\n\nCopy the `database_id` from the output to your `wrangler.toml` file.\n\n3. **Initialize D1 schema**:\n\nFor new installations:\n```bash\nwrangler d1 execute kv-manager-metadata --remote --file=worker/schema.sql\n```\n\nFor existing installations (upgrading), run the migration:\n```bash\nwrangler d1 execute kv-manager-metadata --remote --file=worker/migrations/apply_all_migrations.sql\n```\n\nSee [MIGRATION_GUIDE.md](./MIGRATION_GUIDE.md) for detailed migration instructions.\n\n4. **Set secrets**:\n\nSet your Cloudflare Account ID:\n\n```bash\nwrangler secret put ACCOUNT_ID\n```\n\nSet your API Key:\n\n```bash\nwrangler secret put API_KEY\n```\n\nSet your Team Domain:\n\n```bash\nwrangler secret put TEAM_DOMAIN\n```\n\nSet your Policy AUD tag:\n\n```bash\nwrangler secret put POLICY_AUD\n```\n\n5. 
**Build and deploy**:\n\nBuild the application:\n\n```bash\nnpm run build\n```\n\nDeploy to Cloudflare:\n\n```bash\nwrangler deploy\n```\n\n### Production Notes\n\n- All API requests require valid Cloudflare Access JWT\n- Audit logging captures all destructive operations\n- D1 stores metadata, tags, and audit logs\n- Durable Objects handle bulk operations exceeding 10,000 keys\n\n## API Endpoints\n\n### Namespaces\n- `GET /api/namespaces` - List all namespaces\n- `POST /api/namespaces` - Create a new namespace\n- `DELETE /api/namespaces/:id` - Delete a namespace\n- `PATCH /api/namespaces/:id/rename` - Rename a namespace\n- `GET /api/namespaces/:id/info` - Get namespace information and statistics\n\n### Keys\n- `GET /api/keys/:namespaceId/list` - List keys with cursor-based pagination\n- `GET /api/keys/:namespaceId/:keyName` - Get a key's value and metadata\n- `PUT /api/keys/:namespaceId/:keyName` - Create or update a key\n- `DELETE /api/keys/:namespaceId/:keyName` - Delete a key\n- `POST /api/keys/:namespaceId/bulk-delete` - Delete multiple keys\n- `POST /api/keys/:namespaceId/bulk-copy` - Copy keys to another namespace\n- `POST /api/keys/:namespaceId/bulk-ttl` - Update TTL on multiple keys\n\n### Metadata \u0026 Tags\n- `GET /api/metadata/:namespaceId/:keyName` - Get D1-backed metadata and tags\n- `PUT /api/metadata/:namespaceId/:keyName` - Update metadata and tags\n- `POST /api/metadata/:namespaceId/bulk-tag` - Apply tags to multiple keys (add/remove/replace)\n\n### Search\n- `GET /api/search` - Search keys across namespaces by key name, tags, or custom metadata\n  - Query params: `query` (key name pattern), `namespaceId` (namespace filter), `tags` (comma-separated)\n  - **Note**: Only searches keys with metadata in D1; does not search KV values\n\n### Backup \u0026 Restore\n- `POST /api/backup/:namespaceId/:keyName/undo` - Restore key to previous version\n- `GET /api/backup/:namespaceId/:keyName/check` - Check if backup exists\n\n### Import/Export\n- `GET 
/api/export/:namespaceId` - Start async export of namespace keys and values\n  - Query params: `format` (json|ndjson)\n  - Returns: `job_id`, `status`, `ws_url` (ws_url provided for API compatibility, polling recommended)\n- `POST /api/import/:namespaceId` - Start async import of keys into namespace\n  - Query params: `collision` (skip|overwrite|fail)\n  - Returns: `job_id`, `status`, `ws_url` (ws_url provided for API compatibility, polling recommended)\n- `GET /api/jobs/:jobId` - Get status of bulk job (polling endpoint - recommended)\n- `GET /api/jobs/:jobId/download` - Download completed export file\n\n### Job History\n- `GET /api/jobs` - Get paginated list of user's jobs\n  - Query params: \n    - `limit`, `offset` - Pagination\n    - `status` - Filter by job status (completed, failed, cancelled, running, queued)\n    - `operation_type` - Filter by operation (export, import, bulk_copy, bulk_delete, bulk_ttl_update, bulk_tag)\n    - `namespace_id` - Filter by specific namespace\n    - `start_date`, `end_date` - Filter by date range (ISO timestamps)\n    - `job_id` - Search by job ID (partial match with LIKE)\n    - `min_errors` - Filter jobs with error_count \u003e= threshold\n    - `sort_by` - Column to sort by (started_at, completed_at, total_keys, error_count, percentage)\n    - `sort_order` - Sort direction (asc or desc, default: desc)\n  - Returns: Job list with metadata, progress, and timestamps\n- `GET /api/jobs/:jobId/events` - Get lifecycle event history for a job\n  - Returns: Chronological list of events (started, progress_25, progress_50, progress_75, completed/failed/cancelled)\n  - Use case: Job history UI, debugging, event replay\n\n### Audit Logs\n- `GET /api/audit/:namespaceId` - Get audit log for a namespace\n  - Query params: `limit`, `offset`, `operation`\n- `GET /api/audit/user/:userEmail` - Get audit log for a specific user\n  - Query params: `limit`, `offset`, `operation`\n\n### Admin Utilities\n- `POST /api/admin/sync-keys/:namespaceId` - 
Sync all keys in a namespace to search index\n  - Creates metadata entries for keys that don't have them\n  - Useful for indexing keys created outside the UI (via API, CLI, etc.)\n  - Returns: Total keys found and number successfully synced\n\n## Database Schema\n\nThe D1 database (`kv-manager-metadata`) stores:\n\n### Tables\n\n#### `key_metadata`\n- Stores tags and custom metadata for keys\n- JSON fields for flexible schema\n- Indexed by `namespace_id` and `key_name`\n\n#### `audit_log`\n- Tracks all operations (create, update, delete, bulk operations)\n- User attribution via email\n- Timestamp, operation type, and details\n- Indexed for efficient querying\n\n#### `job_audit_events`\n- Tracks lifecycle events for all bulk jobs (started, progress_25, progress_50, progress_75, completed, failed, cancelled)\n- Stores detailed JSON metadata for each event (processed counts, error counts, percentages)\n- Foreign key relationship to `bulk_jobs` table\n- Indexed by `job_id` and `user_email` for efficient querying\n- Foundation for job history and event replay functionality\n\n#### `bulk_jobs`\n- Tracks import/export and bulk operation progress\n- Status tracking (queued, running, completed, failed)\n- Progress counters (total, processed, errors)\n- Job metadata and timestamps\n\n#### `namespace_metadata`\n- First and last accessed timestamps\n- Namespace-level statistics\n\nSee `worker/schema.sql` for the complete schema definition.\n\n## User Interface\n\n### Navigation\n- **Namespaces View**: Browse and manage KV namespaces\n- **Search View**: Cross-namespace key search with filters\n- **Job History View**: View all bulk operations with event timelines\n- **Audit Log View**: Operation history and tracking\n\n### Theme Support\n- **System** (default): Follows OS preference\n- **Light**: Light mode\n- **Dark**: Dark mode\n\nTheme preference is stored in localStorage and persists across sessions.\n\n## Security\n\n- Cloudflare Access JWT validation on all API requests\n- 
Auth bypassed for localhost development\n- All KV operations require valid auth token\n- Audit logging of all destructive operations\n- Protected namespaces hidden from UI\n\n## Usage Guide\n\n### Managing Namespaces\n1. View all namespaces on the main page\n2. Click **Create Namespace** to add a new one\n3. Use **Export** to download namespace data (JSON/NDJSON)\n4. Use **Import** to upload keys from a file\n5. Click **Browse Keys** to view namespace contents\n6. Use the three-dot menu for rename/delete operations\n\n### Working with Keys\n1. Browse keys in a namespace with pagination\n2. Click **Add Key** to create a new key-value pair\n3. Select multiple keys using checkboxes for bulk operations\n4. Edit individual keys by clicking on them\n5. View and modify TTL (expiration) settings - **minimum 60 seconds**\n6. Add **KV Native Metadata** (1024 byte limit) and **D1 Custom Metadata** (unlimited) in the \"Metadata \u0026 Tags\" tab\n7. Apply tags for organization and searchability\n\n### Bulk Operations\n1. Select multiple keys using checkboxes\n2. Choose from available bulk actions:\n   - **Copy to Namespace**: Duplicate keys to another namespace\n   - **Update TTL**: Set expiration time on selected keys\n   - **Apply Tags**: Add, remove, or replace tags\n   - **Delete Selected**: Remove multiple keys at once\n3. Monitor progress with job status tracking via polling\n\n### Searching\n1. Click **Search** in the navigation bar\n2. Enter a key name pattern (supports partial matches)\n3. Filter by specific namespace (optional)\n4. Filter by tags (comma-separated, optional)\n5. Click any result to navigate to that key\n\n**Important Notes**:\n- **Key Name Search**: Searches the key NAME (not namespace names or values). 
Example: searching \"meta\" finds keys like \"meta:test\", \"metadata:config\", etc.\n- **Tag Search**: You can search by tags alone (leave key name empty) or combine with key name search\n- **Automatic Indexing**: All keys created or updated through the UI are automatically indexed for search\n- **External Keys**: Keys created outside the UI (via API, CLI, etc.) won't appear in search until they're viewed/edited in the UI or have metadata added\n\n### Job History\n1. Click **Job History** in the navigation bar\n2. View all your bulk operations (import, export, bulk delete, etc.)\n3. Use advanced filters to find specific jobs:\n   - **Status Filter**: Filter by completed, failed, cancelled, running, or queued\n   - **Operation Type**: Filter by export, import, bulk copy, bulk delete, bulk TTL, or bulk tag\n   - **Namespace Filter**: Filter jobs by specific namespace\n   - **Date Range**: Select preset ranges (Last 24h, Last 7 days, Last 30 days) or custom date range\n   - **Job ID Search**: Search for jobs by their ID (partial matches supported)\n   - **Min Errors**: Filter jobs with a minimum error count threshold\n4. Sort results by:\n   - Started At (default)\n   - Completed At\n   - Total Keys\n   - Error Count\n   - Progress Percentage\n5. Toggle sort order between ascending and descending\n6. Click **Clear All Filters** to reset all filters to defaults\n7. Click any job card to view detailed event timeline\n8. See milestone events: started → 25% → 50% → 75% → completed\n9. After any bulk operation completes, click **View History** in the progress dialog\n\n### Audit Logs\n1. Click **Audit Log** in the navigation bar\n2. Select a namespace to view its operation history\n3. Filter by operation type (create, update, delete, etc.)\n4. Use pagination to browse historical entries\n5. 
Export logs to CSV for external analysis\n\n### Import/Export with Metadata\n\nWhen importing keys via JSON or NDJSON, you can include multiple types of metadata:\n\n```json\n[\n  {\n    \"name\": \"example-key\",\n    \"value\": \"example-value\",\n    \"ttl\": 600,\n    \"metadata\": {\n      \"type\": \"example\",\n      \"source\": \"import\"\n    },\n    \"custom_metadata\": {\n      \"extended_info\": \"large data here\",\n      \"description\": \"detailed information\"\n    },\n    \"tags\": [\"production\", \"important\"]\n  }\n]\n```\n\n**Field Descriptions:**\n- `name` (required): Key name\n- `value` (required): Key value\n- `ttl` (optional): Time-to-live in seconds (minimum 60). Alternative: `expiration_ttl`\n- `expiration` (optional): Unix timestamp for absolute expiration\n- `metadata` (optional): **KV Native Metadata** - Stored in Cloudflare KV (1024 byte limit)\n- `custom_metadata` (optional): **D1 Custom Metadata** - Stored in D1 database (no size limit, searchable)\n- `tags` (optional): Array of tags stored in D1 for organization and search\n\n**Important Notes:**\n- `metadata` field → Stored in Cloudflare KV as native metadata (fast access, size limited)\n- `custom_metadata` field → Stored in D1 database (searchable, no size limit)\n- Both metadata types can be used simultaneously\n- Export operations include both KV native metadata and D1 tags/custom metadata\n\n### Syncing Existing Keys for Search\nIf you have keys created outside the UI (via API, CLI, etc.) that don't appear in search:\n\n1. Use the sync endpoint to index them:\n   ```bash\n   curl -X POST https://your-domain.com/api/admin/sync-keys/{namespaceId}\n   ```\n2. All keys in the namespace will be indexed for search\n3. Keys with existing metadata won't be affected\n4. 
Future keys created through the UI are automatically indexed\n\n## Troubleshooting\n\n### Worker not starting\n\nEnsure `wrangler` is installed:\n\n```bash\nnpm install -g wrangler\n```\n\nCheck Node.js version (18+ required):\n\n```bash\nnode --version\n```\n\nTry clearing Wrangler cache:\n\n```bash\nrm -rf ~/.wrangler\n```\n\n### Frontend not connecting to worker\n- Verify `VITE_WORKER_API` in `.env` points to `http://localhost:8787`\n- Check CORS configuration in `worker/utils/cors.ts`\n- Ensure both dev servers are running\n\n### D1 database errors\n\nReinitialize the schema:\n\n```bash\nnpx wrangler d1 execute kv-manager-metadata-dev --local --file=worker/schema.sql\n```\n\nCheck the D1 binding in `wrangler.dev.toml`.\n\nVerify the database exists:\n\n```bash\nnpx wrangler d1 list\n```\n\n### Mock data not appearing\n- Mock data is only returned when `ACCOUNT_ID` and `API_KEY` are not set\n- Check console logs for `[Auth] Localhost detected, skipping JWT validation`\n- Ensure the worker is running with the `--local` flag\n\n### Import/Export issues\n- Verify the file format is valid JSON or NDJSON\n- Check file size (large imports may take time)\n- Monitor job status using the returned `job_id`\n- Check the browser console for detailed error messages\n\n### Search not returning results\n- Ensure metadata exists in the D1 database\n- Check that keys have been tagged (if filtering by tags)\n- Verify the D1 database is properly initialized\n- Try searching without filters first\n\n### Progress tracking issues\n- All progress tracking uses HTTP polling (no WebSocket connections)\n- Jobs poll for status every second until completion\n- Check the browser console for API errors if progress isn't updating\n- Verify the D1 database has the required tables (see MIGRATION_GUIDE.md)\n- For development, ensure the worker is running on the expected port (default: 8787)\n\n---\n\n## Future Enhancements\n1. R2 backup integration\n2. 
Batch operations to R2\n\n---\n\n## License\n\nMIT\n\n## Contributing\n\nContributions welcome! Please open an issue or PR.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fneverinfamous%2Fkv-manager","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fneverinfamous%2Fkv-manager","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fneverinfamous%2Fkv-manager/lists"}