{"id":27948618,"url":"https://github.com/designcomputer/ollama-model-lab","last_synced_at":"2025-10-13T10:03:42.797Z","repository":{"id":274103287,"uuid":"921911622","full_name":"designcomputer/ollama-model-lab","owner":"designcomputer","description":"A web-based interface for testing and comparing different Ollama models","archived":false,"fork":false,"pushed_at":"2025-01-25T15:02:03.000Z","size":235,"stargazers_count":5,"open_issues_count":0,"forks_count":1,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-10-13T10:02:31.690Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/designcomputer.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-01-24T21:17:51.000Z","updated_at":"2025-08-10T19:08:13.000Z","dependencies_parsed_at":null,"dependency_job_id":"3bd7b61f-13bc-4c4f-aaec-ed2a14aecfdf","html_url":"https://github.com/designcomputer/ollama-model-lab","commit_stats":null,"previous_names":["designcomputer/ollama-model-lab"],"tags_count":1,"template":false,"template_full_name":null,"purl":"pkg:github/designcomputer/ollama-model-lab","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/designcomputer%2Follama-model-lab","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/designcomputer%2Follama-model-lab/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/designcomputer%2Follama-model-lab/releases","manifests_url":"https://repos.ecosyste.ms/ap
i/v1/hosts/GitHub/repositories/designcomputer%2Follama-model-lab/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/designcomputer","download_url":"https://codeload.github.com/designcomputer/ollama-model-lab/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/designcomputer%2Follama-model-lab/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279014662,"owners_count":26085554,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-13T02:00:06.723Z","response_time":61,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-05-07T14:59:46.440Z","updated_at":"2025-10-13T10:03:42.787Z","avatar_url":"https://github.com/designcomputer.png","language":"JavaScript","readme":"# Ollama Model Lab\r\n\r\nA web-based interface for testing and comparing different Ollama models with customizable parameters and prompts.\r\n\r\n![Ollama Model Lab Interface](screenshot.png)\r\n\r\n## Overview\r\n\r\nOllama Model Lab provides an intuitive playground for exploring and comparing different Ollama models. 
Unlike typical chat interfaces or benchmark tools, this lab environment allows you to:

- Test multiple models simultaneously with the same prompt
- Compare detailed performance metrics and response characteristics
- Customize model parameters and observe their impact
- Generate comprehensive comparison reports
- Save and manage frequently used prompts

Example comparison report output:

```markdown
### phi4:14b vs hermes3:8b vs qwen2.5:7b
Performance comparison:
- phi4:14b: 21.05s total (7.13s load)
- hermes3:8b: 6.52s total (4.45s load)
- qwen2.5:7b: 9.31s total (4.19s load)

Each model's response includes:
✓ Token counts and timing
✓ Parameter settings used
✓ Model architecture details
✓ Full response text
```

See [example-report.md](example-report.md) for a complete sample output.

## Prerequisites

- [Ollama](https://ollama.ai/) installed and running
- At least one Ollama model pulled
- Python 3.x (if using the included server script)
- A modern web browser

## Quick Start

1. Get the files:
   - Download the latest release ZIP file from the [Releases](https://github.com/designcomputer/ollama-model-lab/releases) page
   - Extract the ZIP file to your desired location

   *Alternative for contributors: clone the repository*
   ```bash
   git clone https://github.com/designcomputer/ollama-model-lab.git
   cd ollama-model-lab
   ```

2. Start a local web server:
   - Windows users can double-click `start.bat`
   - Or use any method to serve the files locally:
     ```bash
     # Python 3.x (default port 80)
     python -m http.server 80

     # Use a different port if 80 is in use
     python -m http.server 8080

     # Python 2.x
     python -m SimpleHTTPServer 80

     # Or use any other local server of your choice
     ```

3. Ensure Ollama is running (default: http://127.0.0.1:11434)

4. Open your browser and navigate to:
   ```
   http://localhost           # if using port 80
   http://localhost:8080      # if using port 8080 (or your chosen port)
   ```
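With both servers running, you can confirm the lab will be able to see your models: Ollama exposes the installed model list at the `/api/tags` endpoint. A minimal sketch of reading that list from a browser client follows; the `summarizeModels` helper and the canned `sample` payload are illustrative, not part of the project:

```javascript
// Hypothetical helper (not part of Ollama Model Lab): condense the /api/tags
// response, which has the shape { models: [{ name, size, ... }] } with size in bytes.
function summarizeModels(tagsResponse) {
  return tagsResponse.models.map((m) => ({
    name: m.name,
    sizeGB: (m.size / 1e9).toFixed(1),
  }));
}

// A live call would look like:
//   fetch("http://127.0.0.1:11434/api/tags").then((r) => r.json()).then(summarizeModels)
// Here we use a canned response so the snippet runs offline.
const sample = {
  models: [
    { name: "qwen2.5:7b", size: 4700000000 },
    { name: "phi4:14b", size: 9100000000 },
  ],
};
console.log(summarizeModels(sample));
// → [ { name: 'qwen2.5:7b', sizeGB: '4.7' }, { name: 'phi4:14b', sizeGB: '9.1' } ]
```

If this call fails from the browser, check the Ollama server URL in the app's settings modal.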
## Features

### Model Management
- View all available Ollama models
- Select multiple models for testing
- Sort models by name or size
- Clear the test selection with one click

### Parameter Configuration
- Override default model parameters:
  - Temperature
  - Context Window
  - Max Tokens
  - Top K
  - Top P
  - Number of GPUs
  - Memory Mapping

### Prompt Management
- Save frequently used prompts
- Import/export prompt collections
- Quick selection of saved prompts
- Example prompts included

### Response Analysis
- Side-by-side response comparison
- Detailed performance statistics:
  - Total processing time
  - Load time
  - Token counts
  - Processing durations
- Model-specific information display
- Generate detailed comparison reports

## Usage Tips

1. **Selecting Models**:
   - Use the left panel to choose available models
   - Click the right arrow to add them to your test set
   - Click the left arrow to remove models from testing

2. **Configuring Parameters**:
   - Select a model in the right panel
   - Click the gear icon to open parameter settings
   - Enable only the parameters you want to override

3. **Managing Prompts**:
   - Save useful prompts with descriptive names
   - Use the dropdown to quickly load saved prompts
   - Import/export prompts via Settings

4. **Generating Reports**:
   - Run your test across all selected models
   - Click "Download Report" to save a detailed markdown report
   - Reports include all responses and statistics
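The parameter overrides and report statistics above map onto Ollama's HTTP API: `/api/generate` accepts an `options` object (with keys such as `temperature`, `num_ctx`, `num_predict`, `top_k`, and `top_p`), and its response carries nanosecond timing fields (`total_duration`, `load_duration`, `eval_count`, `eval_duration`). A minimal sketch under those assumptions; the function names are hypothetical, not the app's own code:

```javascript
// Hypothetical helper: send only the overrides the user enabled, so every other
// parameter keeps the model's default. Keys follow Ollama's "options" naming.
function buildGenerateRequest(model, prompt, overrides = {}) {
  const body = { model, prompt, stream: false };
  const options = {};
  if (overrides.temperature !== undefined) options.temperature = overrides.temperature;
  if (overrides.contextWindow !== undefined) options.num_ctx = overrides.contextWindow;
  if (overrides.maxTokens !== undefined) options.num_predict = overrides.maxTokens;
  if (overrides.topK !== undefined) options.top_k = overrides.topK;
  if (overrides.topP !== undefined) options.top_p = overrides.topP;
  if (Object.keys(options).length > 0) body.options = options;
  return body;
}

// Hypothetical helper: convert Ollama's nanosecond durations into the
// seconds and tokens/s figures shown in comparison reports.
function responseStats(res) {
  return {
    totalSeconds: +(res.total_duration / 1e9).toFixed(2),
    loadSeconds: +(res.load_duration / 1e9).toFixed(2),
    outputTokens: res.eval_count,
    tokensPerSecond: +(res.eval_count / (res.eval_duration / 1e9)).toFixed(1),
  };
}

const req = buildGenerateRequest("qwen2.5:7b", "Explain RAID levels.", {
  temperature: 0.2,
  maxTokens: 256,
});
console.log(req.options); // { temperature: 0.2, num_predict: 256 }

const stats = responseStats({
  total_duration: 9310000000, // 9.31 s total
  load_duration: 4190000000, // 4.19 s load
  eval_count: 100,
  eval_duration: 2000000000, // 100 tokens in 2 s → 50 tokens/s
});
console.log(stats);
```

A POST of `buildGenerateRequest(...)` to `http://127.0.0.1:11434/api/generate` would return a response suitable for `responseStats`.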
## Settings

Access the settings modal to:
- Configure the Ollama server URL
- Import/export saved prompts
- View version information

## Troubleshooting

### Port Configuration
If port 80 is already in use (common with web servers or other services):
1. Modify `start.bat` to use a different port number
2. Or start the server manually on a different port: `python -m http.server 8080`
3. Remember to access the application using the same port in your browser URL

### Common Issues
- Ensure Ollama is running before starting the application
- Check the Ollama server URL in Settings if models aren't loading
- Clear your browser cache if you experience UI issues
- Make sure your chosen port isn't blocked by firewall settings

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

- Built for use with [Ollama](https://ollama.ai/)
- Uses browser IndexedDB for prompt storage
- Inspired by the need for easy model comparison

## Support

If you encounter any issues or have questions:
1. Check the [Issues](https://github.com/designcomputer/ollama-model-lab/issues) page
2. Submit a new issue with detailed information about your problem
3. Include your browser and Ollama versions when reporting issues