https://github.com/vlm-run/vlmrun-cookbook
Examples and guides for using the VLM Run API
- Host: GitHub
- URL: https://github.com/vlm-run/vlmrun-cookbook
- Owner: vlm-run
- Created: 2024-03-27T17:21:49.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-02-26T23:36:19.000Z (7 months ago)
- Last Synced: 2025-02-27T00:26:43.522Z (7 months ago)
- Topics: cookbook, etl, llm, ocr, vlm
- Language: Jupyter Notebook
- Homepage: https://docs.vlm.run/introduction
- Size: 48.8 MB
- Stars: 79
- Watchers: 3
- Forks: 0
- Open Issues: 6
Metadata Files:
- Readme: README.md
## README
Welcome to **[VLM Run](https://vlm.run) Cookbook**, a comprehensive collection of examples and notebooks demonstrating the power of structured visual understanding using the [VLM Run Platform](https://app.vlm.run). This repository hosts practical examples and tutorials for extracting structured data from images, videos, and documents using Vision Language Models (VLMs).
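Across the notebooks, the common pattern is to define a schema for the fields you want, submit an image, video, or document to the API, and get back JSON that conforms to that schema. Below is a minimal, standard-library sketch of the client-side half of that pattern; the schema, field names, and mock response are illustrative assumptions for the drivers-license case study, not the documented VLM Run API:

```python
import json
from dataclasses import dataclass, fields

@dataclass
class DriversLicense:
    # Hypothetical fields for the drivers-license case study; the real
    # VLM Run schema may differ.
    first_name: str
    last_name: str
    license_number: str
    expiration_date: str

def parse_response(raw: str) -> DriversLicense:
    """Parse an API JSON response, keeping only the schema's declared fields."""
    data = json.loads(raw)
    known = {f.name for f in fields(DriversLicense)}
    return DriversLicense(**{k: v for k, v in data.items() if k in known})

# A mock response standing in for the JSON a structured-extraction call
# would return; unknown keys are dropped during parsing.
mock = ('{"first_name": "Jane", "last_name": "Doe", '
        '"license_number": "D1234567", "expiration_date": "2027-01-31", '
        '"extra": "ignored"}')
record = parse_response(mock)
print(record.license_number)
```

Validating the response against a typed schema like this is what makes the outputs "structured": downstream code can rely on field names and types rather than free-form model text.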
### 💡 Why Use This Cookbook?
---
- 📚 **Practical Examples**: A comprehensive collection of Colab notebooks demonstrating real-world applications of VLM Run.
- 🔋 **Ready-to-Use**: Each example comes with complete code and documentation, making it easy to adapt for your use case.
- 🎯 **Domain-Specific**: Examples cover various domains, from financial documents to TV news analysis.

### 📖 Cookbook Notebooks
---
Our collection of Colab notebooks demonstrates various use cases and integrations:
| Name | Type | Colab | Last Updated |
|:---|:---|:---:|:---:|
| [API Quickstart](./notebooks/00_quickstart.ipynb) | | [Open in Colab](https://colab.research.google.com/github/vlm-run/vlmrun-cookbook/blob/main/notebooks/00_quickstart.ipynb) | 02-08-2025 |
| [Schema Showcase](./notebooks/01_schema_showcase.ipynb) | feature | [Open in Colab](https://colab.research.google.com/github/vlm-run/vlmrun-cookbook/blob/main/notebooks/01_schema_showcase.ipynb) | 02-08-2025 |
| [Visual Grounding](./notebooks/04_visual_grounding.ipynb) | feature | [Open in Colab](https://colab.research.google.com/github/vlm-run/vlmrun-cookbook/blob/main/notebooks/04_visual_grounding.ipynb) | 02-18-2025 |
| [Video Inference (Fine-Tuning)](./notebooks/advanced_finetuning_video_inference.ipynb) | feature | [Open in Colab](https://colab.research.google.com/github/vlm-run/vlmrun-cookbook/blob/main/notebooks/advanced_finetuning_video_inference.ipynb) | 02-18-2025 |
| [US Drivers License](./notebooks/02_case_study_drivers_license.ipynb) | application | [Open in Colab](https://colab.research.google.com/github/vlm-run/vlmrun-cookbook/blob/main/notebooks/02_case_study_drivers_license.ipynb) | 02-08-2025 |
| [Parsing Financial Presentations](https://colab.research.google.com/drive/15_iRDucKj2I33p3m5X3ULdXby_DHWgjS) | application | [Open in Colab](https://colab.research.google.com/drive/15_iRDucKj2I33p3m5X3ULdXby_DHWgjS) | 02-04-2025 |
| [TV News Analysis](./notebooks/03_case_study_tv_news.ipynb) | application | [Open in Colab](https://colab.research.google.com/github/vlm-run/vlmrun-cookbook/blob/main/notebooks/03_case_study_tv_news.ipynb) | 02-15-2025 |
| [Fashion Product Catalog](./notebooks/05_case_study_image_catalogue.ipynb) | application | [Open in Colab](https://colab.research.google.com/github/vlm-run/vlmrun-cookbook/blob/main/notebooks/05_case_study_image_catalogue.ipynb) | 02-20-2025 |
| [Fashion Images Hybrid Search](./notebooks/06_fashion_images_hybrid_search.ipynb) | application | [Open in Colab](https://colab.research.google.com/github/vlm-run/vlmrun-cookbook/blob/main/notebooks/06_fashion_images_hybrid_search.ipynb) | 02-21-2025 |

### 🔗 Quick Links
---
* 💬 Send us an email at [support@vlm.run](mailto:support@vlm.run) or join our [Discord](https://discord.gg/4jgyECY4rq) for help
* 📣 Follow us on [Twitter](https://twitter.com/vlmrun) and [LinkedIn](https://www.linkedin.com/company/vlm-run) to keep up to date on our products
* 📚 Check out our [Documentation](https://docs.vlm.run/) for detailed guides and API reference