{"id":26187481,"url":"https://github.com/tinybiggames/phinx","last_synced_at":"2025-12-25T02:10:57.660Z","repository":{"id":281101615,"uuid":"942906095","full_name":"tinyBigGAMES/Phinx","owner":"tinyBigGAMES","description":"A High-Performance AI Inference Library for ONNX and Phi-4","archived":false,"fork":false,"pushed_at":"2025-03-07T01:48:23.000Z","size":1373,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-03-07T02:20:29.440Z","etag":null,"topics":["delphi","genai","local-ai-development","onnxruntime","phi4-multimodal","win64"],"latest_commit_sha":null,"homepage":"","language":"Pascal","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"bsd-3-clause","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/tinyBigGAMES.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null},"funding":{"github":"tinyBigGAMES","patreon":null,"open_collective":null,"ko_fi":null,"tidelift":null,"community_bridge":null,"liberapay":null,"issuehunt":null,"lfx_crowdfunding":null,"polar":null,"buy_me_a_coffee":null,"thanks_dev":null,"custom":null}},"created_at":"2025-03-04T21:38:46.000Z","updated_at":"2025-03-07T01:48:27.000Z","dependencies_parsed_at":"2025-03-07T02:20:31.726Z","dependency_job_id":"8d809d42-8441-4e7a-9398-ad4ed852ffdb","html_url":"https://github.com/tinyBigGAMES/Phinx","commit_stats":null,"previous_names":["tinybiggames/phinx"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tinyBigGAMES%2FPhinx","tags_url":"https://repos.ecosyste.ms/api/v1/hos
ts/GitHub/repositories/tinyBigGAMES%2FPhinx/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tinyBigGAMES%2FPhinx/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/tinyBigGAMES%2FPhinx/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/tinyBigGAMES","download_url":"https://codeload.github.com/tinyBigGAMES/Phinx/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":243130997,"owners_count":20241177,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["delphi","genai","local-ai-development","onnxruntime","phi4-multimodal","win64"],"created_at":"2025-03-11T23:50:02.867Z","updated_at":"2025-12-25T02:10:52.637Z","avatar_url":"https://github.com/tinyBigGAMES.png","language":"Pascal","readme":"# ![Phinx](media/phinx.png)  \n[![Chat on Discord](https://img.shields.io/discord/754884471324672040?style=for-the-badge)](https://discord.gg/tPWjMwK) [![Follow on Bluesky](https://img.shields.io/badge/Bluesky-tinyBigGAMES-blue?style=for-the-badge\u0026logo=bluesky)](https://bsky.app/profile/tinybiggames.com) [![Reddit](https://img.shields.io/badge/Reddit-Phinx-red?style=for-the-badge\u0026logo=reddit)](https://www.reddit.com/r/Phinx/) [![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Phinx-yellow?style=for-the-badge\u0026logo=huggingface)](https://huggingface.co/tinybiggames/Phinx)\n\n\n### A High-Performance AI Inference Library for ONNX and Phi-4\n\n**Phinx** is an advanced AI inference library that leverages **ONNX Runtime GenAI** 
and the **Phi-4 Multimodal ONNX** model for fast, efficient, and scalable AI applications. Designed for developers seeking seamless integration of generative and multimodal AI, Phinx offers an optimized and flexible runtime environment with robust performance.\n\n## 🚀 Key Features\n\n- **ONNX-Powered Inference** – Efficient execution of Phi-4 models using ONNX Runtime GenAI.\n- **Multimodal AI** – Supports text, image, and multi-input inference for diverse AI tasks.\n- **Optimized Performance** – Accelerated inference leveraging ONNX optimizations for speed and efficiency.\n- **Developer-Friendly API** – Simple yet powerful APIs for easy integration into **Delphi, Python, and other platforms**.\n- **Self-Contained \u0026 Virtualized** – The `Phinx.model` file acts as a **virtual folder**, bundling **Phi-4 ONNX model files** and all dependencies into a single, portable format.\n\nPhinx is ideal for AI research, creative applications, and production-ready generative AI solutions. Whether you're building **chatbots, AI-powered content generation tools, or multimodal assistants**, Phinx delivers the **speed and flexibility** you need!\n\n\n## 📂 Phinx Model File Format (`Phinx.model`)\n\nThe **Phinx.model** format is a specialized file structure for storing **ONNX-based machine learning models**, optimized for **CUDA-powered inference**. It encapsulates all essential components, ensuring seamless model execution.\n\n### 🔹 Key Benefits\n\n1. **Self-Contained \u0026 Virtualized**\n   - Acts as a **virtual folder** within the application.\n   - Bundles **Phi-4 ONNX model files** and dependencies for portability.\n\n2. **Optimized for CUDA Inference**\n   - Designed for **GPU acceleration**, delivering high-performance AI execution.\n   - Ensures fast loading and efficient CUDA computations.\n\n3. 
**Structured \u0026 Extensible**\n   - Stores **model weights, metadata, configuration parameters, and dependencies** in a well-organized manner.\n   - Future-proof design allows for additional configurations and optimizations.\n\n4. **Simplified Deployment**\n   - All required files are consolidated into a single **`.model`** file.\n   - Eliminates external dependency management for **plug-and-play usability**.\n\n## 🛠 Getting Started\n\n### 🔧 System Requirements\n\n- **GPU Requirements:** CUDA-compatible NVIDIA GPU with **8–12GB VRAM**.\n- **Storage Requirements:** At least **7GB** of free disk space.\n\n### 📥 Download Model\n\nGet the **Phinx Model** from Hugging Face:\n[📂 Download Phinx Model](https://huggingface.co/tinybiggames/Phinx/resolve/main/Phinx.model?download=true)\n\n### 🏗 Setup Instructions\n\n1. Place the downloaded model in your preferred directory.\n   - Example path: `C:/LLM/PHINX/repo`\n2. Ensure you have a **Delphi version that supports Win64 and Unicode**.\n3. Developed with: **Delphi 12.2**\n4. Tested on: **Windows 11 (24H2)**\n5. Refer to `UTestbed.pas` for usage notes and check the examples.\n\n## 🚧 Project Status\n\n\u003e **⚠️ Note:** This repository is currently in the setup phase. While documentation is being prepared, the **code is fully functional and stable**. Stay tuned—this README and additional resources will be updated soon! 🚀\n\n## 📺 Media\n\n🌊 Deep Dive Podcast  \nDiscover in-depth discussions and insights about Phinx and its innovative features. 🚀✨\n\n🎥 Phinx Feature Videos  \nExplore videos showcasing the powerful capabilities of the Phinx library, including tutorials, demonstrations, and real-world applications. 
🎬🔥\n\n\nhttps://github.com/user-attachments/assets/d58bc2d1-bef5-458d-9377-6b1235c51972\n\n\nhttps://github.com/user-attachments/assets/c166106a-4266-4b0f-9d95-42d1b2cc0921\n\n\n\nhttps://github.com/user-attachments/assets/34bbac1d-e4f6-47d2-8cfc-78f702153144\n\n\n\n## 💬 Support and Resources\n\n- 🐞 **Report Issues:** [Issue Tracker](https://github.com/tinyBigGAMES/Phinx/issues)\n- 💬 **Join the Community:** [Forum](https://github.com/tinyBigGAMES/Phinx/discussions) | [Discord](https://discord.gg/tPWjMwK)\n- 📚 **Learn Delphi:** [Learn Delphi](https://learndelphi.org)\n\n## 🤝 Contributing\n\nContributions to **✨ Phinx** are highly encouraged! 🌟  \nWays to contribute:\n- 🐛 **Report Bugs:** Help us improve by submitting issues.\n- 💡 **Suggest Features:** Share ideas to enhance Phinx.\n- 🔧 **Create Pull Requests:** Improve the library’s capabilities.\n\n### 🏆 Contributors\n\n\u003ca href=\"https://github.com/tinyBigGAMES/Phinx/graphs/contributors\"\u003e\n  \u003cimg src=\"https://contrib.rocks/image?repo=tinyBigGAMES/Phinx\u0026max=250\u0026columns=10\u0026anon=1\" /\u003e\n\u003c/a\u003e\n\n## 📜 License\n\nPhinx is distributed under the **BSD-3-Clause License**, allowing redistribution and use in both source and binary forms, with or without modification.  \nSee the [📜 LICENSE](https://github.com/tinyBigGAMES/Phinx?tab=BSD-3-Clause-1-ov-file#BSD-3-Clause-1-ov-file) for more details.\n\n## 💖 Support \u0026 Sponsorship\n\nIf you find **Phinx** useful, please consider [sponsoring this project](https://github.com/sponsors/tinyBigGAMES). Your support helps sustain development, improve features, and keep the project thriving.\n\n### Other ways to contribute:\n- ⭐ **Star the repo** – It helps increase visibility.\n- 📢 **Spread the word** – Share Phinx with your network.\n- 🐛 **Report bugs** – Help identify issues.\n- 🔧 **Submit fixes** – Found a bug? 
Fix it and contribute!\n- 💡 **Suggest enhancements** – Share ideas for improvements.\n\nEvery contribution, big or small, helps make **Phinx** better. Thank you for your support! 🚀\n\n---\n\n ⚡ Phinx – Powering AI with Phi-4, ONNX \u0026 CUDA, Seamlessly and Efficiently! ⚡\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"media/delphi.png\" alt=\"Delphi\"\u003e\n\u003c/p\u003e\n\u003ch5 align=\"center\"\u003eMade with ❤️ in Delphi\u003c/h5\u003e\n\n","funding_links":["https://github.com/sponsors/tinyBigGAMES"],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ftinybiggames%2Fphinx","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ftinybiggames%2Fphinx","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ftinybiggames%2Fphinx/lists"}