https://github.com/tinybiggames/phinx
A High-Performance AI Inference Library for ONNX and Phi-4
- Host: GitHub
- URL: https://github.com/tinybiggames/phinx
- Owner: tinyBigGAMES
- License: bsd-3-clause
- Created: 2025-03-04T21:38:46.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2025-03-07T01:48:23.000Z (10 months ago)
- Last Synced: 2025-03-07T02:20:29.440Z (10 months ago)
- Topics: delphi, genai, local-ai-development, onnxruntime, phi4-multimodal, win64
- Language: Pascal
- Homepage:
- Size: 1.31 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE
# Phinx
[Discord](https://discord.gg/tPWjMwK) | [Bluesky](https://bsky.app/profile/tinybiggames.com) | [Reddit](https://www.reddit.com/r/Phinx/) | [Hugging Face](https://huggingface.co/tinybiggames/Phinx)
### A High-Performance AI Inference Library for ONNX and Phi-4
**Phinx** is an advanced AI inference library that leverages **ONNX Runtime GenAI** and the **Phi-4 Multimodal ONNX** model for fast, efficient, and scalable AI applications. Designed for developers seeking seamless integration of generative and multimodal AI, Phinx offers an optimized and flexible runtime environment with robust performance.
## Key Features
- **ONNX-Powered Inference** - Efficient execution of Phi-4 models using ONNX Runtime GenAI.
- **Multimodal AI** - Supports text, image, and multi-input inference for diverse AI tasks.
- **Optimized Performance** - Accelerated inference leveraging ONNX optimizations for speed and efficiency.
- **Developer-Friendly API** - Simple yet powerful APIs for easy integration into **Delphi, Python, and other platforms**.
- **Self-Contained & Virtualized** - The `Phinx.model` file acts as a **virtual folder**, bundling **Phi-4 ONNX model files** and all dependencies into a single, portable format.
Phinx is ideal for AI research, creative applications, and production-ready generative AI solutions. Whether you're building **chatbots, AI-powered content generation tools, or multimodal assistants**, Phinx delivers the **speed and flexibility** you need!
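Since the public API is still being documented, here is a minimal Delphi sketch of what loading the bundled model and running a text prompt could look like. The class `TPhinx` and its `LoadModel`/`RunInference` methods are hypothetical placeholders, not the actual Phinx API; consult `UTestbed.pas` in the repository for real usage.

```pascal
program PhinxUsageSketch;

{$APPTYPE CONSOLE}

uses
  System.SysUtils;

type
  // Hypothetical placeholder for the library's inference object; the
  // real class name and methods may differ. See UTestbed.pas for
  // actual usage.
  TPhinx = class
  public
    function LoadModel(const AFilename: string): Boolean;
    function RunInference(const APrompt: string): string;
  end;

function TPhinx.LoadModel(const AFilename: string): Boolean;
begin
  // In the real library this would open the Phinx.model container and
  // initialize ONNX Runtime GenAI on the GPU; here we only check the file.
  Result := FileExists(AFilename);
end;

function TPhinx.RunInference(const APrompt: string): string;
begin
  // Stub: the real library would generate tokens from the Phi-4 model.
  Result := '<generated text for: ' + APrompt + '>';
end;

var
  LPhinx: TPhinx;
begin
  LPhinx := TPhinx.Create;
  try
    if LPhinx.LoadModel('C:/LLM/PHINX/repo/Phinx.model') then
      WriteLn(LPhinx.RunInference('Describe ONNX Runtime in one sentence.'))
    else
      WriteLn('Model file not found.');
  finally
    LPhinx.Free;
  end;
end.
```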
## Phinx Model File Format (`Phinx.model`)
The **Phinx.model** format is a specialized file structure for storing **ONNX-based machine learning models**, optimized for **CUDA-powered inference**. It encapsulates all essential components, ensuring seamless model execution.
### Key Benefits
1. **Self-Contained & Virtualized**
- Acts as a **virtual folder** within the application.
- Bundles **Phi-4 ONNX model files** and dependencies for portability.
2. **Optimized for CUDA Inference**
- Designed for **GPU acceleration**, delivering high-performance AI execution.
- Ensures fast loading and efficient CUDA computations.
3. **Structured & Extensible**
- Stores **model weights, metadata, configuration parameters, and dependencies** in a well-organized manner.
- Future-proof design allows for additional configurations and optimizations.
4. **Simplified Deployment**
- All required files are consolidated into a single **`.model`** file.
- Eliminates external dependency management for **plug-and-play usability**.
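The source does not document the actual on-disk layout of `Phinx.model`, but the "single file as virtual folder" idea is a standard container technique: a fixed header, a table of contents mapping logical paths to byte offsets, and the raw payloads. A purely illustrative Pascal sketch of such a scheme (none of these record names or fields come from Phinx):

```pascal
type
  // Illustrative only: NOT the actual Phinx.model layout, just a
  // generic single-file "virtual folder" container scheme.
  TContainerHeader = packed record
    Magic: array[0..3] of AnsiChar;  // file signature identifying the format
    Version: UInt32;                 // format version, allows future extensions
    EntryCount: UInt32;              // number of bundled files
    TocOffset: UInt64;               // byte offset of the table of contents
  end;

  TContainerEntry = packed record
    Path: array[0..255] of AnsiChar; // logical path, e.g. a weights or config file
    Offset: UInt64;                  // byte offset of this file's payload
    Size: UInt64;                    // payload size in bytes
  end;
```

Under this kind of scheme, a loader reads the header, seeks to the table of contents, and can then hand each bundled file (weights, tokenizer configuration, metadata) to the runtime without extracting anything to disk.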
## Getting Started
### System Requirements
- **GPU:** CUDA-compatible NVIDIA GPU with **8-12 GB VRAM**.
- **Storage:** At least **7 GB** of free disk space.
### Download Model
Get the **Phinx Model** from Hugging Face:
[Download Phinx Model](https://huggingface.co/tinybiggames/Phinx/resolve/main/Phinx.model?download=true)
### Setup Instructions
1. Place the downloaded model in your preferred directory (for example, `C:/LLM/PHINX/repo`).
2. Use a **Delphi version that supports Win64 and Unicode**. Phinx was developed with **Delphi 12.2** and tested on **Windows 11 (24H2)**.
3. Refer to `UTestbed.pas` for usage notes and check the included examples.
## Project Status
> **Note:** This repository is currently in the setup phase. While documentation is being prepared, the **code is fully functional and stable**. Stay tuned: this README and additional resources will be updated soon!
## Media
### Deep Dive Podcast
Discover in-depth discussions and insights about Phinx and its innovative features.
### Phinx Feature Videos
Explore videos showcasing the capabilities of the Phinx library, including tutorials, demonstrations, and real-world applications.
https://github.com/user-attachments/assets/d58bc2d1-bef5-458d-9377-6b1235c51972
https://github.com/user-attachments/assets/c166106a-4266-4b0f-9d95-42d1b2cc0921
https://github.com/user-attachments/assets/34bbac1d-e4f6-47d2-8cfc-78f702153144
## Support and Resources
- **Report Issues:** [Issue Tracker](https://github.com/tinyBigGAMES/Phinx/issues)
- **Join the Community:** [Forum](https://github.com/tinyBigGAMES/Phinx/discussions) | [Discord](https://discord.gg/tPWjMwK)
- **Learn Delphi:** [Learn Delphi](https://learndelphi.org)
## Contributing
Contributions to **Phinx** are highly encouraged!
Ways to contribute:
- **Report Bugs:** Help us improve by submitting issues.
- **Suggest Features:** Share ideas to enhance Phinx.
- **Create Pull Requests:** Improve the library's capabilities.
### Contributors
## License
Phinx is distributed under the **BSD-3-Clause License**, allowing redistribution and use in both source and binary forms, with or without modification.
See the [LICENSE](https://github.com/tinyBigGAMES/Phinx?tab=BSD-3-Clause-1-ov-file#BSD-3-Clause-1-ov-file) for more details.
## Support & Sponsorship
If you find **Phinx** useful, please consider [sponsoring this project](https://github.com/sponsors/tinyBigGAMES). Your support helps sustain development, improve features, and keep the project thriving.
### Other ways to contribute:
- **Star the repo** - It helps increase visibility.
- **Spread the word** - Share Phinx with your network.
- **Report bugs** - Help identify issues.
- **Submit fixes** - Found a bug? Fix it and contribute!
- **Suggest enhancements** - Share ideas for improvements.
Every contribution, big or small, helps make **Phinx** better. Thank you for your support!
---
Phinx - Powering AI with Phi-4, ONNX & CUDA, seamlessly and efficiently!
Made with ❤️ in Delphi