# ![Phinx](media/phinx.png)
[![Chat on Discord](https://img.shields.io/discord/754884471324672040?style=for-the-badge)](https://discord.gg/tPWjMwK) [![Follow on Bluesky](https://img.shields.io/badge/Bluesky-tinyBigGAMES-blue?style=for-the-badge&logo=bluesky)](https://bsky.app/profile/tinybiggames.com) [![Reddit](https://img.shields.io/badge/Reddit-Phinx-red?style=for-the-badge&logo=reddit)](https://www.reddit.com/r/Phinx/) [![Hugging Face](https://img.shields.io/badge/Hugging%20Face-Phinx-yellow?style=for-the-badge&logo=huggingface)](https://huggingface.co/tinybiggames/Phinx)

### A High-Performance AI Inference Library for ONNX and Phi-4

**Phinx** is an advanced AI inference library that leverages **ONNX Runtime GenAI** and the **Phi-4 Multimodal ONNX** model for fast, efficient, and scalable AI applications. Designed for developers seeking seamless integration of generative and multimodal AI, Phinx offers an optimized and flexible runtime environment with robust performance.

## 🚀 Key Features

- **ONNX-Powered Inference** – Efficient execution of Phi-4 models using ONNX Runtime GenAI.
- **Multimodal AI** – Supports text, image, and multi-input inference for diverse AI tasks.
- **Optimized Performance** – Accelerated inference leveraging ONNX optimizations for speed and efficiency.
- **Developer-Friendly API** – Simple yet powerful APIs for easy integration into **Delphi, Python, and other platforms**.
- **Self-Contained & Virtualized** – The `Phinx.model` file acts as a **virtual folder**, bundling **Phi-4 ONNX model files** and all dependencies into a single, portable format.

Phinx is ideal for AI research, creative applications, and production-ready generative AI solutions. Whether you're building **chatbots, AI-powered content generation tools, or multimodal assistants**, Phinx delivers the **speed and flexibility** you need!
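
To make the developer-facing API concrete, here is a hedged sketch of what a minimal multimodal query could look like in Delphi. The unit, class, and method names (`Phinx`, `TPhinx`, `LoadModel`, `AddImage`, `Prompt`, `Run`) are illustrative assumptions rather than the library's actual identifiers; consult `UTestbed.pas` and the bundled examples for the real API.

```pascal
program PhinxMultimodalSketch;

{$APPTYPE CONSOLE}

uses
  System.SysUtils,
  Phinx; // hypothetical unit name; see UTestbed.pas for the library's real units

var
  LPhinx: TPhinx; // hypothetical class name, used here only for illustration
begin
  LPhinx := TPhinx.Create;
  try
    // Load the self-contained Phinx.model file (path discussed in the setup section below).
    if not LPhinx.LoadModel('C:\LLM\PHINX\repo\Phinx.model') then
      raise Exception.Create('Failed to load Phinx.model');

    // Multimodal query: attach an image, then ask a question about it.
    LPhinx.AddImage('photo.jpg');                           // hypothetical call
    LPhinx.Prompt('Describe this image in one sentence.');  // hypothetical call
    WriteLn(LPhinx.Run);                                    // hypothetical call returning generated text
  finally
    LPhinx.Free;
  end;
end.
```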

## 📂 Phinx Model File Format (`Phinx.model`)

The **Phinx.model** format is a specialized file structure for storing **ONNX-based machine learning models**, optimized for **CUDA-powered inference**. It encapsulates all essential components, ensuring seamless model execution.

### 🔹 Key Benefits

1. **Self-Contained & Virtualized**
   - Acts as a **virtual folder** within the application.
   - Bundles **Phi-4 ONNX model files** and dependencies for portability.

2. **Optimized for CUDA Inference**
   - Designed for **GPU acceleration**, delivering high-performance AI execution.
   - Ensures fast loading and efficient CUDA computations.

3. **Structured & Extensible**
   - Stores **model weights, metadata, configuration parameters, and dependencies** in a well-organized manner.
   - Future-proof design allows for additional configurations and optimizations.

4. **Simplified Deployment**
   - All required files are consolidated into a single **`.model`** file.
   - Eliminates external dependency management for **plug-and-play usability**.
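
Because deployment reduces to shipping one file, a pre-flight check before inference can be as simple as confirming the `.model` file is present and not obviously truncated. The sketch below uses only the standard Delphi RTL; the path and the minimum-size constant are placeholders for illustration, not values published by the project.

```pascal
program VerifyPhinxModel;

{$APPTYPE CONSOLE}

uses
  System.SysUtils, System.Classes;

const
  CModelFile = 'C:\LLM\PHINX\repo\Phinx.model'; // example path from the setup section below
  // Placeholder lower bound used only to catch obviously truncated downloads;
  // not an official size published for Phinx.model.
  CMinModelBytes: Int64 = Int64(4) * 1024 * 1024 * 1024;

var
  LStream: TFileStream;
begin
  if not FileExists(CModelFile) then
  begin
    WriteLn('Phinx.model not found at ', CModelFile);
    Exit;
  end;

  // Open read-only to confirm the file is accessible, then check its size.
  LStream := TFileStream.Create(CModelFile, fmOpenRead or fmShareDenyWrite);
  try
    if LStream.Size >= CMinModelBytes then
      WriteLn('Phinx.model looks complete (', LStream.Size, ' bytes).')
    else
      WriteLn('Phinx.model looks truncated; re-download it.');
  finally
    LStream.Free;
  end;
end.
```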

## 🛠️ Getting Started

### 🔧 System Requirements

- **GPU Requirements:** CUDA-compatible NVIDIA GPU with **8–12GB VRAM**.
- **Storage Requirements:** At least **7GB** of free disk space.
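
If you want to verify the storage requirement programmatically before downloading, a quick RTL-based check is sketched below; the 7 GB threshold comes straight from the requirement above. (Checking VRAM would need vendor-specific APIs and is not shown.)

```pascal
program CheckDiskSpace;

{$APPTYPE CONSOLE}

uses
  System.SysUtils;

const
  // At least 7 GB free, per the storage requirement above.
  CRequiredBytes: Int64 = Int64(7) * 1024 * 1024 * 1024;

function HasRoomForPhinx(const ADrive: Char): Boolean;
begin
  // DiskFree expects a drive index: 1 = A:, 2 = B:, 3 = C:, ...
  Result := DiskFree(Ord(UpCase(ADrive)) - Ord('A') + 1) >= CRequiredBytes;
end;

begin
  if HasRoomForPhinx('C') then
    WriteLn('Enough free space for the Phinx model.')
  else
    WriteLn('Free up at least 7 GB before downloading.');
end.
```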

### 📥 Download Model

Get the **Phinx Model** from Hugging Face:
[📂 Download Phinx Model](https://huggingface.co/tinybiggames/Phinx/resolve/main/Phinx.model?download=true)
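
For a scripted download, the standard `System.Net.HttpClient` unit can stream the file directly to disk, as sketched below. Given the model's size, a browser or a download manager with resume support may be more practical; the destination path is simply the example path used in the setup steps.

```pascal
program DownloadPhinxModel;

{$APPTYPE CONSOLE}

uses
  System.SysUtils, System.Classes, System.Net.HttpClient;

const
  CModelURL = 'https://huggingface.co/tinybiggames/Phinx/resolve/main/Phinx.model?download=true';
  CDestFile = 'C:\LLM\PHINX\repo\Phinx.model'; // example path from the setup steps

var
  LHttp: THTTPClient;
  LFile: TFileStream;
begin
  ForceDirectories(ExtractFilePath(CDestFile));
  LHttp := THTTPClient.Create;
  try
    LFile := TFileStream.Create(CDestFile, fmCreate);
    try
      WriteLn('Downloading Phinx.model (several GB, this will take a while)...');
      // Streams the HTTP response body straight into the destination file.
      LHttp.Get(CModelURL, LFile);
      WriteLn('Done: ', CDestFile);
    finally
      LFile.Free;
    end;
  finally
    LHttp.Free;
  end;
end.
```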

### 🏗️ Setup Instructions

1. Place the downloaded model in your preferred directory, for example `C:/LLM/PHINX/repo`.
2. Use a Delphi version that supports **Win64 and Unicode** (developed with **Delphi 12.2**, tested on **Windows 11 24H2**).
3. Refer to `UTestbed.pas` for usage notes and check the included examples; a hypothetical first-run sketch follows below.
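
Putting the steps together, a first text-only run might look like the sketch below. The `Phinx` unit and `TPhinx` names are the same illustrative assumptions used in the earlier sketch, not the library's real identifiers; substitute whatever you find in `UTestbed.pas` and the examples.

```pascal
program PhinxFirstRun;

{$APPTYPE CONSOLE}

uses
  System.SysUtils,
  Phinx; // hypothetical unit name; see UTestbed.pas for the real one

const
  CModelFile = 'C:\LLM\PHINX\repo\Phinx.model'; // path from step 1

var
  LPhinx: TPhinx; // hypothetical class name
begin
  LPhinx := TPhinx.Create;
  try
    if LPhinx.LoadModel(CModelFile) then                                        // hypothetical call
    begin
      LPhinx.Prompt('Explain what ONNX Runtime GenAI does in two sentences.');  // hypothetical call
      WriteLn(LPhinx.Run);                                                      // hypothetical call
    end
    else
      WriteLn('Could not load ', CModelFile);
  finally
    LPhinx.Free;
  end;
end.
```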

## 🚧 Project Status

> **⚠️ Note:** This repository is currently in the setup phase. While documentation is being prepared, the **code is fully functional and stable**. Stay tuned; this README and additional resources will be updated soon! 🚀

## 📺 Media

### 🌊 Deep Dive Podcast

Discover in-depth discussions and insights about Phinx and its innovative features. 🚀✨

### 🎥 Phinx Feature Videos

Explore videos showcasing the powerful capabilities of the Phinx library, including tutorials, demonstrations, and real-world applications. 🎬🔥

https://github.com/user-attachments/assets/d58bc2d1-bef5-458d-9377-6b1235c51972

https://github.com/user-attachments/assets/c166106a-4266-4b0f-9d95-42d1b2cc0921

https://github.com/user-attachments/assets/34bbac1d-e4f6-47d2-8cfc-78f702153144

## 💬 Support and Resources

- 🐞 **Report Issues:** [Issue Tracker](https://github.com/tinyBigGAMES/Phinx/issues)
- 💬 **Join the Community:** [Forum](https://github.com/tinyBigGAMES/Phinx/discussions) | [Discord](https://discord.gg/tPWjMwK)
- 📚 **Learn Delphi:** [learndelphi.org](https://learndelphi.org)

## 🀝 Contributing

Contributions to **✨ Phinx** are highly encouraged! 🌟
Ways to contribute:
- 🐛 **Report Bugs:** Help us improve by submitting issues.
- 💡 **Suggest Features:** Share ideas to enhance Phinx.
- 🔧 **Create Pull Requests:** Improve the library’s capabilities.

### 🏆 Contributors



## 📜 License

Phinx is distributed under the **BSD-3-Clause License**, allowing redistribution and use in both source and binary forms, with or without modification.
See the [📜 LICENSE](https://github.com/tinyBigGAMES/Phinx?tab=BSD-3-Clause-1-ov-file#BSD-3-Clause-1-ov-file) for more details.

## 💖 Support & Sponsorship

If you find **Phinx** useful, please consider [sponsoring this project](https://github.com/sponsors/tinyBigGAMES). Your support helps sustain development, improve features, and keep the project thriving.

### Other ways to contribute:
- ⭐ **Star the repo** – It helps increase visibility.
- 📢 **Spread the word** – Share Phinx with your network.
- 🐛 **Report bugs** – Help identify issues.
- 🔧 **Submit fixes** – Found a bug? Fix it and contribute!
- 💡 **Suggest enhancements** – Share ideas for improvements.

Every contribution, big or small, helps make **Phinx** better. Thank you for your support! 🚀

---

⚑ Phinx – Powering AI with Phi-4, ONNX & CUDA, Seamlessly and Efficiently! ⚑




Made with ❀️ in Delphi