{"id":15647252,"url":"https://github.com/ankane/onnxruntime-php","last_synced_at":"2025-11-17T14:18:17.276Z","repository":{"id":57871568,"uuid":"528683671","full_name":"ankane/onnxruntime-php","owner":"ankane","description":"Run ONNX models in PHP ","archived":false,"fork":false,"pushed_at":"2025-09-30T18:21:17.000Z","size":75,"stargazers_count":122,"open_issues_count":0,"forks_count":7,"subscribers_count":6,"default_branch":"master","last_synced_at":"2025-09-30T20:18:27.622Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"PHP","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ankane.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2022-08-25T03:43:48.000Z","updated_at":"2025-09-30T18:21:19.000Z","dependencies_parsed_at":"2023-12-29T02:24:41.411Z","dependency_job_id":"3a591a86-26b3-4736-abe7-a0118e986c7a","html_url":"https://github.com/ankane/onnxruntime-php","commit_stats":{"total_commits":24,"total_committers":1,"mean_commits":24.0,"dds":0.0,"last_synced_commit":"1f530348940c75bd51577ecac70592c386dba05a"},"previous_names":[],"tags_count":18,"template":false,"template_full_name":null,"purl":"pkg:github/ankane/onnxruntime-php","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ankane%2Fonnxruntime-php","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ankane%2Fonnxruntime-php/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repo
sitories/ankane%2Fonnxruntime-php/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ankane%2Fonnxruntime-php/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ankane","download_url":"https://codeload.github.com/ankane/onnxruntime-php/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ankane%2Fonnxruntime-php/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":284894838,"owners_count":27080782,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-11-17T02:00:06.431Z","response_time":55,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-10-03T12:17:49.566Z","updated_at":"2025-11-17T14:18:17.259Z","avatar_url":"https://github.com/ankane.png","language":"PHP","funding_links":[],"categories":["Interop \u0026 Model Serving","Learning Resources"],"sub_categories":["Tokenizers \u0026 Prompt Utilities","Utilities \u0026 Tools"],"readme":"# ONNX Runtime PHP\n\n:fire: [ONNX Runtime](https://github.com/Microsoft/onnxruntime) - the high performance scoring engine for ML models - for PHP\n\nCheck out [an example](https://ankane.org/tensorflow-php)\n\n[![Build Status](https://github.com/ankane/onnxruntime-php/actions/workflows/build.yml/badge.svg)](https://github.com/ankane/onnxruntime-php/actions)\n\n## 
Installation\n\nRun:\n\n```sh\ncomposer require ankane/onnxruntime\n```\n\nAdd scripts to `composer.json` to download the shared library:\n\n```json\n    \"scripts\": {\n        \"post-install-cmd\": \"OnnxRuntime\\\\Vendor::check\",\n        \"post-update-cmd\": \"OnnxRuntime\\\\Vendor::check\"\n    }\n```\n\nAnd run:\n\n```sh\ncomposer install\n```\n\n## Getting Started\n\nLoad a model and make predictions\n\n```php\n$model = new OnnxRuntime\\Model('model.onnx');\n$model-\u003epredict(['x' =\u003e [1, 2, 3]]);\n```\n\n\u003e Download pre-trained models from the [ONNX Model Zoo](https://github.com/onnx/models)\n\nGet inputs\n\n```php\n$model-\u003einputs();\n```\n\nGet outputs\n\n```php\n$model-\u003eoutputs();\n```\n\nGet metadata\n\n```php\n$model-\u003emetadata();\n```\n\nLoad a model from a stream\n\n```php\n$stream = fopen('model.onnx', 'rb');\n$model = new OnnxRuntime\\Model($stream);\n```\n\nGet specific outputs\n\n```php\n$model-\u003epredict(['x' =\u003e [1, 2, 3]], outputNames: ['label']);\n```\n\n## Session Options\n\n```php\nuse OnnxRuntime\\ExecutionMode;\nuse OnnxRuntime\\GraphOptimizationLevel;\n\nnew OnnxRuntime\\Model(\n    $path,\n    enableCpuMemArena: true,\n    enableMemPattern: true,\n    enableProfiling: false,\n    executionMode: ExecutionMode::Sequential, // or Parallel\n    freeDimensionOverridesByDenotation: null,\n    freeDimensionOverridesByName: null,\n    graphOptimizationLevel: GraphOptimizationLevel::None, // or Basic, Extended, All\n    interOpNumThreads: null,\n    intraOpNumThreads: null,\n    logSeverityLevel: 2,\n    logVerbosityLevel: 0,\n    logid: 'tag',\n    optimizedModelFilepath: null,\n    profileFilePrefix: 'onnxruntime_profile_',\n    sessionConfigEntries: null\n);\n```\n\n## Run Options\n\n```php\n$model-\u003epredict(\n    $inputFeed,\n    outputNames: null,\n    logSeverityLevel: 2,\n    logVerbosityLevel: 0,\n    logid: 'tag',\n    terminate: false\n);\n```\n\n## Inference Session API\n\nYou can also use the 
Inference Session API, which follows the [Python API](https://onnxruntime.ai/docs/api/python/api_summary.html).\n\n```php\n$session = new OnnxRuntime\\InferenceSession('model.onnx');\n$session-\u003erun(null, ['x' =\u003e [1, 2, 3]]);\n```\n\nThe Python example models are included as well.\n\n```php\nOnnxRuntime\\Datasets::example('sigmoid.onnx');\n```\n\n## GPU Support\n\n### Linux and Windows\n\nDownload the appropriate [GPU release](https://github.com/microsoft/onnxruntime/releases) and set:\n\n```php\nOnnxRuntime\\FFI::$lib = 'path/to/lib/libonnxruntime.so'; // onnxruntime.dll for Windows\n```\n\nand use:\n\n```php\n$model = new OnnxRuntime\\Model('model.onnx', providers: ['CUDAExecutionProvider']);\n```\n\n### Mac\n\nUse:\n\n```php\n$model = new OnnxRuntime\\Model('model.onnx', providers: ['CoreMLExecutionProvider']);\n```\n\n## History\n\nView the [changelog](https://github.com/ankane/onnxruntime-php/blob/master/CHANGELOG.md)\n\n## Contributing\n\nEveryone is encouraged to help improve this project. Here are a few ways you can help:\n\n- [Report bugs](https://github.com/ankane/onnxruntime-php/issues)\n- Fix bugs and [submit pull requests](https://github.com/ankane/onnxruntime-php/pulls)\n- Write, clarify, or fix documentation\n- Suggest or add new features\n\nTo get started with development:\n\n```sh\ngit clone https://github.com/ankane/onnxruntime-php.git\ncd onnxruntime-php\ncomposer install\ncomposer test\n```\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fankane%2Fonnxruntime-php","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fankane%2Fonnxruntime-php","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fankane%2Fonnxruntime-php/lists"}
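As a recap of the steps above, here is a minimal end-to-end sketch (not part of the official docs): it loads one of the bundled example models, inspects its inputs and outputs, and runs a prediction. The input name `x` and the 3×4×5 shape are assumptions about the bundled sigmoid example; check `$model->inputs()` against your own model before building the input feed.

```php
<?php
// Sketch: end-to-end inference with one of the bundled example models.
// Assumes `composer require ankane/onnxruntime` has been run and the
// shared library was fetched via the OnnxRuntime\Vendor::check script.

require 'vendor/autoload.php';

// Resolve the path to a bundled example model instead of a real model file
$path = OnnxRuntime\Datasets::example('sigmoid.onnx');
$model = new OnnxRuntime\Model($path);

// Inspect the model before running it: names, types, and shapes
print_r($model->inputs());
print_r($model->outputs());

// Build an input matching the assumed input name and shape (3x4x5 here),
// then run inference
$x = array_fill(0, 3, array_fill(0, 4, array_fill(0, 5, 0.5)));
$result = $model->predict(['x' => $x]);
print_r($result);
```

The same feed works with the Inference Session API by passing it as the second argument to `$session->run(null, ['x' => $x])`.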