{"id":23932909,"url":"https://github.com/PropsAI/agentserve","last_synced_at":"2025-09-11T15:32:57.434Z","repository":{"id":259348777,"uuid":"877043330","full_name":"PropsAI/agentserve","owner":"PropsAI","description":"An SDK \u0026 CLI for hosting and managing AI agents.","archived":false,"fork":false,"pushed_at":"2024-10-24T17:19:50.000Z","size":29,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2024-10-24T18:10:49.977Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/PropsAI.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-10-23T01:44:50.000Z","updated_at":"2024-10-24T17:19:54.000Z","dependencies_parsed_at":"2024-10-24T18:15:55.356Z","dependency_job_id":null,"html_url":"https://github.com/PropsAI/agentserve","commit_stats":null,"previous_names":["propsai/agentserve"],"tags_count":3,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PropsAI%2Fagentserve","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PropsAI%2Fagentserve/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PropsAI%2Fagentserve/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PropsAI%2Fagentserve/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/PropsAI","download_url":"https://codeload.github.com/PropsAI/agentserve/tar.g
z/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":232657751,"owners_count":18556895,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-01-06T00:29:25.730Z","updated_at":"2025-01-06T00:29:53.828Z","avatar_url":"https://github.com/PropsAI.png","language":"Python","readme":"```\n   ___                __  ____\n  / _ |___ ____ ___  / /_/ __/__ _____  _____\n / __ / _ `/ -_) _ \\/ __/\\ \\/ -_) __/ |/ / -_)\n/_/ |_\\_, /\\__/_//_/\\__/___/\\__/_/  |___/\\__/\n     /___/\n```\n\n# AgentServe\n\n[![Discord](https://img.shields.io/badge/Discord-Join_Discord-blue?style=flat\u0026logo=Discord)](https://discord.gg/JkPrCnExSf)\n[![GitHub](https://img.shields.io/badge/GitHub-View_on_GitHub-blue?style=flat\u0026logo=GitHub)](https://github.com/PropsAI/agentserve)\n![License](https://img.shields.io/badge/License-MIT-blue.svg)\n![PyPI Version](https://img.shields.io/pypi/v/agentserve.svg)\n![GitHub Stars](https://img.shields.io/github/stars/PropsAI/agentserve?style=social)\n![Beta](https://img.shields.io/badge/Status-Beta-yellow)\n\nAgentServe is a lightweight framework for hosting and scaling AI agents. It is designed to be easy to use and integrate with existing projects and agent / LLM frameworks. 
It wraps your agent in a REST API and supports optional task queuing for scalability.\n\nJoin the [Discord](https://discord.gg/JkPrCnExSf) for support and discussion.\n\n## Goals and Objectives\n\nThe goal of AgentServe is to provide the easiest way to take a local agent to production and standardize the communication layer between multiple agents, humans, and other systems.\n\n## Features\n\n- **Standardized:** AgentServe provides a standardized way to communicate with AI agents via a REST API.\n- **Framework Agnostic:** AgentServe supports multiple agent frameworks (OpenAI, LangChain, LlamaIndex, and Blank).\n- **Task Queuing:** AgentServe supports optional task queuing for scalability. Choose between local, Redis, or Celery task queues based on your needs.\n- **Configurable:** AgentServe is designed to be configurable via an `agentserve.yaml` file and overridable with environment variables.\n- **Easy to Use:** AgentServe aims to be easy to integrate with existing projects and to make deployment as simple as possible.\n\n## Requirements\n\n- Python 3.9+\n\n## Installation\n\nTo install AgentServe, use pip:\n\n```bash\npip install -U agentserve\n```\n\n## Getting Started\n\nAgentServe lets you wrap your agent code in a FastAPI application and expose it via REST endpoints. Below are the steps to integrate AgentServe into your project.\n\n### 1. Install AgentServe\n\nFirst, install the `agentserve` package using pip:\n\n```bash\npip install -U agentserve\n```\n\nMake sure your virtual environment is activated if you're using one.\n\n### 2. Create or Update Your Agent\n\nWithin your entry point file (e.g. `main.py`), we will import `agentserve` and create an app instance, then decorate an agent function with `@app.agent`. 
Finally, we will call `app.run()` to start the server.\n\nThe agent function should take a single argument, `task_data`, which will be a dictionary of the data required by your agent.\n\n**Example:**\n\n```python\n# main.py\nimport agentserve\nfrom openai import OpenAI\n\napp = agentserve.app()\n\n@app.agent\ndef my_agent(task_data):\n    # Your agent logic goes here\n    client = OpenAI()\n    response = client.chat.completions.create(\n        model=\"gpt-4o-mini\",\n        messages=[{\"role\": \"user\", \"content\": task_data[\"prompt\"]}]\n    )\n    return response.choices[0].message.content\n\nif __name__ == \"__main__\":\n    app.run()\n```\n\nIn this example:\n\n- We import `agentserve` and create an app instance using `agentserve.app()`.\n- We define our agent function `my_agent` and decorate it with `@app.agent`.\n- Within the agent function, we implement our agent's logic.\n- We call `app.run()` to start the server.\n\n### 3. Run the Agent Server\n\nTo run the agent server, use the following command:\n\n```bash\npython main.py\n```\n\n### 4. Configure Task Queue (Optional)\n\nBy default, AgentServe uses a local task queue, which is suitable for development and testing. If you need more robust queue management for production, you can configure AgentServe to use Redis or Celery.\n\n**Using a Configuration File**\n\nCreate a file named `agentserve.yaml` in your project directory:\n\n```yaml\n# agentserve.yaml\n\ntask_queue: celery # Options: 'local', 'redis', 'celery'\n\ncelery:\n  broker_url: pyamqp://guest@localhost//\n```\n\n**Using Environment Variables**\n\nAlternatively, you can set configuration options using environment variables:\n\n```bash\nexport AGENTSERVE_TASK_QUEUE=celery\nexport AGENTSERVE_CELERY_BROKER_URL=pyamqp://guest@localhost//\n```\n\n### 5. Start the Worker (if using Celery or Redis)\n\nTo start the worker, use the following command:\n\n```bash\nagentserve startworker\n```\n\n### 6. 
Test the Agent\n\nWith the server and worker (if needed) running, you can test your agent using the available endpoints.\n\n**Synchronous Task Processing**\n\n`POST /task/sync`\n\n```bash\ncurl -X POST http://localhost:8000/task/sync \\\n     -H \"Content-Type: application/json\" \\\n     -d '{\"prompt\": \"Test input\"}'\n```\n\n**Asynchronous Task Processing**\n\n`POST /task/async`\n\n```bash\ncurl -X POST http://localhost:8000/task/async \\\n     -H \"Content-Type: application/json\" \\\n     -d '{\"prompt\": \"Test input\"}'\n```\n\n**Get the Status of a Task**\n\n`GET /task/status/:task_id`\n\n```bash\ncurl http://localhost:8000/task/status/1234567890\n```\n\n**Get the Result of a Task**\n\n`GET /task/result/:task_id`\n\n```bash\ncurl http://localhost:8000/task/result/1234567890\n```\n\n## Configuration Options\n\nAgentServe allows you to configure various aspects of the application using a configuration file or environment variables.\n\n#### Using `agentserve.yaml`\n\nPlace an `agentserve.yaml` file in your project directory with the desired configurations.\n\n**Example:**\n\n```yaml\n# agentserve.yaml\n\ntask_queue: celery # Options: 'local', 'redis', 'celery'\n\ncelery:\n  broker_url: pyamqp://guest@localhost//\n\nredis:\n  host: localhost\n  port: 6379\n\nserver:\n  host: 0.0.0.0\n  port: 8000\n\nqueue: # if using the local task queue\n  max_workers: 10 # default\n```\n\n#### Using Environment Variables\n\nYou can override configurations using environment variables without modifying the configuration file:\n\n- `AGENTSERVE_TASK_QUEUE`\n- `AGENTSERVE_CELERY_BROKER_URL`\n- `AGENTSERVE_REDIS_HOST`\n- `AGENTSERVE_REDIS_PORT`\n- `AGENTSERVE_SERVER_HOST`\n- `AGENTSERVE_SERVER_PORT`\n- `AGENTSERVE_QUEUE_MAX_WORKERS`\n\n**Example:**\n\n```bash\nexport AGENTSERVE_TASK_QUEUE=redis\nexport AGENTSERVE_REDIS_HOST=redis-server-host\nexport AGENTSERVE_REDIS_PORT=6379\n```\n\n### FastAPI 
Configuration\n\nYou can specify FastAPI settings, including CORS configuration, using the `fastapi` key in your `agentserve.yaml` configuration file.\n\n**Example:**\n\n```yaml\n# agentserve.yaml\n\nfastapi:\n  cors:\n    allow_origins:\n      - \"http://localhost:3000\"\n      - \"https://yourdomain.com\"\n    allow_credentials: true\n    allow_methods:\n      - \"*\"\n    allow_headers:\n      - \"*\"\n```\n\n#### Using Environment Variables\n\nAlternatively, you can set the desired configuration options using environment variables.\n\n**Example:**\n\n```bash\nexport AGENTSERVE_CORS_ORIGINS=\"http://localhost:3000,https://yourdomain.com\"\nexport AGENTSERVE_CORS_ALLOW_CREDENTIALS=\"true\"\nexport AGENTSERVE_CORS_ALLOW_METHODS=\"GET,POST\"\nexport AGENTSERVE_CORS_ALLOW_HEADERS=\"Content-Type,Authorization\"\n```\n\n## Advanced Usage\n\n### Integrating with Existing Projects\n\nYou can integrate AgentServe into your existing projects by importing `agentserve` and defining your agent function.\n\n**Example:**\n\n```python\n# main.py\nimport agentserve\n\napp = agentserve.app()\n\n@app.agent\ndef my_custom_agent(task_data):\n    # Your custom agent logic (e.g. using LangChain, LlamaIndex, etc.)\n    result = perform_complex_computation(task_data)\n    return {\"result\": result}\n\nif __name__ == \"__main__\":\n    app.run()\n```\n\n### Input Validation\n\nAgentServe allows you to validate the input to your agent function using Pydantic. 
Simply add an input schema to your agent function.\n\n**Example:**\n\n```python\n# main.py\nimport agentserve\nfrom pydantic import BaseModel\n\napp = agentserve.app()\n\nclass MyInputSchema(BaseModel):\n    prompt: str\n\n@app.agent(input_schema=MyInputSchema)\ndef my_custom_agent(task_data):\n    # Your custom agent logic; task_data has been validated against MyInputSchema\n    return {\"result\": \"Hello, world!\"}\n\nif __name__ == \"__main__\":\n    app.run()\n```\n\n## Hosting\n\nINSTRUCTIONS COMING SOON\n\n## Roadmap\n\n- [ ] Add support for streaming responses\n- [ ] Add easy instructions for more hosting options (GCP, Azure, AWS, etc.)\n- [ ] Add support for external storage for task results\n- [ ] Add support for multi-model agents\n- [ ] Add support for more agent frameworks\n\n## License\n\nThis project is licensed under the MIT License.\n\n## Contact\n\nJoin the [Discord](https://discord.gg/JkPrCnExSf) for support and discussion.\n\nFor any questions or issues, please contact Peter at peter@getprops.ai.\n","funding_links":[],"categories":["Building"],"sub_categories":["Deployment"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FPropsAI%2Fagentserve","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FPropsAI%2Fagentserve","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FPropsAI%2Fagentserve/lists"}