{"id":20825751,"url":"https://github.com/ioriens/codellama-chat","last_synced_at":"2025-04-10T02:52:24.611Z","repository":{"id":190833598,"uuid":"683372910","full_name":"IOriens/codellama-chat","owner":"IOriens","description":"Codellama Instruct OpenAi style api.","archived":false,"fork":false,"pushed_at":"2023-08-27T03:09:02.000Z","size":286,"stargazers_count":29,"open_issues_count":1,"forks_count":4,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-03-24T04:22:25.956Z","etag":null,"topics":["codellama","continue","huggingface","openai-api"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/IOriens.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-08-26T11:07:12.000Z","updated_at":"2024-05-31T08:06:15.000Z","dependencies_parsed_at":"2025-01-18T17:43:30.021Z","dependency_job_id":"0ccf30da-7c8b-4dda-b33d-41d6d32fe59d","html_url":"https://github.com/IOriens/codellama-chat","commit_stats":null,"previous_names":["ioriens/codellama-chat"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IOriens%2Fcodellama-chat","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IOriens%2Fcodellama-chat/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IOriens%2Fcodellama-chat/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IOriens%2Fcodellama-chat/manifests","owner_url":"https://repos.ecosyste.ms/api/v1
/hosts/GitHub/owners/IOriens","download_url":"https://codeload.github.com/IOriens/codellama-chat/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248147112,"owners_count":21055479,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["codellama","continue","huggingface","openai-api"],"created_at":"2024-11-17T23:06:43.676Z","updated_at":"2025-04-10T02:52:24.584Z","avatar_url":"https://github.com/IOriens.png","language":"Python","readme":"# CodeLlama Chat API\n\nThe CodeLlama Chat Assistant is a project built on Flask and the CodeLlama AI model, designed to facilitate real-time chat with an AI assistant: users send chat messages to the API and receive the assistant's responses.\n\n![ChatGPT Next Web with Codellama](assets/chatgpt-next-web-codellama.png)\n\n## Features\n\n- Real-Time Chat Interaction: Engage in real-time conversations with the AI assistant by sending chat messages to the API.\n- Response Streaming: Responses can optionally be streamed as an event stream, providing an efficient real-time chat experience.\n- Powered by CodeLlama AI: The project leverages the CodeLlama AI model to generate the assistant's responses, delivering an intelligent chat experience.\n\n## Getting Started\n\nThe following steps will guide you through setting up and running the project in your local environment.\n\n### 1. 
Environment Setup\n\nEnsure your environment meets the following requirements:\n\n- Python 3.6 or higher\n- Flask and other required Python libraries\n\nYou can install the necessary dependencies using the following command:\n\n```bash\npip install -r requirements.txt\n```\n\n### 2. Install Prerequisites\n\nOn Windows, you also need to install the `bitsandbytes` library before running the project:\n\n```shell\npython -m pip install bitsandbytes --prefer-binary --extra-index-url=https://jllllll.github.io/bitsandbytes-windows-webui\n```\n\n### 3. Configure the Model\n\nNext, configure which CodeLlama model the project loads. In the code, locate the following section and set the model ID:\n\n```python\nmodel_id = \"codellama/CodeLlama-7b-Instruct-hf\"\n# ...\n```\n\n### 4. Launch the Project\n\nIn your terminal, use the following command to launch the Flask application:\n\n```bash\npython main.py\n```\n\nThe application will run on the default host (usually `localhost`) and port (typically `5000`). You can then interact with the AI assistant at `http://localhost:5000`.\n\n## Usage\n\n### Continue Integration\n\nType `/config` and change the model configuration as described for [ggml](https://continue.dev/docs/customization#local-models-with-ggml).\n\n### ChatGPT Next Web\n\nChange the endpoint to `http://localhost:5000`.\n\n## API Endpoints\n\n### POST `/v1/chat/completions`\n\nSend a JSON request to this endpoint to chat with the AI assistant. The request body should include the following fields:\n\n- `messages`: A list of chat messages, each with a `role` and a `content` field.\n- `stream`: A boolean indicating whether to return the response as an event stream.\n\nThe response is returned as JSON containing the AI assistant's reply.\n\n## Contribution\n\nFeel free to raise issues, provide suggestions, and contribute code. 
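The request body for `POST /v1/chat/completions` described above can be sketched as follows. This is a minimal illustration, not part of the project: the message content is made up, and the commented-out send step assumes the server from the launch step is running at `http://localhost:5000`.

```python
import json

# Minimal request body for POST /v1/chat/completions, using the two
# fields documented in the README: `messages` and `stream`.
payload = {
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
    "stream": False,
}

body = json.dumps(payload)
print(body)

# To actually send the request (server must be running, e.g. via `python main.py`):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:5000/v1/chat/completions",
#     data=body.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```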
If you encounter any issues or have suggestions for improvements, create an Issue to let us know.\n\n## License\n\nThis project is licensed under the [MIT License](LICENSE).","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fioriens%2Fcodellama-chat","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fioriens%2Fcodellama-chat","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fioriens%2Fcodellama-chat/lists"}