{"id":28255844,"url":"https://github.com/gerome-elassaad/codingit","last_synced_at":"2026-04-26T16:32:48.350Z","repository":{"id":294012613,"uuid":"976033730","full_name":"Gerome-Elassaad/CodingIT","owner":"Gerome-Elassaad","description":"CodinIT.dev Demo | Open-source, AI app builder prototype 🌟 Star to support the project!","archived":false,"fork":false,"pushed_at":"2025-12-26T14:47:03.000Z","size":27011,"stargazers_count":151,"open_issues_count":4,"forks_count":67,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-12-28T02:39:37.559Z","etag":null,"topics":["ai-coding","bolt-new","coding","codinit-dev","cursor","e2b-dev","lovable","lovable-dev","open-source","v0-dev","v0dev","vercel","vercel-ai-sdk"],"latest_commit_sha":null,"homepage":"https://codingit.vercel.app","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Gerome-Elassaad.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":".github/CONTRIBUTING.md","funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":".github/CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":".github/CODEOWNERS","security":".github/SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null},"funding":{"github":"gerome-elassaad"}},"created_at":"2025-05-01T11:10:58.000Z","updated_at":"2025-12-27T12:20:18.000Z","dependencies_parsed_at":null,"dependency_job_id":"74605640-2c66-4110-98a5-2130859fa024","html_url":"https://github.com/Gerome-Elassaad/CodingIT","commit_stats":null,"previous_names":["gerome-elassaad/codingit"],"tags_count":62,"template":false,"template_full_name":null,"purl":"pkg:github/Gerome-Elassaad/Co
dingIT","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Gerome-Elassaad%2FCodingIT","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Gerome-Elassaad%2FCodingIT/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Gerome-Elassaad%2FCodingIT/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Gerome-Elassaad%2FCodingIT/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Gerome-Elassaad","download_url":"https://codeload.github.com/Gerome-Elassaad/CodingIT/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Gerome-Elassaad%2FCodingIT/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":32305035,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-26T09:34:17.070Z","status":"ssl_error","status_checked_at":"2026-04-26T09:34:00.993Z","response_time":129,"last_error":"SSL_read: unexpected eof while 
reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai-coding","bolt-new","coding","codinit-dev","cursor","e2b-dev","lovable","lovable-dev","open-source","v0-dev","v0dev","vercel","vercel-ai-sdk"],"created_at":"2025-05-19T22:14:47.099Z","updated_at":"2026-04-26T16:32:48.343Z","avatar_url":"https://github.com/Gerome-Elassaad.png","language":"TypeScript","readme":"![opengraph](https://github.com/user-attachments/assets/de684e88-a65c-42ea-b067-d1a3bc85a420)\n\n\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://e2b.dev/startups\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/SPONSORED%20BY-E2B%20FOR%20STARTUPS-32CD32?style=for-the-badge\" alt=\"SPONSORED BY E2B FOR STARTUPS\" /\u003e\n  \u003c/a\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://huntscreens.com/en/products/codinit\" target=\"_blank\" title=\"Featured on HuntScreens\" aria-label=\"Featured on HuntScreens\"\u003e\n  \u003cimg src=\"https://shot.huntscreens.com/badge.svg\" alt=\"Featured on HuntScreens\" width=\"240\" height=\"60\" loading=\"lazy\" /\u003e\n\u003c/a\u003e\n\u003c/p\u003e\n\n\n## USE NEW DESKTOP APP: [HERE](https://github.com/codinit-dev/codinit-dev)\n\n## Features\n\n- Based on Next.js 14 (App Router, Server Actions), shadcn/ui, TailwindCSS, Vercel AI SDK.\n- Uses the [E2B SDK](https://github.com/e2b-dev/code-interpreter) by [E2B](https://e2b.dev) to securely execute code generated by AI.\n- Streaming in the UI.\n- Can install and use any package from npm, pip.\n- Supported stacks ([add your 
own](#adding-custom-personas)):\n  - 🔸 Python data analyst\n  - 🔸 Next.js\n  - 🔸 Vue.js\n  - 🔸 Streamlit\n  - 🔸 Gradio\n  - 🔸 CodinIT Engineer\n- Supported LLM Providers ([add your own](#adding-custom-llm-models)):\n  - 🔸 OpenAI\n  - 🔸 Anthropic\n  - 🔸 Google Generative AI\n  - 🔸 Google Vertex AI\n  - 🔸 Mistral\n  - 🔸 Groq\n  - 🔸 Fireworks\n  - 🔸 Together AI\n  - 🔸 Ollama\n  - 🔸 xAI\n  - 🔸 DeepSeek\n\n## Get started\n\n### Prerequisites\n\n- [git](https://git-scm.com)\n- A recent version of [Node.js](https://nodejs.org) with the npm package manager\n- [E2B API Key](https://e2b.dev)\n- An API key for at least one supported LLM provider\n\n### 1. Clone the repository\n\nIn your terminal:\n\n```\ngit clone https://github.com/Gerome-Elassaad/CodingIT.git\n```\nIf you are working from a fork, replace `Gerome-Elassaad/CodingIT.git` with your fork's details.\n\n### 2. Install the dependencies\n\nNavigate into the cloned project directory (if you're not already in it) and run the following to install the required dependencies:\n\n```\nnpm i\n```\n\n### 3. 
Set the environment variables\n\nCreate a `.env.local` file and set the following:\n\n```sh\n# Get your API key here - https://e2b.dev/\nE2B_API_KEY=\"your-e2b-api-key\"\n\n# OpenAI API Key\nOPENAI_API_KEY=\n\n# Other providers\nANTHROPIC_API_KEY=\nGROQ_API_KEY=\nFIREWORKS_API_KEY=\nTOGETHER_API_KEY=\nGOOGLE_AI_API_KEY=\nGOOGLE_VERTEX_CREDENTIALS=\nMISTRAL_API_KEY=\nXAI_API_KEY=\nDEEPSEEK_API_KEY=\n\n# Google Auth (for Google Vertex AI; GOOGLE_AI_API_KEY and GOOGLE_VERTEX_CREDENTIALS are set above)\nGOOGLE_CLIENT_ID=\n\n# GitHub OAuth (for GitHub login)\nGITHUB_CLIENT_ID=\nGITHUB_CLIENT_SECRET=\nPRIVATE_KEY_PEM=\n\n# Domain of the site\nNEXT_PUBLIC_SITE_URL=\n\n# Rate limit\nRATE_LIMIT_MAX_REQUESTS=\nRATE_LIMIT_WINDOW=\n\n# Vercel/Upstash KV (short URLs, rate limiting)\nKV_REST_API_URL=\nKV_REST_API_TOKEN=\n\n# Supabase (auth)\nSUPABASE_URL=\nSUPABASE_ANON_KEY=\n\n# PostHog (analytics)\nNEXT_PUBLIC_POSTHOG_KEY=\nNEXT_PUBLIC_POSTHOG_HOST=\n\n### Disabling functionality (takes effect when uncommented)\n\n# Disable API key and base URL input in the chat\n# NEXT_PUBLIC_NO_API_KEY_INPUT=\n# NEXT_PUBLIC_NO_BASE_URL_INPUT=\n\n# Hide local models from the list of available models\n# NEXT_PUBLIC_HIDE_LOCAL_MODELS=\n```\n\n### 4. Start the development server\n\n```\nnpm run dev\n```\n\n### 5. Build the web app\n\n```\nnpm run build\n```\n\n## Customize\n\n### Adding custom personas\n\n1. Make sure the [E2B CLI](https://e2b.dev/docs/cli) is installed and you're logged in.\n\n2. Add a new folder under [sandbox-templates/](sandbox-templates/)\n\n3. Initialize a new template using the E2B CLI:\n\n    ```\n    e2b template init\n    ```\n\n    This will create a new file called `e2b.Dockerfile`.\n\n4. 
Adjust the `e2b.Dockerfile`\n\n    Here's an example Streamlit template:\n\n    ```Dockerfile\n    # You can use most Debian-based base images\n    FROM python:3.12-slim\n\n    RUN pip3 install --no-cache-dir streamlit pandas numpy matplotlib requests seaborn plotly\n\n    # Copy the code to the container\n    WORKDIR /home/user\n    COPY . /home/user\n    ```\n\n5. Specify a custom start command in `e2b.toml`:\n\n    ```toml\n    start_cmd = \"cd /home/user \u0026\u0026 streamlit run app.py\"\n    ```\n\n6. Deploy the template with the E2B CLI:\n\n    ```\n    e2b template build --name \u003ctemplate-name\u003e\n    ```\n\n    After the build has finished, you should get the following message:\n\n    ```\n    ✅ Building sandbox template \u003ctemplate-id\u003e \u003ctemplate-name\u003e finished.\n    ```\n\n7. Open [lib/templates.json](lib/templates.json) in your code editor.\n\n    Add your new template to the list. Here's an example for Streamlit:\n\n    ```json\n    \"streamlit-developer\": {\n      \"name\": \"Streamlit developer\",\n      \"lib\": [\n        \"streamlit\",\n        \"pandas\",\n        \"numpy\",\n        \"matplotlib\",\n        \"requests\",\n        \"seaborn\",\n        \"plotly\"\n      ],\n      \"file\": \"app.py\",\n      \"instructions\": \"A Streamlit app that reloads automatically.\",\n      \"port\": 8501\n    },\n    ```\n\n    Provide a template id (as the key), a name, a list of dependencies, an entrypoint file, and an optional port (`null` if the template exposes no port). You can also add additional instructions that will be given to the LLM.\n\n8. Optionally, add a new logo under [public/thirdparty/templates](public/thirdparty/templates)\n\n### Adding custom LLM models\n\n1. Open [lib/models.json](lib/models.json) in your code editor.\n\n2. 
Add a new entry to the models list:\n\n    ```json\n    {\n      \"id\": \"mistral-large\",\n      \"name\": \"Mistral Large\",\n      \"provider\": \"Ollama\",\n      \"providerId\": \"ollama\"\n    }\n    ```\n\n    Here `id` is the model ID, `name` is the model name shown in the UI, `provider` is the provider's display name, and `providerId` is the provider tag (see [adding providers](#adding-custom-llm-providers) below).\n\n### Adding custom LLM providers\n\n1. Open [lib/models.ts](lib/models.ts) in your code editor.\n\n2. Add a new entry to the `providerConfigs` list:\n\n    Here's an example for Fireworks:\n\n    ```ts\n    fireworks: () =\u003e createOpenAI({ apiKey: apiKey || process.env.FIREWORKS_API_KEY, baseURL: baseURL || 'https://api.fireworks.ai/inference/v1' })(modelNameString),\n    ```\n\n3. Optionally, adjust the default structured output mode in the `getDefaultMode` function:\n\n    ```ts\n    if (providerId === 'fireworks') {\n      return 'json'\n    }\n    ```\n\n4. Optionally, add a new logo under [public/thirdparty/logos](public/thirdparty/logos)\n\n## Contributing\n\nAs an open-source project, we welcome contributions from the community. If you run into a bug or want to propose an improvement, please feel free to open an issue or pull request.\n","funding_links":["https://github.com/sponsors/gerome-elassaad"],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgerome-elassaad%2Fcodingit","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgerome-elassaad%2Fcodingit","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgerome-elassaad%2Fcodingit/lists"}