{"id":15919641,"url":"https://github.com/e2b-dev/fragments","last_synced_at":"2025-05-13T18:10:04.733Z","repository":{"id":247847296,"uuid":"827005317","full_name":"e2b-dev/fragments","owner":"e2b-dev","description":"Open-source Next.js template for building apps that are fully generated by AI. By E2B.","archived":false,"fork":false,"pushed_at":"2025-04-23T11:55:37.000Z","size":5084,"stargazers_count":5281,"open_issues_count":7,"forks_count":683,"subscribers_count":42,"default_branch":"main","last_synced_at":"2025-04-25T15:48:46.051Z","etag":null,"topics":["ai","ai-code-generation","anthropic","claude","claude-ai","code-interpreter","e2b","javascript","llm","nextjs","react","sandbox","typescript"],"latest_commit_sha":null,"homepage":"https://fragments.e2b.dev","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/e2b-dev.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":"CODEOWNERS","security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-07-10T20:31:32.000Z","updated_at":"2025-04-25T11:33:30.000Z","dependencies_parsed_at":"2024-07-18T07:43:44.331Z","dependency_job_id":"8196d412-c8aa-403e-b9b6-d19db9643544","html_url":"https://github.com/e2b-dev/fragments","commit_stats":{"total_commits":175,"total_committers":8,"mean_commits":21.875,"dds":"0.24571428571428566","last_synced_commit":"02fa2deedeb127eff57b758c40f4141add77d719"},"previous_names":["e2b-dev/ai-artifacts","e2b-dev/fragments"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/e2b-dev%2Ffragments","tags_url":"
https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/e2b-dev%2Ffragments/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/e2b-dev%2Ffragments/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/e2b-dev%2Ffragments/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/e2b-dev","download_url":"https://codeload.github.com/e2b-dev/fragments/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254000854,"owners_count":21997442,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","ai-code-generation","anthropic","claude","claude-ai","code-interpreter","e2b","javascript","llm","nextjs","react","sandbox","typescript"],"created_at":"2024-10-06T19:02:00.613Z","updated_at":"2025-05-13T18:10:04.678Z","avatar_url":"https://github.com/e2b-dev.png","language":"TypeScript","readme":"![E2B Fragments Preview Light](/readme-assets/fragments-light.png#gh-light-mode-only)\n![E2B Fragments Preview Dark](/readme-assets/fragments-dark.png#gh-dark-mode-only)\n\n# Fragments by E2B\n\nThis is an open-source version of apps like [Anthropic's Claude Artifacts](https://www.anthropic.com/news/claude-3-5-sonnet), Vercel [v0](https://v0.dev), or [GPT Engineer](https://gptengineer.app).\n\nPowered by the [E2B SDK](https://github.com/e2b-dev/code-interpreter).\n\n[→ Try on fragments.e2b.dev](https://fragments.e2b.dev)\n\n## Features\n\n- Based on Next.js 14 (App Router, Server Actions), shadcn/ui, TailwindCSS, Vercel AI SDK.\n- Uses the [E2B 
SDK](https://github.com/e2b-dev/code-interpreter) by [E2B](https://e2b.dev) to securely execute code generated by AI.\n- Streaming in the UI.\n- Can install and use any package from npm or pip.\n- Supported stacks ([add your own](#adding-custom-personas)):\n  - 🔸 Python interpreter\n  - 🔸 Next.js\n  - 🔸 Vue.js\n  - 🔸 Streamlit\n  - 🔸 Gradio\n- Supported LLM Providers ([add your own](#adding-custom-llm-providers)):\n  - 🔸 OpenAI\n  - 🔸 Anthropic\n  - 🔸 Google AI\n  - 🔸 Mistral\n  - 🔸 Groq\n  - 🔸 Fireworks\n  - 🔸 Together AI\n  - 🔸 Ollama\n\n**Make sure to give us a star!**\n\n\u003cimg width=\"165\" alt=\"Screenshot 2024-04-20 at 22 13 32\" src=\"https://github.com/mishushakov/llm-scraper/assets/10400064/11e2a79f-a835-48c4-9f85-5c104ca7bb49\"\u003e\n\n## Get started\n\n### Prerequisites\n\n- [git](https://git-scm.com)\n- A recent version of [Node.js](https://nodejs.org) and the npm package manager\n- [E2B API Key](https://e2b.dev)\n- LLM Provider API Key\n\n### 1. Clone the repository\n\nIn your terminal:\n\n```\ngit clone https://github.com/e2b-dev/fragments.git\n```\n\n### 2. Install the dependencies\n\nEnter the repository:\n\n```\ncd fragments\n```\n\nRun the following to install the required dependencies:\n\n```\nnpm i\n```\n\n### 3. 
Set the environment variables\n\nCreate a `.env.local` file and set the following:\n\n```sh\n# Get your API key here - https://e2b.dev/\nE2B_API_KEY=\"your-e2b-api-key\"\n\n# OpenAI API Key\nOPENAI_API_KEY=\n\n# Other providers\nANTHROPIC_API_KEY=\nGROQ_API_KEY=\nFIREWORKS_API_KEY=\nTOGETHER_API_KEY=\nGOOGLE_AI_API_KEY=\nGOOGLE_VERTEX_CREDENTIALS=\nMISTRAL_API_KEY=\nXAI_API_KEY=\n\n### Optional env vars\n\n# Domain of the site\nNEXT_PUBLIC_SITE_URL=\n\n# Rate limit\nRATE_LIMIT_MAX_REQUESTS=\nRATE_LIMIT_WINDOW=\n\n# Vercel/Upstash KV (short URLs, rate limiting)\nKV_REST_API_URL=\nKV_REST_API_TOKEN=\n\n# Supabase (auth)\nSUPABASE_URL=\nSUPABASE_ANON_KEY=\n\n# PostHog (analytics)\nNEXT_PUBLIC_POSTHOG_KEY=\nNEXT_PUBLIC_POSTHOG_HOST=\n\n### Disabling functionality (when uncommented)\n\n# Disable API key and base URL input in the chat\n# NEXT_PUBLIC_NO_API_KEY_INPUT=\n# NEXT_PUBLIC_NO_BASE_URL_INPUT=\n\n# Hide local models from the list of available models\n# NEXT_PUBLIC_HIDE_LOCAL_MODELS=\n```\n\n### 4. Start the development server\n\n```\nnpm run dev\n```\n\n### 5. Build the web app\n\n```\nnpm run build\n```\n\n## Customize\n\n### Adding custom personas\n\n1. Make sure the [E2B CLI](https://e2b.dev/docs/cli) is installed and you're logged in.\n\n2. Add a new folder under [sandbox-templates/](sandbox-templates/)\n\n3. Initialize a new template using the E2B CLI:\n\n    ```\n    e2b template init\n    ```\n\n    This will create a new file called `e2b.Dockerfile`.\n\n4. Adjust the `e2b.Dockerfile`.\n\n    Here's an example Streamlit template:\n\n    ```Dockerfile\n    # You can use most Debian-based base images\n    FROM python:3.12-slim\n\n    RUN pip3 install --no-cache-dir streamlit pandas numpy matplotlib requests seaborn plotly\n\n    # Copy the code to the container\n    WORKDIR /home/user\n    COPY . /home/user\n    ```\n\n5. Specify a custom start command in `e2b.toml`:\n\n    ```toml\n    start_cmd = \"cd /home/user \u0026\u0026 streamlit run app.py\"\n    ```\n\n6. 
Deploy the template with the E2B CLI:\n\n    ```\n    e2b template build --name \u003ctemplate-name\u003e\n    ```\n\n    After the build has finished, you should get the following message:\n\n    ```\n    ✅ Building sandbox template \u003ctemplate-id\u003e \u003ctemplate-name\u003e finished.\n    ```\n\n7. Open [lib/templates.json](lib/templates.json) in your code editor.\n\n    Add your new template to the list. Here's an example for Streamlit:\n\n    ```json\n    \"streamlit-developer\": {\n      \"name\": \"Streamlit developer\",\n      \"lib\": [\n        \"streamlit\",\n        \"pandas\",\n        \"numpy\",\n        \"matplotlib\",\n        \"requests\",\n        \"seaborn\",\n        \"plotly\"\n      ],\n      \"file\": \"app.py\",\n      \"instructions\": \"A streamlit app that reloads automatically.\",\n      \"port\": 8501 // can be null\n    },\n    ```\n\n    Provide a template ID (as the key), a name, a list of dependencies, an entrypoint file, and an optional port. You can also add additional instructions that will be given to the LLM.\n\n8. Optionally, add a new logo under [public/thirdparty/templates](public/thirdparty/templates)\n\n### Adding custom LLM models\n\n1. Open [lib/models.ts](lib/models.ts) in your code editor.\n\n2. Add a new entry to the models list:\n\n    ```json\n    {\n      \"id\": \"mistral-large\",\n      \"name\": \"Mistral Large\",\n      \"provider\": \"Ollama\",\n      \"providerId\": \"ollama\"\n    }\n    ```\n\n    Here, `id` is the model ID, `name` is the model name (visible in the UI), `provider` is the provider name, and `providerId` is the provider tag (see [adding providers](#adding-custom-llm-providers) below).\n\n### Adding custom LLM providers\n\n1. Open [lib/models.ts](lib/models.ts) in your code editor.\n\n2. 
Add a new entry to the `providerConfigs` list:\n\n    Here's an example for Fireworks:\n\n    ```ts\n    fireworks: () =\u003e createOpenAI({ apiKey: apiKey || process.env.FIREWORKS_API_KEY, baseURL: baseURL || 'https://api.fireworks.ai/inference/v1' })(modelNameString),\n    ```\n\n3. Optionally, adjust the default structured output mode in the `getDefaultMode` function:\n\n    ```ts\n    if (providerId === 'fireworks') {\n      return 'json'\n    }\n    ```\n\n4. Optionally, add a new logo under [public/thirdparty/logos](public/thirdparty/logos)\n\n## Contributing\n\nAs an open-source project, we welcome contributions from the community. If you run into a bug or want to add an improvement, please feel free to open an issue or a pull request.\n","funding_links":[],"categories":["TypeScript","Starter","Boilerplates \u0026 Starters","HarmonyOS","Repos","Agent Integration \u0026 Deployment Tools"],"sub_categories":["Windows Manager","AI Agent Gateway"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fe2b-dev%2Ffragments","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fe2b-dev%2Ffragments","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fe2b-dev%2Ffragments/lists"}