Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/e2b-dev/ai-artifacts
Hackable open-source version of Anthropic's Claude Artifacts by E2B
ai ai-code-generation anthropic claude claude-ai code-interpreter e2b javascript llm nextjs react sandbox typescript
Last synced: 3 months ago
JSON representation
- Host: GitHub
- URL: https://github.com/e2b-dev/ai-artifacts
- Owner: e2b-dev
- License: apache-2.0
- Created: 2024-07-10T20:31:32.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-09-25T15:04:44.000Z (4 months ago)
- Last Synced: 2024-09-26T01:16:14.140Z (4 months ago)
- Topics: ai, ai-code-generation, anthropic, claude, claude-ai, code-interpreter, e2b, javascript, llm, nextjs, react, sandbox, typescript
- Language: TypeScript
- Homepage: https://artifacts.e2b.dev
- Size: 2.82 MB
- Stars: 2,107
- Watchers: 16
- Forks: 246
- Open Issues: 21
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome - e2b-dev/ai-artifacts - Hackable open-source version of Anthropic's Claude Artifacts by E2B (TypeScript)
- StarryDivineSky - e2b-dev/ai-artifacts
- awesome_ai_agents - ai-artifacts - This project implements Anthropic's Artifacts UI, using E2B's Code Interpreter SDK for secure AI code execution and Claude Sonnet 3.5 for code generation [github example](https://github.com/e2b-dev/ai-artifacts) | [reddit announcement](https://www.reddit.com/r/ClaudeAI/comments/1dmy6y2/open_source_version_of_anthropics_artifacts_ui/) | [github](https://github.com/e2b-dev/ai-artifacts) (Learning / Repositories)
README
# E2B AI Artifacts
This is an open-source version of [Anthropic's Claude Artifacts](https://www.anthropic.com/news/claude-3-5-sonnet) and Vercel's [v0](https://v0.dev).
Powered by [E2B Sandbox SDK](https://github.com/e2b-dev/e2b) and [Code Interpreter SDK](https://github.com/e2b-dev/code-interpreter). Made by the [E2B](https://e2b.dev) team.
![Preview](preview.png)
[→ Try on artifacts.e2b.dev](https://artifacts.e2b.dev)
## Features
- Based on Next.js 14 (App Router), TailwindCSS, Vercel AI SDK.
- Uses [Code Interpreter SDK](https://github.com/e2b-dev/code-interpreter) from [E2B](https://e2b.dev) to securely execute code generated by AI.
- Streaming in the UI.
- Can install and use any package from npm or pip.
- Supported stacks ([add your own](#adding-custom-personas)):
- 🔸 Python interpreter
- 🔸 Next.js
- 🔸 Vue.js
- 🔸 Streamlit
- 🔸 Gradio
- Supported LLM Providers ([add your own](#adding-custom-llm-models)):
- 🔸 OpenAI
- 🔸 Anthropic
- 🔸 Google AI
- 🔸 Mistral
- 🔸 Groq
- 🔸 Fireworks
- 🔸 Together AI
- 🔸 Ollama

**Make sure to give us a star!**
## Get started
### Prerequisites
- [git](https://git-scm.com)
- A recent version of [Node.js](https://nodejs.org) and the npm package manager
- [E2B API Key](https://e2b.dev)
- LLM Provider API Key

### 1. Clone the repository
In your terminal:
```
git clone https://github.com/e2b-dev/ai-artifacts.git
```

### 2. Install the dependencies
Enter the repository:
```
cd ai-artifacts
```

Run the following to install the required dependencies:
```
npm i
```

### 3. Set the environment variables
Create a `.env.local` file and set the following:
```sh
# Get your API key here - https://e2b.dev/
E2B_API_KEY="your-e2b-api-key"

# OpenAI API Key
OPENAI_API_KEY=

# Other providers
ANTHROPIC_API_KEY=
GROQ_API_KEY=
FIREWORKS_API_KEY=
TOGETHER_AI_API_KEY=
GOOGLE_AI_API_KEY=
MISTRAL_API_KEY=
```

### 4. Start the development server
```
npm run dev
```

### 5. Build the web app
```
npm run build
```

## Customize
### Adding custom personas
1. Make sure [E2B CLI](https://e2b.dev/docs/cli/installation) is installed and you're logged in.
2. Add a new folder under [sandbox-templates/](sandbox-templates/)
3. Initialize a new template using E2B CLI:
```
e2b template init
```

This will create a new file called `e2b.Dockerfile`.
4. Adjust the `e2b.Dockerfile`
Here's an example streamlit template:
```Dockerfile
# You can use most Debian-based base images
FROM python:3.10-slim

RUN pip3 install --no-cache-dir streamlit pandas numpy matplotlib requests seaborn plotly
# Copy the code to the container
WORKDIR /home/user
COPY . /home/user
```

5. Specify a custom start command in `e2b.toml`:
```toml
start_cmd = "cd /home/user && streamlit run app.py"
```

6. Deploy the template with the E2B CLI:
```
e2b template build --name <template-name>
```

After the build has finished, you should get the following message:
```
✅ Building sandbox template finished.
```

7. Open [lib/templates.json](lib/templates.json) in your code editor.
Add your new template to the list. Here's an example for Streamlit:
```json
"streamlit-developer": {
"name": "Streamlit developer",
"lib": [
"streamlit",
"pandas",
"numpy",
"matplotlib",
"requests",
"seaborn",
"plotly"
],
"file": "app.py",
"instructions": "A streamlit app that reloads automatically.",
"port": 8501 // can be null
},
```

Provide a template id (as the key), a name, a list of dependencies, an entrypoint file, and an optional port. You can also add instructions that will be passed to the LLM.
8. Optionally, add a new logo under [public/thirdparty/templates](public/thirdparty/templates)
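The shape of a template entry can also be sketched as a TypeScript type. This is an illustration only: the `Template` type name is hypothetical, while the field names follow the JSON example above.

```typescript
// Sketch of the shape of an entry in lib/templates.json.
// The Template type name is hypothetical; field names follow the JSON example.
type Template = {
  name: string         // display name shown in the UI
  lib: string[]        // packages preinstalled in the sandbox template
  file: string         // entrypoint file the generated code is written to
  instructions: string // extra guidance given to the LLM
  port: number | null  // port the app listens on, or null if none
}

const templates: Record<string, Template> = {
  'streamlit-developer': {
    name: 'Streamlit developer',
    lib: ['streamlit', 'pandas', 'numpy', 'matplotlib', 'requests', 'seaborn', 'plotly'],
    file: 'app.py',
    instructions: 'A streamlit app that reloads automatically.',
    port: 8501,
  },
}
```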
### Adding custom LLM models
1. Open [lib/models.json](lib/models.json) in your code editor.
2. Add a new entry to the models list:
```json
{
"id": "mistral-large",
"name": "Mistral Large",
"provider": "Ollama",
"providerId": "ollama"
}
```

Here, `id` is the model id, `name` is the model name (visible in the UI), `provider` is the provider name, and `providerId` is the provider tag (see [adding providers](#adding-custom-llm-providers) below).
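For illustration, here is a minimal sketch of how such entries can be grouped by provider tag, e.g. for a grouped model picker. The `LLMModel` type and `modelsByProvider` helper are hypothetical names, and the second model entry is an example, not part of the repository.

```typescript
// Sketch of the shape of a lib/models.json entry plus a grouping helper.
// The LLMModel type and modelsByProvider names are hypothetical; the second
// model entry is illustrative only.
type LLMModel = { id: string; name: string; provider: string; providerId: string }

const models: LLMModel[] = [
  { id: 'mistral-large', name: 'Mistral Large', provider: 'Ollama', providerId: 'ollama' },
  { id: 'gpt-4o', name: 'GPT-4o', provider: 'OpenAI', providerId: 'openai' },
]

// Group models by provider tag, preserving insertion order.
function modelsByProvider(list: LLMModel[]): Map<string, LLMModel[]> {
  const grouped = new Map<string, LLMModel[]>()
  for (const m of list) {
    const bucket = grouped.get(m.providerId) ?? []
    bucket.push(m)
    grouped.set(m.providerId, bucket)
  }
  return grouped
}
```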
### Adding custom LLM providers
1. Open [lib/models.ts](lib/models.ts) in your code editor.
2. Add a new entry to the `providerConfigs` list:
Example for fireworks:
```ts
fireworks: () => createOpenAI({ apiKey: apiKey || process.env.FIREWORKS_API_KEY, baseURL: baseURL || 'https://api.fireworks.ai/inference/v1' })(modelNameString),
```

3. Optionally, adjust the default structured output mode in the `getDefaultMode` function:
```ts
if (providerId === 'fireworks') {
return 'json'
}
```

4. Optionally, add a new logo under [public/thirdparty/logos](public/thirdparty/logos)
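The default-mode selection above can be sketched as a simple lookup with a fallback. This is a simplified illustration: `providerDefaults` is a hypothetical name, and only the `fireworks` override comes from the example above.

```typescript
// Simplified sketch of per-provider default structured-output mode selection.
// providerDefaults is a hypothetical name; only the fireworks entry mirrors
// the getDefaultMode example above.
type Mode = 'auto' | 'json'

const providerDefaults: Record<string, Mode> = {
  fireworks: 'json',
}

function getDefaultMode(providerId: string): Mode {
  // Providers without an explicit override fall back to 'auto'.
  return providerDefaults[providerId] ?? 'auto'
}
```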
## Contributing
As an open-source project, we welcome contributions from the community. If you encounter any bugs or want to suggest improvements, please feel free to open an issue or a pull request.