
# macai


macai (macOS AI) is a simple yet powerful native macOS AI chat client that supports most AI providers: ChatGPT, Claude, xAI (Grok), Google Gemini, Perplexity, Ollama, OpenRouter, and almost any OpenAI-compatible APIs.

## Downloads

### Manual
Download the [latest universal binary](https://github.com/Renset/macai/releases), notarized by Apple.

### Homebrew
Install the macai cask with Homebrew:
`brew install --cask macai`

### Build from source
Check out the `main` branch and open the project in Xcode 14.3 or later.

### iCloud Sync (Forks / Custom Builds)
If you want iCloud Sync to work in a fork or custom build, you must use your own CloudKit container.

1. Create a CloudKit container in your Apple Developer account.
2. Enable the iCloud capability for the macai target in Xcode, and add your container.
3. Update the `CloudKitContainerIdentifier` value in `macai/Info.plist` to your container ID.
4. Ensure your app’s bundle identifier matches the one you registered for the container.

If `CloudKitContainerIdentifier` is missing, the app falls back to the default container.
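
For reference, the `CloudKitContainerIdentifier` entry in `macai/Info.plist` would look like this (the container ID shown is a hypothetical example; substitute the ID you registered in step 1):

```xml
<key>CloudKitContainerIdentifier</key>
<string>iCloud.com.example.yourname.macai</string>
```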

## Contributions
Contributions are welcome. Check the [Issues page](https://github.com/Renset/macai/issues) for already-reported features/bugs before creating a new one.
You can also support the project financially. This support is very important to me and allows me to focus more on macai development.

Buy Me A Coffee

## Why macai
- **macOS-native and lightweight**
- **User-friendly**: simple setup, minimalist light/dark UI
- **Feature-rich**: vision, image generation, search, reasoning, import/export and more
- **iCloud Sync**: keep chats, messages, and settings in sync across devices
- **Private and secure**: no telemetry or usage tracking by macai (Note: Apple may collect anonymized telemetry when iCloud Sync is enabled)

## Run with ChatGPT, Claude, xAI or Google Gemini
To run macai with ChatGPT or Claude, you need an API token. An API token is like a password: you must obtain one before using any commercial LLM API. Most API services offer free credits when you register a new account, so you can try most of them for free.
Here is how to get an API token for each supported service:
- OpenAI: https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key
- Claude: https://docs.anthropic.com/en/api/getting-started
- Google Gemini: https://ai.google.dev/gemini-api/docs/api-key
- xAI Grok: https://docs.x.ai/docs#models
- OpenRouter: https://openrouter.ai/docs/api-reference/authentication#using-an-api-key

If you are new to LLMs and don't want to pay for tokens, take a look at Ollama. It supports dozens of open-source LLM models that can run locally on Apple M1/M2/M3/M4 Macs.

## Run with [Ollama](https://ollama.com)
Ollama is an open-source back-end for running various LLM models.
Running macai with Ollama is easy:
1. Install Ollama from the [official website](https://ollama.com)
2. Follow the installation guide
3. After installation, choose a model (llama3.1 or llama3.2 are recommended) and pull it in the terminal: `ollama pull <model>`
4. In macai settings, open the API Service tab, add a new API service (Expert mode), and select the "Ollama" type

5. Select the model and the default AI Assistant, then save
6. Test and enjoy!
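
Before testing the connection in macai, you can check from the terminal that Ollama's local HTTP API (which macai talks to) is reachable. A minimal sketch, assuming Ollama's default endpoint on port 11434:

```shell
#!/bin/sh
# Probe the local Ollama HTTP API (default endpoint; adjust OLLAMA_URL
# if you configured Ollama to listen elsewhere)
OLLAMA_URL="http://localhost:11434"
if curl -s --max-time 2 "$OLLAMA_URL/api/tags" > /dev/null 2>&1; then
  STATUS="running"
else
  STATUS="not running"
fi
echo "Ollama server is $STATUS at $OLLAMA_URL"
```

If the server is not running, start it with `ollama serve` (or launch the Ollama app) and try again.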

## System requirements
macOS 14.0 or later (both Intel and Apple silicon are supported)

## Project status
The project is in active development.

## License
[Apache-2.0](https://github.com/Renset/macai/blob/main/LICENSE.md)