# Ocular AI

Building Blocks for Search Platforms.

## Introduction

[Ocular](https://useocular.com) is a set of modules and tools that allow you to build rich, reliable, and performant Generative AI-Powered Search Platforms without the need to reinvent Search Architecture.

We help you spin up customized internal search in days, not months.

*Dashboard screenshot*

## Features

- **Google-Like Search Interface** - Find what you need.
- **App Marketplace** - Connect to all of your favorite apps.
- **Custom Connectors** - Build your own connectors to proprietary data sources.
- **Customizable Modular Infrastructure** - Bring your own custom LLMs, vector DB, and more into Ocular.
- **Governance Engine** - Role-based access control, audit logs, etc.

## Open-source vs Paid

This repo is licensed under the [Elastic License 2.0 (ELv2)](https://github.com/OcularEngineering/ocular/blob/main/LICENSE).

If you are interested in the managed Ocular Cloud or a self-hosted Enterprise offering, [book a meeting with us](https://calendly.com/louis-murerwa).

## Getting started

## Running Ocular in Docker

To run Ocular locally, you'll need to set up Docker in addition to Ocular.

### Prerequisites

First, make sure you have Docker installed on your device. You can download and install it from [here](https://docs.docker.com/get-docker/).
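A quick way to confirm Docker and the Compose plugin are available before continuing:

```sh
# Check that Docker and the Compose plugin are installed and the daemon is running
docker --version
docker compose version
docker info > /dev/null && echo "Docker daemon is running"
```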

1. Clone the Ocular repository.

```sh
git clone https://github.com/OcularEngineering/ocular.git && cd ocular
```

2. In the project root, open `env.local` and add the required OpenAI environment variables (a minimal sketch of what this file might look like follows below).

   - Required keys
     - OpenAI keys - To run Ocular, **an LLM provider must be set up in the backend**. By default, OpenAI is the LLM provider for Ocular, so please add your OpenAI keys to `env.local`. Support for other LLM providers is coming soon!
   - Optional keys
     - Apps (Gmail, Google Drive, Asana, GitHub, etc.) - To index documents from an app, its API keys have to be set in `env.local` for that specific app. Please read our docs on how to set up each app.
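   The exact variable names are defined in `env.local` and the Ocular docs; the snippet below is only an illustrative sketch, and the key names shown (`OPENAI_API_KEY`, the commented app credentials) are assumptions to be replaced with whatever the file actually expects.

   ```sh
   # Illustrative only - the real variable names live in env.local in the repo.
   # OPENAI_API_KEY is an assumed name; use the key name env.local expects.
   OPENAI_API_KEY=sk-...your-key...

   # Optional app credentials (placeholder names; see the Ocular docs per app)
   # GITHUB_ACCESS_TOKEN=...
   # GOOGLE_CLIENT_ID=...
   # GOOGLE_CLIENT_SECRET=...
   ```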

3. Run Docker Compose.

```sh
docker compose -f docker-compose.local.yml up --build --force-recreate
```

This command builds and starts the containers defined in the `docker-compose.local.yml` file. It might take a few moments to complete, depending on your computer and internet connection.

Once the `docker compose` process completes, you should have your local version of Ocular up and running within Docker containers. You can access it at `http://localhost:3001/create-account`.

Remember to keep the Docker application open as long as you're working with your local Ocular instance.
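If something doesn't come up as expected, a couple of standard Docker Compose commands (using the same compose file) can help you inspect and stop the stack:

```sh
# List the containers started from the local compose file
docker compose -f docker-compose.local.yml ps

# Follow the logs of all services (Ctrl+C to stop following)
docker compose -f docker-compose.local.yml logs -f

# Stop and remove the containers when you are done
docker compose -f docker-compose.local.yml down
```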

## Contributing

We love contributions. Check out our guide to see how to [get started](https://github.com/OcularEngineering/ocular/blob/main/CONTRIBUTING.md).

Not sure where to get started? You can:

- Join our Slack, and ask us any questions there.

## Resources
- [Docs](https://docs.useocular.com) for comprehensive documentation and guides
- [Slack](https://join.slack.com/t/ocular-ai/shared_invite/zt-2g7ka0j1c-Tx~Q46MjplNma2Sk2Ruplw) for discussion with the community and Ocular team.
- [GitHub](https://github.com/OcularEngineering/ocular/issues) for code, issues, and pull requests
- Roadmap - Coming Soon

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=OcularEngineering/ocular&type=Date)](https://star-history.com/#OcularEngineering/ocular&Date)