https://github.com/parthapray/ollama_googlecolab_colabxterm_langchain
This repo contains code that uses the colabxterm and LangChain community packages to install Ollama on the Google Colab free tier (T4 GPU), pull a model from Ollama, and chat with it
- Host: GitHub
- URL: https://github.com/parthapray/ollama_googlecolab_colabxterm_langchain
- Owner: ParthaPRay
- License: apache-2.0
- Created: 2024-05-09T12:17:48.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-05-09T12:53:41.000Z (over 1 year ago)
- Last Synced: 2025-01-09T23:54:51.537Z (9 months ago)
- Topics: colabxterm, freecolab, googlecolab, langchain, ollama, ollama-interface
- Language: Jupyter Notebook
- Homepage:
- Size: 205 KB
- Stars: 7
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Ollama integration with Google Colab
1. In your Google Colab notebook, install the necessary Python packages:
!pip install colab-xterm langchain-community
2. Load the xterm extension to use a terminal within the Colab notebook:
%load_ext colabxterm
3. Open the terminal inside the Colab cell by running:
%xterm
4. Inside the xterm terminal (opened within the Colab cell):
- Type the following command to install Ollama:
curl -fsSL https://ollama.com/install.sh | sh
- Then start the Ollama server and run a model:
ollama serve & ollama run llama3
5. After leaving the xterm terminal, import the Ollama class from the langchain_community library and create an instance pointing at the model you pulled.
6. Use the llm.invoke() method to prompt the model and receive its response.
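Steps 5 and 6 can be sketched as follows. This is a minimal sketch, assuming the `ollama serve` process from step 4 is still running and serving `llama3`; the import guard is only there so the cell fails gracefully if the packages from step 1 are missing.

```python
# Minimal sketch of steps 5-6. Assumes the Ollama server started in step 4
# is still running in the xterm and that step 1 installed langchain-community.
import importlib.util

if importlib.util.find_spec("langchain_community") is not None:
    from langchain_community.llms import Ollama

    # The Ollama client talks to the local server (http://localhost:11434 by default).
    llm = Ollama(model="llama3")
    print(llm.invoke("Why is the sky blue?"))
else:
    print("langchain_community is not installed; run the pip step first.")
```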
Reference: https://www.youtube.com/watch?v=LN9rlGNaXUA&ab_channel=AkashDawari