https://github.com/langchain-ai/langchain-postgres
LangChain abstractions backed by Postgres Backend
- Host: GitHub
- URL: https://github.com/langchain-ai/langchain-postgres
- Owner: langchain-ai
- License: mit
- Created: 2024-04-08T13:38:40.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2025-05-08T21:49:23.000Z (about 2 months ago)
- Last Synced: 2025-05-08T22:19:50.316Z (about 2 months ago)
- Topics: langchain, langchain-python, postgres, postgresql
- Language: Python
- Homepage:
- Size: 916 KB
- Stars: 181
- Watchers: 10
- Forks: 77
- Open Issues: 67
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Security: security.md
README
# langchain-postgres
[Release Notes](https://github.com/langchain-ai/langchain-postgres/releases)
[CI](https://github.com/langchain-ai/langchain-postgres/actions/workflows/ci.yml)
[License: MIT](https://opensource.org/licenses/MIT)
[Twitter](https://twitter.com/langchainai)
[Discord](https://discord.gg/6adMQxSpJS)
[Open Issues](https://github.com/langchain-ai/langchain-postgres/issues)

The `langchain-postgres` package contains implementations of core LangChain abstractions backed by `Postgres`.
The package is released under the MIT license.
Feel free to use the abstractions as provided, or modify and extend them as appropriate for your own application.
## Requirements
The package supports the [asyncpg](https://github.com/MagicStack/asyncpg) and [psycopg3](https://www.psycopg.org/psycopg3/) drivers.
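For orientation, a minimal sketch of how the two drivers typically map to SQLAlchemy-style connection URLs; the credentials, host, port, and database name below are placeholders, not values from this repository:

```python
# Placeholder SQLAlchemy-style URLs for the two supported drivers.
ASYNCPG_URL = "postgresql+asyncpg://langchain:langchain@localhost:6024/langchain"
PSYCOPG_URL = "postgresql+psycopg://langchain:langchain@localhost:6024/langchain"
```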
## Installation
```bash
pip install -U langchain-postgres
```

## Vectorstore
> [!WARNING]
> In v0.0.14+, `PGVector` is deprecated. Please migrate to `PGVectorStore`
> for improved performance and manageability.
> See the [migration guide](https://github.com/langchain-ai/langchain-postgres/blob/main/examples/migrate_pgvector_to_pgvectorstore.ipynb) for details on how to migrate from `PGVector` to `PGVectorStore`.

### Documentation
* [Quickstart](https://github.com/langchain-ai/langchain-postgres/blob/main/examples/pg_vectorstore.ipynb)
* [How-to](https://github.com/langchain-ai/langchain-postgres/blob/main/examples/pg_vectorstore_how_to.ipynb)

### Example
```python
from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore

# Replace the connection string with your own Postgres connection string
CONNECTION_STRING = "postgresql+psycopg://langchain:langchain@localhost:6024/langchain"
engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

# Replace the vector size with your own vector size
VECTOR_SIZE = 768
embedding = DeterministicFakeEmbedding(size=VECTOR_SIZE)

TABLE_NAME = "my_doc_collection"

engine.init_vectorstore_table(
    table_name=TABLE_NAME,
    vector_size=VECTOR_SIZE,
)

store = PGVectorStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
    embedding_service=embedding,
)

docs = [
    Document(page_content="Apples and oranges"),
    Document(page_content="Cars and airplanes"),
    Document(page_content="Train"),
]

store.add_documents(docs)

query = "I'd like a fruit."
docs = store.similarity_search(query)
print(docs)
```

> [!TIP]
> All synchronous functions have corresponding asynchronous functions.
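As a rough sketch of the async path, assuming the a-prefixed counterparts LangChain vector stores conventionally expose (`aadd_documents`, `asimilarity_search`) and that `ainit_vectorstore_table` and `PGVectorStore.create` mirror `init_vectorstore_table` and `create_sync`:

```python
import asyncio

from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore

# Placeholder connection string using an async-capable driver
CONNECTION_STRING = "postgresql+asyncpg://langchain:langchain@localhost:6024/langchain"
VECTOR_SIZE = 768
TABLE_NAME = "my_doc_collection"


async def main() -> None:
    engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

    # Assumed async counterparts of init_vectorstore_table / create_sync
    await engine.ainit_vectorstore_table(
        table_name=TABLE_NAME,
        vector_size=VECTOR_SIZE,
    )
    store = await PGVectorStore.create(
        engine=engine,
        table_name=TABLE_NAME,
        embedding_service=DeterministicFakeEmbedding(size=VECTOR_SIZE),
    )

    await store.aadd_documents([Document(page_content="Apples and oranges")])
    print(await store.asimilarity_search("I'd like a fruit."))


asyncio.run(main())
```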
## ChatMessageHistory

The chat message history abstraction helps to persist chat message history in a Postgres table.

`PostgresChatMessageHistory` is parameterized using a `table_name` and a `session_id`.

The `table_name` is the name of the table in the database where the chat messages will be stored.

The `session_id` is a unique identifier for the chat session. It can be assigned by the caller using `uuid.uuid4()`.

```python
import uuid

from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

# Establish a synchronous connection to the database
# (or use psycopg.AsyncConnection for async)
conn_info = ...  # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)

# Create the table schema (only needs to be done once)
table_name = "chat_history"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection
)

# Add messages to the chat history
chat_history.add_messages([
    SystemMessage(content="Meow"),
    AIMessage(content="woof"),
    HumanMessage(content="bark"),
])

print(chat_history.messages)
```
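For the async route mentioned in the comment above, a hedged sketch, assuming `acreate_tables`, `aadd_messages`, `aget_messages`, and the `async_connection` keyword mirror the sync API shown above:

```python
import asyncio
import uuid

from langchain_core.messages import AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg


async def main() -> None:
    conn_info = ...  # Fill in with your connection info
    async_connection = await psycopg.AsyncConnection.connect(conn_info)

    table_name = "chat_history"
    # Assumed async counterparts of create_tables / add_messages / get messages
    await PostgresChatMessageHistory.acreate_tables(async_connection, table_name)

    chat_history = PostgresChatMessageHistory(
        table_name,
        str(uuid.uuid4()),
        async_connection=async_connection,
    )
    await chat_history.aadd_messages([
        HumanMessage(content="bark"),
        AIMessage(content="woof"),
    ])
    print(await chat_history.aget_messages())


asyncio.run(main())
```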
## Google Cloud Integrations

[Google Cloud](https://python.langchain.com/docs/integrations/providers/google/) provides Vector Store, Chat Message History, and Data Loader integrations for [AlloyDB](https://cloud.google.com/alloydb) and [Cloud SQL](https://cloud.google.com/sql) for PostgreSQL databases via the following PyPI packages:
* [`langchain-google-alloydb-pg`](https://github.com/googleapis/langchain-google-alloydb-pg-python)
* [`langchain-google-cloud-sql-pg`](https://github.com/googleapis/langchain-google-cloud-sql-pg-python)
Using the Google Cloud integrations provides the following benefits:
- **Enhanced Security**: Securely connect to Google Cloud databases utilizing IAM for authorization and database authentication without needing to manage SSL certificates, configure firewall rules, or enable authorized networks.
- **Simplified and Secure Connections:** Connect to Google Cloud databases effortlessly using the instance name instead of complex connection strings. The integrations create a secure connection pool that can be easily shared across your application using the `engine` object (see the sketch after this list).
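As a purely illustrative sketch of that pattern (the package name is real, but the parameter values are placeholders; see the AlloyDB package's own README for the authoritative API):

```python
# Hypothetical sketch based on the langchain-google-alloydb-pg package;
# project, region, cluster, instance, and database values are placeholders.
from langchain_google_alloydb_pg import AlloyDBEngine

engine = AlloyDBEngine.from_instance(
    project_id="my-project",
    region="us-central1",
    cluster="my-cluster",
    instance="my-instance",
    database="my-database",
)
# The resulting `engine` manages a connection pool that can be shared across
# the vector store and chat history integrations in the same application.
```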

| Vector Store | Metadata filtering | Async support | Schema Flexibility | Improved metadata handling | Hybrid Search |
|--------------------------|--------------------|---------------|--------------------|----------------------------|---------------|
| Google AlloyDB | ✓ | ✓ | ✓ | ✓ | ✗ |
| Google Cloud SQL Postgres| ✓ | ✓ | ✓ | ✓ | ✗ |