Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/llukas22/tei-client
Convenience Client for Hugging Face Text Embeddings Inference (TEI) with synchronous and asynchronous HTTP/gRPC support
- Host: GitHub
- URL: https://github.com/llukas22/tei-client
- Owner: LLukas22
- License: MIT
- Created: 2024-08-21T12:49:55.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2024-08-29T12:47:54.000Z (3 months ago)
- Last Synced: 2024-09-29T15:41:11.962Z (about 2 months ago)
- Topics: client, embeddings, grpc, http
- Language: Python
- Size: 66.4 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
README
# tei-client
[![PyPI Version](https://img.shields.io/pypi/v/tei-client.svg)](https://pypi.org/project/tei-client)
[![Supported Python Versions](https://img.shields.io/pypi/pyversions/tei-client.svg)](https://pypi.org/project/tei-client)
[![CI](https://github.com/LLukas22/tei-client/actions/workflows/CI.yml/badge.svg)](https://github.com/LLukas22/tei-client/actions)
[![coverage](https://raw.githubusercontent.com/LLukas22/tei-client/coverage/badge.svg)](https://github.com/LLukas22/tei-client/blob/coverage/coverage.xml)

Convenience Client for [Hugging Face Text Embeddings Inference (TEI)](https://github.com/huggingface/text-embeddings-inference) with synchronous and asynchronous HTTP/gRPC support.
Implements the API defined in [TEI Swagger](https://huggingface.github.io/text-embeddings-inference/).
## Installation
You can easily install `tei-client` via pip:
```shell
pip install tei-client
```

### gRPC Support

If you want to use gRPC, install `tei-client` with gRPC support:
```shell
pip install "tei-client[grpc]"
```

## Usage
## Creating a Client
### HTTP Example
To create an instance of the client, you can do the following:
```python
from tei_client import HTTPClient

url = 'http://localhost:8080'
client = HTTPClient(url)
```

Example docker server:
```shell
docker run -p 8080:80 -v ./tei_data:/data ghcr.io/huggingface/text-embeddings-inference:cpu-latest --model-id sentence-transformers/all-MiniLM-L6-v2
```

### gRPC Example
Alternatively, you can use gRPC to connect to your server:
```python
import grpc
from tei_client import GrpcClient

channel = grpc.insecure_channel('localhost:8080')
client = GrpcClient(channel)
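
# for TLS-enabled endpoints, the standard grpc API offers a secure
# channel as well (generic gRPC usage, not specific to tei-client):
#   creds = grpc.ssl_channel_credentials()
#   channel = grpc.secure_channel('your-host:443', creds)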
```

Example docker server:
```shell
docker run -p 8080:80 -v ./tei_data:/data ghcr.io/huggingface/text-embeddings-inference:cpu-latest-grpc --model-id sentence-transformers/all-MiniLM-L6-v2
```

## Embedding
To generate embeddings, you can use the following methods:
#### Single Embedding Generation
You can generate a single embedding using the `embed` method:
```python
result = client.embed("This is an example sentence")
print(result[0])
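
# the result is a batch, so result[0] is the embedding of the single
# input sentence; assuming it is a plain list of floats, its length is
# the model's embedding dimension (384 for all-MiniLM-L6-v2)
print(len(result[0]))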
```

#### Batch Embedding Generation
To generate multiple embeddings in batch mode, use the `embed` method with a list of sentences:
```python
results = client.embed(["This is an example sentence", "This is another example sentence"])
for result in results:
    print(result)
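
# a minimal follow-up sketch, assuming each embedding is a plain list
# of floats: cosine similarity between the two sentences via the stdlib
import math
a, b = results
dot = sum(x * y for x, y in zip(a, b))
norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
print(f"cosine similarity: {dot / norms:.3f}")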
```

#### Asynchronous Embedding Generation
For asynchronous embedding generation, you can use the `async_embed` method:
```python
result = await client.async_embed("This is an example sentence")
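
# `await` is only valid inside a coroutine; from synchronous code you
# can drive the call with asyncio, e.g.:
#   import asyncio
#   result = asyncio.run(client.async_embed("This is an example sentence"))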
```

## Classification
To generate classification results for a given text, you can use the following methods:
#### Basic Classification
You can classify a single text using the `classify` method:
```python
result = client.classify("This is an example sentence")
print(result[0].scores)
```

#### NLI Style Classification
For Natural Language Inference (NLI) style classification, pass a tuple to the `classify` method: the first element is the premise and the second is the hypothesis.
```python
premise = "This is an example sentence"
hypothesis = "An example was given"result = client.classify((premise, hypothesis))
print(result[0].scores)
```

#### Asynchronous and Batched Classification
The `classify` method also supports batched requests by passing a list of tuples or strings. For asynchronous classification, you can use the `async_classify` method.
```python
# Classify multiple texts in batch mode
results = client.classify(["This is an example sentence", "This is another example sentence"])
for result in results:
    print(result.scores)

# Asynchronous classification
result = await client.async_classify("This is an example sentence")
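
# batched NLI-style classification also works: pass a list of
# (premise, hypothesis) tuples instead of plain strings
nli_results = client.classify([
    ("This is an example sentence", "An example was given"),
    ("This is another example sentence", "Another example was given"),
])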
```

## Reranking
Reranking allows you to refine the order of search results based on additional information. This feature is supported by the `rerank` method.
#### Basic Reranking
You can use the `rerank` method to rerank search results with the following syntax:
```python
result = client.rerank(
    query="What is Deep Learning?",  # Search query
    texts=["Lorem ipsum", "Deep Learning is ..."]  # List of text snippets
)
print(result)
```
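
The shape of the returned entries is not spelled out above; assuming each one carries the snippet's original position as `index` and a relevance `score` (mirroring TEI's `/rerank` response), a hypothetical post-processing step to re-order the snippets could look like this:

```python
# hypothetical sketch: re-order snippets by rerank score; the `index`
# and `score` attributes are assumptions, mirroring TEI's /rerank response
texts = ["Lorem ipsum", "Deep Learning is ..."]
result = client.rerank(query="What is Deep Learning?", texts=texts)

ranked = sorted(result, key=lambda r: r.score, reverse=True)
for r in ranked:
    print(f"{r.score:.3f}  {texts[r.index]}")
```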
#### Asynchronous Reranking

For asynchronous reranking, use the `async_rerank` method:
```python
result = await client.async_rerank(
    query="What is Deep Learning?",  # Search query
    texts=["Lorem ipsum", "Deep Learning is ..."]  # List of text snippets
)
```