# tech-summary
[![PyPI version](https://badge.fury.io/py/tech-summary.svg)](https://badge.fury.io/py/tech-summary)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
[![Downloads](https://static.pepy.tech/badge/tech-summary)](https://pepy.tech/project/tech-summary)
[![LinkedIn](https://img.shields.io/badge/LinkedIn-blue)](https://www.linkedin.com/in/eugene-evstafev-716669181/)

A Python package that extracts structured summaries of technical concepts from text input.

## Overview

This package uses pattern matching to ensure consistent, reliable output, avoiding unstructured or ambiguous responses. It is useful for developers, educators, and technical writers who need concise, well-formatted explanations without manual reformatting.

## Installation

```bash
pip install tech-summary
```

## Usage

```python
from tech_summary import tech_summary

user_input = "Compare garbage collection and move semantics in programming languages."
response = tech_summary(user_input)
print(response)
```

You can also pass a LangChain LLM instance explicitly:

```python
from langchain_llm7 import ChatLLM7
from tech_summary import tech_summary

user_input = "Compare garbage collection and move semantics in programming languages."

# Use the default LLM7 backend via an explicit instance
llm = ChatLLM7()
response = tech_summary(user_input, llm=llm)
print(response)
```

You can also use another LLM provider (e.g. OpenAI, Anthropic, Google Generative AI) by passing your own instance:

```python
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic
from langchain_google_genai import ChatGoogleGenerativeAI
from tech_summary import tech_summary

user_input = "Compare garbage collection and move semantics in programming languages."

# OpenAI
llm = ChatOpenAI()
response = tech_summary(user_input, llm=llm)
print(response)

# Anthropic
llm = ChatAnthropic()
response = tech_summary(user_input, llm=llm)
print(response)

# Google Generative AI
llm = ChatGoogleGenerativeAI()
response = tech_summary(user_input, llm=llm)
print(response)
```

## Configuration

You can configure the LLM7 API key by setting the `LLM7_API_KEY` environment variable or passing it directly to the `tech_summary` function:

```python
tech_summary(user_input, api_key="your_api_key")
```
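
For the environment-variable route, a minimal sketch is shown below; the key value and prompt are placeholders, and it assumes the key is read from the environment at call time (you can equally set `LLM7_API_KEY` in your shell before running the script):

```python
import os

from tech_summary import tech_summary

# Alternative to passing api_key directly: set LLM7_API_KEY in the environment
# so the default LLM7 backend can pick it up.
os.environ["LLM7_API_KEY"] = "your_api_key"  # placeholder value

response = tech_summary("Explain reference counting in one paragraph.")
print(response)
```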

If you haven't registered for an API key, you can get one for free at https://token.llm7.io/.

## GitHub

Raise issues at https://github.com/chigwell/tech-summary.

## Author

Eugene Evstafev
hi@eugene.plus

## Changelog

This package is under development. See GitHub for updates.

## Acknowledgments

This package uses ChatLLM7 (https://pypi.org/project/langchain-llm7/) by default.