https://github.com/agents-flex/agents-flex
Agents-Flex is an elegant LLM Application Framework like LangChain with Java.
- Host: GitHub
- URL: https://github.com/agents-flex/agents-flex
- Owner: agents-flex
- License: apache-2.0
- Created: 2024-01-12T08:23:40.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-04-11T10:26:23.000Z (6 months ago)
- Last Synced: 2025-04-11T11:58:00.551Z (6 months ago)
- Topics: agent, ai, chatbot, chatgpt, gpt, langchain4j, llama3, llm, ollama, spring-ai
- Language: Java
- Homepage: https://agentsflex.com
- Size: 6.85 MB
- Stars: 376
- Watchers: 11
- Forks: 69
- Open Issues: 6
Metadata Files:
- Readme: readme.md
- Changelog: changes.md
- License: LICENSE
Awesome Lists containing this project
- awesome-ChatGPT-repositories - agents-flex - Agents-Flex is an elegant LLM Application Framework like LangChain with Java. (Chatbots)
README
English | 简体中文 | 日本語
# Agents-Flex is an elegant LLM Application Framework like LangChain, based on Java.
---
## Features
- LLM Visit
- Prompt, Prompt Template
- Function Calling Definer, Invoker, Running
- Memory
- Embedding
- Vector Store
- Resource Loaders
- Document
  - Splitter
  - Loader
  - Parser
    - PoiParser
    - PdfBoxParser
- Agent
  - LLM Agent
- Chain
  - SequentialChain
  - ParallelChain
  - LoopChain
  - ChainNode
    - AgentNode
    - EndNode
    - RouterNode
      - GroovyRouterNode
      - QLExpressRouterNode
      - LLMRouterNode

## Simple Chat
Use the OpenAI LLM:
```java
@Test
public void testChat() {
    OpenAILlmConfig config = new OpenAILlmConfig();
    config.setApiKey("sk-rts5NF6n*******");

    Llm llm = new OpenAILlm(config);
    String response = llm.chat("what is your name?");
    System.out.println(response);
}
```

Use the Qwen LLM:
```java
@Test
public void testChat() {
    QwenLlmConfig config = new QwenLlmConfig();
    config.setApiKey("sk-28a6be3236****");
    config.setModel("qwen-turbo");

    Llm llm = new QwenLlm(config);
    String response = llm.chat("what is your name?");
    System.out.println(response);
}
```

Use the SparkAi LLM:
```java
@Test
public void testChat() {
    SparkLlmConfig config = new SparkLlmConfig();
    config.setAppId("****");
    config.setApiKey("****");
    config.setApiSecret("****");

    Llm llm = new SparkLlm(config);
    String response = llm.chat("what is your name?");
    System.out.println(response);
}
```

## Chat With Histories
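The key object in the example below is `HistoriesPrompt`: it buffers every message so that each request carries the whole conversation, which is what gives the chat its memory. Purely to illustrate that idea (this is not the library's implementation; the class and method names here are hypothetical), the buffering behaves like:

```java
import java.util.ArrayList;
import java.util.List;

public class HistoryBufferDemo {
    // Hypothetical stand-in for HistoriesPrompt: a growing message list
    // that is resent in full with every request.
    static List<String> history = new ArrayList<>();

    static String buildRequest(String userInput) {
        history.add("user: " + userInput);
        return String.join("\n", history);
    }

    public static void main(String[] args) {
        System.out.println(buildRequest("My name is Alice."));
        System.out.println("---");
        // The second request still contains the first turn, so the
        // model can answer follow-up questions from context.
        System.out.println(buildRequest("What is my name?"));
    }
}
```

In the real API, the same effect comes from calling `prompt.addMessage(...)` on one long-lived `HistoriesPrompt` instance, as the interactive loop below does.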
```java
public static void main(String[] args) {
    SparkLlmConfig config = new SparkLlmConfig();
    config.setAppId("****");
    config.setApiKey("****");
    config.setApiSecret("****");

    Llm llm = new SparkLlm(config);

    // HistoriesPrompt accumulates every message, so each request
    // carries the full conversation history.
    HistoriesPrompt prompt = new HistoriesPrompt();

    System.out.println("ask for something...");
    Scanner scanner = new Scanner(System.in);
    String userInput = scanner.nextLine();

    while (userInput != null) {
        prompt.addMessage(new HumanMessage(userInput));
        llm.chatStream(prompt, (context, response) ->
                System.out.println(">>>> " + response.getMessage().getContent()));
        userInput = scanner.nextLine();
    }
}
```

## Function Calling
- Step 1: define a native function
```java
public class WeatherUtil {

    @FunctionDef(name = "get_the_weather_info", description = "get the weather info")
    public static String getWeatherInfo(
            @FunctionParam(name = "city", description = "the city name") String name
    ) {
        // In a real application, invoke a third-party weather API here.
        return "Today it will be dull and overcast in " + name;
    }
}
```
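Since the annotated method is ordinary Java, it can be sanity-checked with a direct call before handing it to an LLM. A standalone sketch (the class name `WeatherUtilCheck` is hypothetical; the annotations are omitted because they only supply metadata to the model and do not change the method's behavior):

```java
public class WeatherUtilCheck {
    // Same body as WeatherUtil.getWeatherInfo above, minus the
    // @FunctionDef/@FunctionParam annotations.
    static String getWeatherInfo(String name) {
        return "Today it will be dull and overcast in " + name;
    }

    public static void main(String[] args) {
        // Direct call, no LLM involved.
        System.out.println(getWeatherInfo("Beijing"));
        // Today it will be dull and overcast in Beijing
    }
}
```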
- Step 2: invoke the function through the LLM
```java
public static void main(String[] args) {
    OpenAILlmConfig config = new OpenAILlmConfig();
    config.setApiKey("sk-rts5NF6n*******");

    OpenAILlm llm = new OpenAILlm(config);
    FunctionPrompt prompt = new FunctionPrompt("How is the weather in Beijing today?", WeatherUtil.class);

    // The LLM selects the matching function, Agents-Flex invokes it,
    // and the response carries the function's return value.
    FunctionResultResponse response = llm.chat(prompt);
    Object result = response.getFunctionResult();
    System.out.println(result);
    // Today it will be dull and overcast in Beijing
}
```

## Communication
- Twitter: https://twitter.com/yangfuhai
## Modules
