Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/tzolov/spring-ai-function-calling-portability
Demonstrate Function Calling code portability across 4 AI Models: OpenAI, AzureOpenAI, VertexAI Gemini and Mistral AI.
- Host: GitHub
- URL: https://github.com/tzolov/spring-ai-function-calling-portability
- Owner: tzolov
- License: apache-2.0
- Created: 2024-03-04T14:17:57.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-06-07T12:13:47.000Z (5 months ago)
- Last Synced: 2024-06-07T13:37:40.072Z (5 months ago)
- Language: Java
- Homepage: https://docs.spring.io/spring-ai/reference/index.html
- Size: 80.1 KB
- Stars: 7
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
Awesome Lists containing this project
README
# Spring AI Function Calling Portability
Demonstrate `Function Calling` code portability across 4 AI Models: OpenAI, AzureOpenAI, VertexAI Gemini and Mistral AI.
Use Case: Suppose we want the AI model to respond with information that it does not have.
For example, the status of your recent payment transactions.
Users can ask questions about the current status of certain payment transactions, and the model can use function calling to answer them. For example, let's consider a sample dataset and a function that retrieves the payment status for a given transaction:
```java
// Sample domain types used by the function.
record Transaction(String id) {
}

record Status(String name) {
}

// In-memory dataset mapping transactions to their payment status.
private static final Map<Transaction, Status> DATASET =
        Map.of(
                new Transaction("001"), new Status("pending"),
                new Transaction("002"), new Status("approved"),
                new Transaction("003"), new Status("rejected"));

// The function exposed to the AI models: given a transaction, return its status.
@Bean
@Description("Get the status of a payment transaction")
public Function<Transaction, Status> paymentStatus() {
    return transaction -> DATASET.get(transaction);
}
```

The function is registered as a `@Bean` and uses the `@Description` annotation to define the function description.
Spring AI greatly simplifies the code you need to write to support function invocation: it brokers the function-invocation conversation for you.
You simply provide your function definition as a `@Bean` and then reference the bean name of the function in your prompt options or configuration.
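As an alternative to enabling the function globally in `application.properties` (shown further below), the bean name can also be passed per request through the chat options. This is only a minimal sketch, assuming Spring AI's `Prompt` and `OpenAiChatOptions` APIs and an injected `OpenAiChatModel` named `openAi`; the other models have analogous options classes:

```java
// Minimal sketch (assumes Spring AI's Prompt and OpenAiChatOptions APIs):
// enable the paymentStatus function for this single request only.
ChatResponse response = openAi.call(
        new Prompt(
                "What is the status of my payment transaction 003?",
                OpenAiChatOptions.builder()
                        .withFunction("paymentStatus")
                        .build()));

System.out.println(response.getResult().getOutput().getContent());
```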
Now let's add the boot starters for the AI models that support function calling:
```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-mistral-ai-spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-vertex-ai-gemini-spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-azure-openai-spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-anthropic-spring-boot-starter</artifactId>
</dependency>
```
and configure them in `application.properties`:
```
# MistralAI
spring.ai.mistralai.api-key=${MISTRAL_AI_API_KEY}
spring.ai.mistralai.chat.options.model=mistral-small-latest
spring.ai.mistralai.chat.options.functions=paymentStatus

# OpenAI
spring.ai.openai.api-key=${OPENAI_API_KEY}
spring.ai.openai.chat.options.functions=paymentStatus

# Google VertexAI Gemini
spring.ai.vertex.ai.gemini.project-id=${VERTEX_AI_GEMINI_PROJECT_ID}
spring.ai.vertex.ai.gemini.location=${VERTEX_AI_GEMINI_LOCATION}
spring.ai.vertex.ai.gemini.chat.options.model=gemini-pro
spring.ai.vertex.ai.gemini.chat.options.functions=paymentStatus

# Microsoft Azure OpenAI
spring.ai.azure.openai.api-key=${AZURE_OPENAI_API_KEY}
spring.ai.azure.openai.endpoint=${AZURE_OPENAI_ENDPOINT}
# This name is actually the model deployment name in the Azure OpenAI platform.
spring.ai.azure.openai.chat.options.model=gpt-4-0125-preview
spring.ai.azure.openai.chat.options.functions=paymentStatus
```

Now you can test them with the same prompt:
```java
@Bean
ApplicationRunner applicationRunner(
        MistralAiChatModel mistralAi,
        VertexAiGeminiChatModel vertexAiGemini,
        OpenAiChatModel openAi,
        AzureOpenAiChatModel azureOpenAi,
        AnthropicChatModel anthropicChatClient) {

    return args -> {
        String prompt = "What is the status of my payment transaction 003?";

        System.out.println("MISTRAL_AI: " + mistralAi.call(prompt));
        System.out.println("VERTEX_AI_GEMINI: " + vertexAiGemini.call(prompt));
        System.out.println("OPEN_AI: " + openAi.call(prompt));
        System.out.println("AZURE_OPEN_AI: " + azureOpenAi.call(prompt));
        System.out.println("ANTHROPIC: " + anthropicChatClient.call(prompt));
    };
}
```

The output would look something like:
```
MISTRAL_AI: The status of your payment transaction 003 is rejected.
VERTEX_AI_GEMINI: Your transaction has been rejected.
OPEN_AI: The status of your payment transaction 003 is rejected.
AZURE_OPEN_AI: The status of your payment transaction 003 is "rejected".
```

If you change the question slightly, you can see the limitations of some models.
For example, let's ask about multiple transactions (i.e. trigger parallel function calling):
`String prompt = "What is the status of my payment transactions 003 and 001?";`

Then the result would look something like:

```
MISTRAL_AI: To check the status of multiple payment transactions, I would need to call the "paymentStatus" function for each transaction ID separately as the function currently only accepts one transaction ID at a time. Here are the requests:

1. For transaction ID 003:
[{"name": "paymentStatus", "arguments": {"id": "003"}}]

2. For transaction ID 001:
[{"name": "paymentStatus", "arguments": {"id": "001"}}]

Please send these requests one by one to get the status of your transactions.

VERTEX_AI_GEMINI: OK. The status of the payment transaction with the ID `003` is `rejected` and the status of the payment transaction with the ID `001` is `pending`.

OPEN_AI: The status of payment transaction 003 is rejected and the status of payment transaction 001 is pending.

AZURE_OPEN_AI: The status of your payment transactions is as follows:
- Transaction 003: Rejected
- Transaction 001: Pending
```

As you can see, Mistral AI currently doesn't support parallel function calling.
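If you need multi-transaction answers from a model without parallel function calling, one possible workaround is to expose a single function that accepts a batch of transaction ids, so one function call can cover the whole question. The sketch below is hypothetical and not part of this repository; the `Transactions` and `Statuses` records and the `paymentStatuses` bean are illustrative names:

```java
// Hypothetical workaround (not part of this repo): one function call that
// resolves several transaction ids at once, avoiding parallel function calls.
record Transactions(List<String> ids) {
}

record Statuses(List<Status> statuses) {
}

@Bean
@Description("Get the status of one or more payment transactions")
public Function<Transactions, Statuses> paymentStatuses() {
    return request -> new Statuses(request.ids().stream()
            .map(id -> DATASET.get(new Transaction(id)))
            .toList());
}
```

With such a function registered instead, a single tool call can serve the "003 and 001" prompt even on models that only emit one function call per turn.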
## Related [Spring AI](https://docs.spring.io/spring-ai/reference) documentation:
* [Spring AI OpenAI](https://docs.spring.io/spring-ai/reference/api/chat/openai-chat.html) and [Function Calling](https://docs.spring.io/spring-ai/reference/api/chat/functions/openai-chat-functions.html)
* [Spring AI Azure OpenAI](https://docs.spring.io/spring-ai/reference/api/chat/azure-openai-chat.html) and [Function Calling](https://docs.spring.io/spring-ai/reference/api/chat/functions/azure-open-ai-chat-functions.html)
* [Spring AI Google VertexAI Gemini](https://docs.spring.io/spring-ai/reference/api/chat/vertexai-gemini-chat.html) and [Function Calling](https://docs.spring.io/spring-ai/reference/api/chat/functions/vertexai-gemini-chat-functions.html)
* [Spring AI Mistral AI](https://docs.spring.io/spring-ai/reference/api/chat/mistralai-chat.html) and [Function Calling](https://docs.spring.io/spring-ai/reference/api/chat/functions/mistralai-chat-functions.html)
* [Spring AI Anthropic AI](https://docs.spring.io/spring-ai/reference/api/chat/anthropic-chat.html) and [Function Calling](https://docs.spring.io/spring-ai/reference/api/chat/functions/anthropic-chat-functions.html)
## Native (GraalVM) Build
You can build this as a native executable.
First make sure you are using a GraalVM 21 JDK. For example:
```
export JAVA_HOME=/Library/Java/JavaVirtualMachines/graalvm-jdk-21.0.2+13.1/Contents/Home
```

Then build:
```
./mvnw clean install -Pnative native:compile
```

Run the native executable:
```
./target/function-calling-portability
```