1. Introduction
LangChain4j's goal is to simplify integrating large language models (LLMs) into Java applications.
This article walks through integrating and applying LangChain4j in a Java project. It covers getting started, Spring Boot integration, the low-level and high-level APIs, model configuration (logging, monitoring, retries, timeouts), multimodal vision, streaming output, chat memory (with Redis persistence), prompt engineering, tool calling (function calling), and vector stores with RAG (retrieval-augmented generation). Concrete code examples help developers quickly master the core skills for building LLM applications with LangChain4j.


LangChain4j integrates with many model providers (see the official list of supported models). The examples in this article use Alibaba's Qwen (Tongyi Qianwen) and DeepSeek models.

The default chat model is qwen-plus.



<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>pom</packaging>
<name>langchain4j-study parent</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>21</java.version>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<!-- Spring Boot -->
<spring-boot.version>3.5.0</spring-boot.version>
<!-- Spring AI -->
<spring-ai.version>1.0.0</spring-ai.version>
<!-- Spring AI Alibaba -->
<spring-ai-alibaba.version>1.0.0-M6.1</spring-ai-alibaba.version>
<!-- langchain4j -->
<langchain4j.version>1.0.1</langchain4j.version>
<!-- langchain4j-community: version of the Alibaba Cloud Bailian (DashScope) BOM -->
<langchain4j-community.version>1.0.1-beta6</langchain4j-community.version>
<!-- maven plugin -->
<maven-deploy-plugin.version>3.1.1</maven-deploy-plugin.version>
<flatten-maven-plugin.version>1.3.0</flatten-maven-plugin.version>
<maven-compiler-plugin.version>3.8.1</maven-compiler-plugin.version>
</properties>
<dependencyManagement>
<dependencies>
<!-- Spring Boot -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>${spring-boot.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<!-- Spring AI -->
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-bom</artifactId>
<version>${spring-ai.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<!-- Spring AI Alibaba -->
<dependency>
<groupId>com.alibaba.cloud.ai</groupId>
<artifactId>spring-ai-alibaba-starter</artifactId>
<version>${spring-ai-alibaba.version}</version>
</dependency>
<!-- langchain4j BOM: importing it lets every langchain4j artifact version be managed in one place. https://docs.langchain4j.dev/get-started -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-bom</artifactId>
<version>${langchain4j.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<!-- BOM for the Alibaba Cloud Bailian (DashScope) community integrations. https://docs.langchain4j.dev/integrations/language-models/dashscope -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-community-bom</artifactId>
<version>${langchain4j-community.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>${spring-boot.version}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-deploy-plugin</artifactId>
<version>${maven-deploy-plugin.version}</version>
<configuration>
<skip>true</skip>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven-compiler-plugin.version}</version>
<configuration>
<release>${java.version}</release>
<compilerArgs>
<compilerArg>-parameters</compilerArg>
</compilerArgs>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>flatten-maven-plugin</artifactId>
<version>${flatten-maven-plugin.version}</version>
<inherited>true</inherited>
<executions>
<execution>
<id>flatten</id>
<phase>process-resources</phase>
<goals>
<goal>flatten</goal>
</goals>
<configuration>
<updatePomFile>true</updatePomFile>
<flattenMode>ossrh</flattenMode>
<pomElements>
<distributionManagement>remove</distributionManagement>
<dependencyManagement>remove</dependencyManagement>
<repositories>remove</repositories>
<scm>keep</scm>
<url>keep</url>
<organization>resolve</organization>
</pomElements>
</configuration>
</execution>
<execution>
<id>flatten.clean</id>
<phase>clean</phase>
<goals>
<goal>clean</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<repositories>
<repository>
<id>spring-milestones</id>
<name>Spring Milestones</name>
<url>https://repo.spring.io/milestone</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
<repository>
<id>spring-snapshots</id>
<name>Spring Snapshots</name>
<url>https://repo.spring.io/snapshot</url>
<releases>
<enabled>false</enabled>
</releases>
</repository>
<repository>
<id>aliyunmaven</id>
<name>aliyun</name>
<url>https://maven.aliyun.com/repository/public</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>public</id>
<name>aliyun nexus</name>
<url>https://maven.aliyun.com/repository/public</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</pluginRepository>
</pluginRepositories>
</project>

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langcChain4J-01study</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- All calls follow the OpenAI API standard, giving a consistent interface. LangChain4j integrates with many LLM providers; the simplest way to start is the OpenAI integration. https://docs.langchain4j.dev/get-started -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<!-- langchain4j high-level API -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--test-->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8001
spring.application.name=langchain4j-01study
@Configuration
public class LLMConfig {
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.modelName("qwen-plus")
.build();
}
}
@RestController
public class Study01Controller {
@Resource
private ChatModel chatModel;
@GetMapping("/chat")
public String chat(@RequestParam(name = "msg", defaultValue = "你是谁") String msg) {
return chatModel.chat(msg);
}
}





@Configuration
public class LLMConfig {
@Bean("qwen")
public ChatModel qwenChatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.modelName("qwen-plus")
.build();
}
@Bean("deepseek")
public ChatModel deepseekChatModel() {
return OpenAiChatModel.builder()
.apiKey("sk-4af********")
.baseUrl("https://api.deepseek.com/v1")
.modelName("deepseek-chat")
.build();
}
}
@RestController
public class Study01Controller {
@Resource(name = "qwen")
private ChatModel qwenChatModel;
@Resource(name = "deepseek")
private ChatModel deepseekChatModel;
@GetMapping("/chat/v1")
public String chatV1(@RequestParam(name = "msg", defaultValue = "你是谁") String msg) {
return qwenChatModel.chat(msg);
}
@GetMapping("/chat/v2")
public String chatV2(@RequestParam(name = "msg", defaultValue = "你是谁") String msg) {
return deepseekChatModel.chat(msg);
}
}




<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-02boot</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- langchain4j native: low-level -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<!-- langchain4j native: high-level -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<!-- 1. LangChain4j Spring Boot integration: low-level support -->
<!-- https://docs.langchain4j.dev/tutorials/spring-boot-integration -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai-spring-boot-starter</artifactId>
</dependency>
<!-- 2. LangChain4j Spring Boot integration: high-level support -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-spring-boot-starter</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--test-->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8002
spring.application.name=langchain4j-02boot
# https://docs.langchain4j.dev/tutorials/spring-boot-integration
langchain4j.open-ai.chat-model.api-key=${aliQwen-api}
langchain4j.open-ai.chat-model.model-name=qwen-plus
langchain4j.open-ai.chat-model.base-url=https://dashscope.aliyuncs.com/compatible-mode/v1
@RestController
public class PopularIntegrationsController {
@Resource
private ChatModel chatModel;
@GetMapping("/chatv1")
public String chat(String msg) {
return chatModel.chat(msg);
}
}
@AiService
public interface ChatAssistant {
String chat(String prompt);
}
@RestController
public class DeclarativeAIServiceController {
@Resource
private ChatAssistant chatAssistant;
@GetMapping("/chatv2")
public String chat(String msg) {
return chatAssistant.chat(msg);
}
}


The low-level API offers the most flexibility: you work directly with the underlying building blocks such as ChatModel, UserMessage, AiMessage, EmbeddingStore, and Embedding. These are the "primitives" of an LLM-powered application; you have full control over how to combine them, but you also write more code.
With the high-level API, you define your own interface and let AiServices implement it. It hides more of the plumbing and reduces code complexity, while still allowing fine-tuning where needed.
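The two styles can be sketched side by side. This is a minimal sketch against the LangChain4j 1.x API; the Translator interface name is made up for illustration, and the ChatModel is assumed to be configured as in the beans below:

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.service.AiServices;

public class LowVsHighDemo {

    // High-level style: you declare the interface, AiServices generates the implementation
    interface Translator {
        String translate(String text);
    }

    static void demo(ChatModel chatModel) {
        // Low-level style: build the message and unwrap the response yourself
        AiMessage answer = chatModel
                .chat(UserMessage.from("Translate 'hello' into French"))
                .aiMessage();
        System.out.println(answer.text());

        // High-level style: the message plumbing is hidden behind your own interface
        Translator translator = AiServices.create(Translator.class, chatModel);
        System.out.println(translator.translate("Translate 'hello' into French"));
    }
}
```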

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-03lowhigh</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- langchain4j-open-ai: low-level -->
<!-- All calls follow the OpenAI API standard, giving a consistent interface. https://docs.langchain4j.dev/get-started -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<!-- langchain4j high-level API -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--test-->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8003
spring.application.name=langChain4j-03lowhigh
@Configuration
public class LLMConfig {
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
}
@RestController
public class LowApiController {
@Resource
private ChatModel chatModel;
@GetMapping("/chat1")
public String chat1(@RequestParam(name = "msg", defaultValue = "你是谁") String msg) {
return chatModel.chat(msg);
}
}

@RestController
public class LowApiController {
@Resource
private ChatModel chatModel;
@GetMapping("/chat1")
public String chat1(@RequestParam(name = "msg", defaultValue = "你是谁") String msg) {
return chatModel.chat(msg);
}
@GetMapping("/chat2")
public String chat2(@RequestParam(name = "msg", defaultValue = "你是谁") String msg) {
ChatResponse chatResponse = chatModel.chat(UserMessage.from(msg));
String text = chatResponse.aiMessage().text();
TokenUsage tokenUsage = chatResponse.tokenUsage();
System.out.println("Token usage for this call: " + tokenUsage);
return text;
}
}

public interface ChatAssistant {
String chat(String prompt);
}
Wiring the AiService implementation:
package com.hk.study.config;
import com.hk.study.service.ChatAssistant;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class LLMConfig {
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
@Bean
public ChatAssistant chatAssistant(ChatModel chatModel) {
return AiServices.create(ChatAssistant.class, chatModel);
}
}
@RestController
public class HighApiController {
@Resource
private ChatAssistant chatAssistant;
@GetMapping("chatv2")
public String chatv2(String msg) {
return chatAssistant.chat(msg);
}
}

Building on module 03, configure:
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
// Enable request/response logging; the log level must be DEBUG for entries to appear
.logRequests(true)
.logResponses(true)
.build();
}
server.port=8003
spring.application.name=langChain4j-03lowhigh
# Enable langchain4j logging; entries are only printed at DEBUG level
logging.level.dev.langchain4j=DEBUG

public class TestChatModelListener implements ChatModelListener {
@Override
public void onRequest(ChatModelRequestContext requestContext) {
// Key-value pairs put into attributes() here are available again in onResponse,
// which makes them handy for passing context between the two phases
String uuidValue = IdUtil.simpleUUID();
requestContext.attributes().put("TraceID", uuidValue);
System.out.println("Set request attribute: TraceID == " + uuidValue);
}
@Override
public void onResponse(ChatModelResponseContext responseContext) {
Object object = responseContext.attributes().get("TraceID");
System.out.println("Read request attribute: TraceID == " + object);
}
@Override
public void onError(ChatModelErrorContext errorContext) {
System.out.println("An error occurred...");
}
}
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.listeners(List.of(new TestChatModelListener()))
.build();
}

@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.logRequests(true)
.logResponses(true)
.maxRetries(2)
.build();
}
Disconnect the network and call the endpoint again to watch the retry attempts.


@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.logRequests(true)
.maxRetries(2)
.timeout(Duration.ofSeconds(1))
.build();
}

A UserMessage can contain more than just text: it holds a List<Content> contents. Content is an interface with the following implementations: TextContent, ImageContent, AudioContent, VideoContent, and PdfFileContent.

The model used in this section is qwen-vl-max.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-04chatimage</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- langchain4j-open-ai: low-level -->
<!-- All calls follow the OpenAI API standard, giving a consistent interface. https://docs.langchain4j.dev/get-started -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<!-- langchain4j high-level API -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--hutool-->
<dependency>
<groupId>cn.hutool</groupId>
<artifactId>hutool-all</artifactId>
<version>5.8.22</version>
</dependency>
<!--test-->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8004
spring.application.name=langchain4j-04chatimage
@Configuration
public class LLMConfig {
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-vl-max")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
}
@RestController
public class ImageModelController {
@Resource
private ChatModel chatModel;
@Value("classpath:mi.jpg")
private org.springframework.core.io.Resource imageResource;
@GetMapping("/image")
public String image() throws IOException {
// 1. Base64-encode the image bytes into a string
byte[] imageBytes = imageResource.getContentAsByteArray();
String encodeToString = Base64.getEncoder().encodeToString(imageBytes);
// 2. Build the userMessage prompt
UserMessage userMessage = UserMessage.from(
TextContent.from("从以下图片中获取来源网站名称,股价走势和 5 月 30 号股价"),
ImageContent.from(encodeToString, "image/jpeg")
);
// 3. Call the model and get the answer
ChatResponse chatResponse = chatModel.chat(userMessage);
String response = chatResponse.aiMessage().text();
return response;
}
}

This section uses Alibaba's Tongyi Wanxiang text-to-image model, wanx2.1-t2i-turbo; everything below follows the official documentation.
<!-- langchain4j-community: version of the Alibaba Cloud Bailian (DashScope) BOM -->
<langchain4j-community.version>1.0.1-beta6</langchain4j-community.version>
<!-- Import the Alibaba Cloud Bailian (DashScope) dependency BOM
https://docs.langchain4j.dev/integrations/language-models/dashscope
-->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-community-bom</artifactId>
<version>${langchain4j-community.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-05chatimage</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- langchain4j-open-ai: low-level -->
<!-- All calls follow the OpenAI API standard, giving a consistent interface. https://docs.langchain4j.dev/get-started -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<!-- langchain4j high-level API -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<!-- DashScope (Qwen): Alibaba Cloud Bailian platform integration. https://docs.langchain4j.dev/integrations/language-models/dashscope -->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-community-dashscope-spring-boot-starter</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--hutool-->
<dependency>
<groupId>cn.hutool</groupId>
<artifactId>hutool-all</artifactId>
<version>5.8.22</version>
</dependency>
<!--test-->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8005
spring.application.name=langchain4j-05chatimage
@Configuration
public class LLMConfig {
@Bean
public WanxImageModel wanxImageModel() {
return WanxImageModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("wanx2.1-t2i-turbo")
.build();
}
}
@RestController
public class ImageModelController {
@Resource
private WanxImageModel wanxImageModel;
@GetMapping("/chat/image-1")
public String image1() {
Response<Image> generate = wanxImageModel.generate("一张图,描述:一个人在沙滩上");
return generate.content().url().toString();
}
@GetMapping("/chat/image-2")
public String image2() throws NoApiKeyException {
String prompt = "近景镜头,18 岁的中国女孩,古代服饰,圆脸,正面看着镜头," +
"民族优雅的服装,商业摄影,室外,电影级光照,半身特写,精致的淡妆,锐利的边缘。";
ImageSynthesisParam build = ImageSynthesisParam.builder()
.apiKey(System.getenv("aliQwen-api"))
.model("wanx2.1-t2i-turbo")
.prompt(prompt)
.n(2)
.size("1024*1024")
.build();
ImageSynthesis imageSynthesis = new ImageSynthesis();
ImageSynthesisResult result = imageSynthesis.call(build);
return JsonUtils.toJson(result);
}
}




Streaming output returns a model's response incrementally: the server pushes partial chunks to the client as they are generated, instead of waiting for the whole completion and returning it at once. This markedly improves the user experience, especially when responses are slow to produce (long texts, complex reasoning).

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-06chatstream</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!--langchain4j-open-ai + langchain4j + langchain4j-reactor-->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-reactor</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--hutool-->
<dependency>
<groupId>cn.hutool</groupId>
<artifactId>hutool-all</artifactId>
<version>5.8.22</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8006
spring.application.name=langchain4j-06chat-stream
# Set the response charset to avoid garbled characters in the streamed output
server.servlet.encoding.charset=utf-8
server.servlet.encoding.enabled=true
server.servlet.encoding.force=true
public interface ChatAssistant {
Flux<String> chat(String prompt);
}
@Configuration
public class LLMConfig {
@Bean
public StreamingChatModel streamingChatModel() {
return OpenAiStreamingChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
@Bean
public ChatAssistant chatAssistant(StreamingChatModel streamingChatModel) {
return AiServices.create(ChatAssistant.class, streamingChatModel);
}
}
@RestController
public class StreamingChatModelController {
@Resource
private StreamingChatModel streamingChatModel;
@Resource
private ChatAssistant chatAssistant;
// Option 1: build the Flux manually with Flux.create
@GetMapping("/chat1")
public Flux<String> chat1(String msg) {
return Flux.create(fluxSink -> {
streamingChatModel.chat(msg, new StreamingChatResponseHandler() {
@Override
public void onPartialResponse(String s) {
System.out.println(s);
fluxSink.next(s);
}
@Override
public void onCompleteResponse(ChatResponse chatResponse) {
System.out.println("---response over: " + chatResponse);
fluxSink.complete();
}
@Override
public void onError(Throwable throwable) {
fluxSink.error(throwable);
}
});
});
}
// Option 2: declarative AI service that returns Flux<String>
@GetMapping("/chat2")
public Flux<String> chat2(String msg) {
return chatAssistant.chat(msg);
}
}


Chat memory is a key component of a chat system: it stores and manages conversational context so the AI assistant can "remember" earlier turns and produce coherent, personalized replies.
LangChain4j ships two out-of-the-box implementations: MessageWindowChatMemory, which keeps a sliding window of the last N messages, and TokenWindowChatMemory, which keeps a sliding window bounded by a token budget.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-07chatmemory</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!--langchain4j-open-ai + langchain4j + langchain4j-reactor-->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-reactor</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--hutool-->
<dependency>
<groupId>cn.hutool</groupId>
<artifactId>hutool-all</artifactId>
<version>5.8.22</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8007
spring.application.name=langchain4j-07chat-memory
# Set the response charset
server.servlet.encoding.charset=utf-8
server.servlet.encoding.enabled=true
server.servlet.encoding.force=true
public interface ChatMemoryAssistant {
/**
 * Chat with memory support.
 *
 * @param userId user ID (used as the memory id)
 * @param prompt message
 * @return String
 */
String chatWithChatMemory(@MemoryId Long userId, @UserMessage String prompt);
}
@Configuration
public class LLMConfig {
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-long")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
@Bean("chatMessageWindowChatMemory")
public ChatMemoryAssistant chatMessageWindowChatMemory(ChatModel chatModel) {
return AiServices.builder(ChatMemoryAssistant.class)
.chatModel(chatModel)
// One ChatMemory instance is created per memoryId
.chatMemoryProvider(memoryId -> MessageWindowChatMemory.withMaxMessages(20))
.build();
}
@Bean
public ChatMemoryAssistant chatTokenWindowChatMemory(ChatModel chatModel) {
// OpenAiTokenCountEstimator is a TokenCountEstimator (tokenizer) used to count the tokens of each ChatMessage
OpenAiTokenCountEstimator openAiTokenCountEstimator = new OpenAiTokenCountEstimator("gpt-4");
return AiServices.builder(ChatMemoryAssistant.class)
.chatModel(chatModel)
// One ChatMemory instance is created per memoryId
.chatMemoryProvider(memoryId -> TokenWindowChatMemory.withMaxTokens(100, openAiTokenCountEstimator))
.build();
}
}
@RestController
public class ChatMemoryController {
@Resource(name = "chatMessageWindowChatMemory")
private ChatMemoryAssistant chatMessageWindowChatMemory;
@Resource(name = "chatTokenWindowChatMemory")
private ChatMemoryAssistant chatTokenWindowChatMemory;
// Chat via MessageWindowChatMemory
@GetMapping("/chatv1")
public void chatv1() {
chatMessageWindowChatMemory.chatWithChatMemory(1L, "你好!我的名字是 Java.");
String answer01 = chatMessageWindowChatMemory.chatWithChatMemory(1L, "我的名字是什么");
System.out.println("answer01: " + answer01);
chatMessageWindowChatMemory.chatWithChatMemory(3L, "你好!我的名字是 C++");
String answer02 = chatMessageWindowChatMemory.chatWithChatMemory(3L, "我的名字是什么");
System.out.println("answer02: " + answer02);
}
// Chat via TokenWindowChatMemory
@GetMapping("/chatv2")
public void chatv2() {
chatTokenWindowChatMemory.chatWithChatMemory(1L, "你好!我的名字是 Java.");
String answer01 = chatTokenWindowChatMemory.chatWithChatMemory(1L, "我的名字是什么");
System.out.println("answer01: " + answer01);
chatTokenWindowChatMemory.chatWithChatMemory(3L, "你好!我的名字是 C++");
String answer02 = chatTokenWindowChatMemory.chatWithChatMemory(3L, "我的名字是什么");
System.out.println("answer02: " + answer02);
}
}


There are five types of chat message:
SystemMessage: instructions from the developer that steer the model's behavior.
UserMessage: a message from the user; besides text it can carry other Content types.
AiMessage: a message generated by the model, either text or a request to execute a tool.
ToolExecutionResultMessage: the result of a tool execution requested by the model.
CustomMessage: a custom message that can carry arbitrary attributes; only usable with providers that support it (currently only Ollama).
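A minimal sketch constructing these message types with the factory methods from dev.langchain4j.data.message (no model call involved; the tool id, tool name, and result values are made up for illustration):

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.ToolExecutionResultMessage;
import dev.langchain4j.data.message.UserMessage;

public class MessageTypesDemo {
    public static void main(String[] args) {
        // Developer instructions that steer the model
        SystemMessage system = SystemMessage.from("You are a helpful assistant.");
        // A plain-text message from the user
        UserMessage user = UserMessage.from("What is LangChain4j?");
        // A model-generated answer
        AiMessage ai = AiMessage.from("LangChain4j integrates LLMs into Java applications.");
        // Result of a tool the model asked to execute: (id, tool name, result text)
        ToolExecutionResultMessage toolResult =
                ToolExecutionResultMessage.from("call-1", "getWeather", "22°C, sunny");

        System.out.println(system.text());
        System.out.println(user.singleText());
        System.out.println(ai.text());
        System.out.println(toolResult.text());
    }
}
```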
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-08chatprompt</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!--langchain4j-open-ai + langchain4j + langchain4j-reactor-->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-reactor</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--hutool-->
<dependency>
<groupId>cn.hutool</groupId>
<artifactId>hutool-all</artifactId>
<version>5.8.22</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8008
spring.application.name=langchain4j-08chat-prompt
# Response character encoding
server.servlet.encoding.charset=utf-8
server.servlet.encoding.enabled=true
server.servlet.encoding.force=true
@SystemMessage + @UserMessage + @V
public interface LawAssistant {
@SystemMessage("你是一位专业的中国法律顾问,只回答与中国法律相关的问题。" +
"输出限制:对于其他领域的问题禁止回答,直接返回'抱歉,我只能回答中国法律相关的问题。'")
@UserMessage("请回答以下法律问题:{{question}},字数控制在{{length}}以内")
String chat(@V("question") String question, @V("length") int length);
}
@Configuration
public class LLMConfig {
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-long")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
@Bean
public LawAssistant lawAssistant(ChatModel chatModel) {
return AiServices.create(LawAssistant.class, chatModel);
}
}
@RestController
public class ChatPromptController {
@Resource
private LawAssistant lawAssistant;
@GetMapping(value = "/chatprompt/test1")
public void test1() {
String chat = lawAssistant.chat("什么是知识产权?", 2000);
System.out.println(chat);
System.out.println("========================================================");
String chat2 = lawAssistant.chat("什么是 java?", 2000);
System.out.println(chat2);
System.out.println("========================================================");
String chat3 = lawAssistant.chat("介绍下西瓜和芒果", 2000);
System.out.println(chat3);
System.out.println("========================================================");
String chat4 = lawAssistant.chat("飞机发动机原理", 2000);
System.out.println(chat4);
}
}


Using @StructuredPrompt on a business entity class
@Data
@StructuredPrompt("根据中国{{legal}}法律,解答以下问题:{{question}}")
public class LawPrompt {
private String legal;
private String question;
}
public interface LawAssistant {
@SystemMessage("你是一位专业的中国法律顾问,只回答与中国法律相关的问题。" +
"输出限制:对于其他领域的问题禁止回答,直接返回'抱歉,我只能回答中国法律相关的问题。'")
@UserMessage("请回答以下法律问题:{{question}},字数控制在{{length}}以内")
String chat(@V("question") String question, @V("length") int length);
@SystemMessage("你是一位专业的中国法律顾问,只回答与中国法律相关的问题。" +
"输出限制:对于其他领域的问题禁止回答,直接返回'抱歉,我只能回答中国法律相关的问题。'")
String chat(LawPrompt lawPrompt);
}
@RestController
public class ChatPromptController {
@Resource
private LawAssistant lawAssistant;
......
@GetMapping(value = "/chatprompt/test2")
public String test2() {
LawPrompt prompt = new LawPrompt();
prompt.setLegal("知识产权");
prompt.setQuestion("TRIPS 协议?");
String chat = lawAssistant.chat(prompt);
System.out.println(chat);
return chat;
}
}
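Conceptually, @StructuredPrompt renders its template by substituting each {{field}} placeholder with the value of the correspondingly named field on the object. A simplified reflection-based sketch of that mapping (illustrative only, not LangChain4j's actual implementation):

```java
import java.lang.reflect.Field;

// Illustrative only: fills {{fieldName}} placeholders in a template from the
// fields of an object, mimicking what @StructuredPrompt does for LawPrompt.
public class StructuredPromptSketch {
    public static String render(String template, Object source) {
        String result = template;
        for (Field field : source.getClass().getDeclaredFields()) {
            field.setAccessible(true);
            try {
                Object value = field.get(source);
                // Replace {{fieldName}} with the field's value; missing
                // placeholders are simply left untouched.
                result = result.replace("{{" + field.getName() + "}}", String.valueOf(value));
            } catch (IllegalAccessException e) {
                throw new IllegalStateException(e);
            }
        }
        return result;
    }
}
```

In the example above, the rendered text becomes the UserMessage, while the @SystemMessage annotation still supplies the system prompt.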

LangChain4j provides two classes, PromptTemplate and Prompt, for building prompts at a lower level.
A single parameter can be referenced with the default "{{it}}" placeholder or with a named "{{parameterName}}" placeholder; any other token is not recognized and causes an error.
@Resource
private ChatModel chatModel;
@GetMapping(value = "/chatprompt/test3")
public String test3() {
// By default, PromptTemplate uses the it property as its placeholder
String role = "财务会计";
String question = "人民币大写";
// 1. Build the PromptTemplate
PromptTemplate template = PromptTemplate.from("你是一个{{it}}助手,{{question}}怎么办");
// 2. Render the template into a Prompt
Prompt prompt = template.apply(Map.of("it", role, "question", question));
// 3. Convert the Prompt into a UserMessage
UserMessage userMessage = prompt.toUserMessage();
// 4. Call the model
ChatResponse chatResponse = chatModel.chat(userMessage);
// 4.1 Log to the console
System.out.println(chatResponse.aiMessage().text());
// 4.2 Return to the caller
return chatResponse.aiMessage().text();
}

Persisting the conversation between the user and the large model

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-09chatPersistence</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!--langchain4j-open-ai + langchain4j + langchain4j-reactor-->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-reactor</artifactId>
</dependency>
<!--spring-boot-starter-data-redis https://docs.langchain4j.dev/tutorials/chat-memory#persistence -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--hutool-->
<dependency>
<groupId>cn.hutool</groupId>
<artifactId>hutool-all</artifactId>
<version>5.8.22</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8009
spring.application.name=langchain4j-09chat-persistence
# Response character encoding
server.servlet.encoding.charset=utf-8
server.servlet.encoding.enabled=true
server.servlet.encoding.force=true
# ========== Redis config ==========
spring.data.redis.host=localhost
spring.data.redis.port=6379
spring.data.redis.password=12345
spring.data.redis.database=0
spring.data.redis.connect-timeout=3s
spring.data.redis.timeout=2s
public interface ChatPersistenceAssistant {
String chat(@MemoryId Long userId, @UserMessage String message);
}
@Configuration
public class RedisConfig {
@Bean
public RedisTemplate<String, Object> redisTemplate(RedisConnectionFactory redisConnectionFactory) {
RedisTemplate<String, Object> template = new RedisTemplate<>();
template.setConnectionFactory(redisConnectionFactory);
// Serialize keys as strings
template.setKeySerializer(new StringRedisSerializer());
// Serialize values as JSON
template.setDefaultSerializer(new GenericJackson2JsonRedisSerializer());
template.setHashKeySerializer(new StringRedisSerializer());
template.setHashValueSerializer(new GenericJackson2JsonRedisSerializer());
template.setEnableTransactionSupport(true);
template.afterPropertiesSet();
return template;
}
}
@Component
public class RedisChatMemoryStore implements ChatMemoryStore {
private static final String CHAT_MEMORY_KEY_PREFIX = "chat_memory:";
@Resource
private RedisTemplate<String, String> redisTemplate;
@Override
public List<ChatMessage> getMessages(Object memoryId) {
String messages = redisTemplate.opsForValue().get(CHAT_MEMORY_KEY_PREFIX + memoryId);
return ChatMessageDeserializer.messagesFromJson(messages);
}
@Override
public void updateMessages(Object memoryId, List<ChatMessage> list) {
redisTemplate.opsForValue().set(CHAT_MEMORY_KEY_PREFIX + memoryId, ChatMessageSerializer.messagesToJson(list));
}
@Override
public void deleteMessages(Object memoryId) {
redisTemplate.delete(CHAT_MEMORY_KEY_PREFIX + memoryId);
}
}
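For local tests without a Redis server, the same three operations can be backed by a plain map. The class below is a simplified stand-in that stores the serialized JSON string directly (the real ChatMemoryStore interface works with List&lt;ChatMessage&gt; and delegates serialization to ChatMessageSerializer/ChatMessageDeserializer, as shown above):

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for a ChatMemoryStore: same get/update/delete contract,
// but keeping the JSON payload in a HashMap instead of Redis.
public class InMemoryChatMemoryStoreSketch {
    private static final String KEY_PREFIX = "chat_memory:";
    private final Map<String, String> store = new HashMap<>();

    public String getMessages(Object memoryId) {
        // An empty JSON array stands for "no history yet"
        return store.getOrDefault(KEY_PREFIX + memoryId, "[]");
    }

    public void updateMessages(Object memoryId, String messagesJson) {
        store.put(KEY_PREFIX + memoryId, messagesJson);
    }

    public void deleteMessages(Object memoryId) {
        store.remove(KEY_PREFIX + memoryId);
    }
}
```

Note that MessageWindowChatMemory.maxMessages(10) still applies on top of the store: the window is trimmed before updateMessages is called, so the persisted history never grows past the window size.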
@Configuration
public class LLMConfig {
@Resource
private RedisChatMemoryStore redisChatMemoryStore;
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
@Bean
public ChatPersistenceAssistant chatPersistenceAssistant(ChatModel chatModel) {
ChatMemoryProvider chatMemoryProvider = memoryId -> MessageWindowChatMemory.builder()
.id(memoryId)
.maxMessages(10)
.chatMemoryStore(redisChatMemoryStore)
.build();
return AiServices.builder(ChatPersistenceAssistant.class)
.chatModel(chatModel)
.chatMemoryProvider(chatMemoryProvider)
.build();
}
}
@RestController
public class ChatPersistenceController {
@Resource
private ChatPersistenceAssistant chatPersistenceAssistant;
@GetMapping("/chat")
public void chat() {
chatPersistenceAssistant.chat(1L, "你好!我的名字是 redis");
chatPersistenceAssistant.chat(2L, "你好!我的名字是 nacos");
String chat = chatPersistenceAssistant.chat(1L, "我的名字是什么");
System.out.println(chat);
System.out.println("=====================================================");
chat = chatPersistenceAssistant.chat(2L, "我的名字是什么");
System.out.println(chat);
}
}


A chat model can answer questions about facts from before its training cutoff, but it has no way to know the current time or other live information on its own, so we equip it with external utility classes it can call. LLMs are not just text generators: they can also trigger calls to third-party functions, for example querying WeChat, invoking Alipay, or looking up an SF Express tracking number. The LLM itself never executes the function; it only indicates which function should be called and with what arguments.
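That division of labor can be sketched without any framework: the model returns a tool name plus JSON arguments, the application looks up and runs the matching executor, and the result is sent back to the model. A minimal dispatch table in plain Java (the tool name and argument strings here are illustrative):

```java
import java.util.Map;
import java.util.function.Function;

// Illustrative tool dispatch: the LLM only names the tool and supplies the
// arguments; the application side looks the tool up and executes it.
public class ToolDispatchSketch {
    private final Map<String, Function<String, String>> executors;

    public ToolDispatchSketch(Map<String, Function<String, String>> executors) {
        this.executors = executors;
    }

    public String execute(String toolName, String jsonArguments) {
        Function<String, String> executor = executors.get(toolName);
        if (executor == null) {
            return "unknown tool: " + toolName;
        }
        // The executor's return value is what gets fed back to the model
        return executor.apply(jsonArguments);
    }
}
```

LangChain4j's ToolExecutor (used in the low-level API below) plays exactly this executor role, with the ToolExecutionRequest carrying the name and raw JSON arguments chosen by the model.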

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.hk</groupId>
<artifactId>langcChain4J-study</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<artifactId>langChain4j-10chatFunctionCalling</artifactId>
<properties>
<maven.compiler.source>21</maven.compiler.source>
<maven.compiler.target>21</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!--langchain4j-open-ai-->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-open-ai</artifactId>
</dependency>
<!--langchain4j-->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j</artifactId>
</dependency>
<!--langchain4j-reactor-->
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-reactor</artifactId>
</dependency>
<!--httpclient5-->
<dependency>
<groupId>org.apache.httpcomponents.client5</groupId>
<artifactId>httpclient5</artifactId>
<version>5.5</version>
</dependency>
<!--lombok-->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<!--hutool-->
<dependency>
<groupId>cn.hutool</groupId>
<artifactId>hutool-all</artifactId>
<version>5.8.22</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
server.port=8010
spring.application.name=langchain4j-10chat-functioncalling
public interface FunctionAssistant {
String chat(String message);
}
@Configuration
public class LLMConfig {
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
// Low-level tool API: ToolSpecification + ToolExecutor
@Bean
public FunctionAssistant functionAssistant(ChatModel chatModel) {
ToolSpecification toolSpecification = ToolSpecification.builder()
.name("开具发票助手")
.description("根据用户提交的开票信息,开具发票")
.parameters(JsonObjectSchema.builder()
.addStringProperty("companyName", "公司名称")
.addStringProperty("dutyNumber", "税号序列")
.addStringProperty("amount", "开票金额,保留两位有效数字")
.build())
.build();
ToolExecutor toolExecutor = (toolExecutionRequest, memoryId) -> {
System.out.println("toolExecutionRequest=====" + toolExecutionRequest.id());
System.out.println("toolExecutionRequest=====" + toolExecutionRequest.name());
String arguments = toolExecutionRequest.arguments();
System.out.println("toolExecutionRequest=====" + arguments);
return "发票已开具";
};
return AiServices.builder(FunctionAssistant.class)
.chatModel(chatModel)
.tools(Map.of(toolSpecification, toolExecutor))
.build();
}
}
@RestController
public class ChatFunctionCallingController {
@Resource
private FunctionAssistant functionAssistant;
@GetMapping("/chat")
public void chat() {
String msg = "开张发票,公司:尚硅谷教育科技有限公司 税号:zfdehtrhrt533 金额:668.12";
String chat = functionAssistant.chat(msg);
System.out.println("ok:" + chat);
}
}







@Service
public class WeatherService {
private static final String WEATHER_API_URL = "https://nXXXX.qweatherapi.com/v7/weather/now?location=%s&key=%s";
public JsonNode getWeather(String location) throws JsonProcessingException {
String url = String.format(WEATHER_API_URL, location, "XXXXXX");
// Send the HTTP request and parse the response
CloseableHttpClient httpClient = HttpClients.createDefault();
// Back RestTemplate with an HttpClient 5 request factory
HttpComponentsClientHttpRequestFactory requestFactory = new HttpComponentsClientHttpRequestFactory(httpClient);
RestTemplate restTemplate = new RestTemplate(requestFactory);
// Send the GET request and read the response body
String response = restTemplate.getForObject(url, String.class);
// Parse the response body into a JsonNode
return new ObjectMapper().readTree(response);
}
}
public class InvoiceHandler {
@Tool("获取天气信息")
public String handle(@P("天气状况") String text, @P("风向") String windDir) throws Exception {
System.out.println("天气状况:" + text + " 风向:" + windDir);
//----------------------------------
// Business logic goes here: call Redis/RabbitMQ/Kafka/MyBatis, logistics
// tracking, lab-report or payment APIs, or any other third-party service
//----------------------------------
System.out.println(new WeatherService().getWeather("101010100"));
return "获取成功";
}
}
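With the high-level API, LangChain4j scans the object passed to tools() for @Tool-annotated methods and builds the tool specifications from them automatically. A simplified reflection sketch of that discovery step, using a stand-in annotation rather than the real dev.langchain4j.agent.tool.Tool:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class ToolDiscoverySketch {

    // Stand-in for dev.langchain4j.agent.tool.Tool, for illustration only
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    public @interface ToolStub {
        String value();
    }

    // Collects the descriptions of all @ToolStub methods on the handler,
    // the way AiServices discovers tools on the object passed to tools()
    public static List<String> discover(Object handler) {
        List<String> tools = new ArrayList<>();
        for (Method method : handler.getClass().getDeclaredMethods()) {
            ToolStub tool = method.getAnnotation(ToolStub.class);
            if (tool != null) {
                tools.add(tool.value());
            }
        }
        return tools;
    }
}
```

In the real framework, the @Tool value and the @P parameter descriptions are additionally turned into a JSON schema so the model knows how to fill in the arguments.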
@Configuration
public class LLMConfig {
@Bean
public ChatModel chatModel() {
return OpenAiChatModel.builder()
.apiKey(System.getenv("aliQwen-api"))
.modelName("qwen-plus")
.baseUrl("https://dashscope.aliyuncs.com/compatible-mode/v1")
.build();
}
// Low-level tool API: ToolSpecification + ToolExecutor
@Bean("lowfunction")
public FunctionAssistant functionAssistant(ChatModel chatModel) {
ToolSpecification toolSpecification = ToolSpecification.builder()
.name("开具发票助手")
.description("根据用户提交的开票信息,开具发票")
.parameters(JsonObjectSchema.builder()
.addStringProperty("companyName", "公司名称")
.addStringProperty("dutyNumber", "税号序列")
.addStringProperty("amount", "开票金额,保留两位有效数字")
.build())
.build();
ToolExecutor toolExecutor = (toolExecutionRequest, memoryId) -> {
System.out.println("toolExecutionRequest=====" + toolExecutionRequest.id());
System.out.println("toolExecutionRequest=====" + toolExecutionRequest.name());
String arguments = toolExecutionRequest.arguments();
System.out.println("toolExecutionRequest=====" + arguments);
return "发票已开具";
};
return AiServices.builder(FunctionAssistant.class)
.chatModel(chatModel)
.tools(Map.of(toolSpecification, toolExecutor))
.build();
}
// High-level tool API: @Tool-annotated methods
@Bean("highfunction")
public FunctionAssistant highFunctionAssistant(ChatModel chatModel) {
return AiServices.builder(FunctionAssistant.class)
.chatModel(chatModel)
.tools(new InvoiceHandler())
.build();
}
}
@RestController
public class ChatFunctionCallingController {
// @Resource(name = "lowfunction")
// private FunctionAssistant functionAssistant;
@Resource(name = "highfunction")
private FunctionAssistant highFunctionAssistant;
// @GetMapping("/chat")
// public void chat() {
// String msg = "开张发票,公司:尚硅谷教育科技有限公司 税号:zfdehtrhrt533 金额:668.12";
// String chat = functionAssistant.chat(msg);
// System.out.println("ok:" + chat);
// }
@GetMapping("/chat2")
public void chat2() {
String chat = highFunctionAssistant.chat("北京今天的天气");
System.out.println("ok:" + chat);
}
}

To be continued...
