
Qianfan

Baidu AI Cloud Qianfan large model platform

Maven Dependency

You can use Qianfan with LangChain4j in plain Java or Spring Boot applications.

Plain Java

Note

Since 1.0.0-alpha1, langchain4j-qianfan has migrated to langchain4j-community and been renamed to langchain4j-community-qianfan.

Before 1.0.0-alpha1:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-qianfan</artifactId>
    <version>${previous version here}</version>
</dependency>

1.0.0-alpha1 and later:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-qianfan</artifactId>
    <version>${latest version here}</version>
</dependency>

Spring Boot

Note

Since 1.0.0-alpha1, langchain4j-qianfan-spring-boot-starter has migrated to langchain4j-community and been renamed to langchain4j-community-qianfan-spring-boot-starter.

Before 1.0.0-alpha1:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-qianfan-spring-boot-starter</artifactId>
    <version>${previous version here}</version>
</dependency>

1.0.0-alpha1 and later:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-qianfan-spring-boot-starter</artifactId>
    <version>${latest version here}</version>
</dependency>

Alternatively, you can use the BOM to manage dependency versions consistently:

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-community-bom</artifactId>
            <version>${latest version here}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
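Once the BOM is imported, individual community artifacts can be declared without an explicit version; the version is then inherited from the BOM. A sketch using the langchain4j-community-qianfan artifact shown above:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-qianfan</artifactId>
    <!-- version inherited from langchain4j-community-bom -->
</dependency>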

QianfanChatModel

All Qianfan models and their billing status

QianfanChatModel model = QianfanChatModel.builder()
    .apiKey("apiKey")
    .secretKey("secretKey")
    .modelName("Yi-34B-Chat")
    .build();
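The model can then be called directly. A minimal usage sketch, assuming the chat(String) convenience method available on LangChain4j chat models:

// Send a single user message and print the model's full reply
String answer = model.chat("Hello, please introduce yourself");
System.out.println(answer);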

QianfanStreamingChatModel

QianfanStreamingChatModel qianfanStreamingChatModel = QianfanStreamingChatModel.builder()
    .apiKey("apiKey")
    .secretKey("secretKey")
    .modelName("Yi-34B-Chat")
    .build();

// Stream the response with a StreamingChatResponseHandler
qianfanStreamingChatModel.chat("Tell me a story", new StreamingChatResponseHandler() {
    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.println(partialResponse);
    }
    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        System.out.println("Done");
    }
    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});

Here is another way to achieve the same thing, using TokenStream:

QianfanStreamingChatModel qianfanStreamingChatModel = QianfanStreamingChatModel.builder()
    .apiKey("apiKey")
    .secretKey("secretKey")
    .modelName("Yi-34B-Chat")
    .build();

IAiService assistant = AiServices.create(IAiService.class, qianfanStreamingChatModel);

TokenStream tokenStream = assistant.chatInTokenStream("Tell me a story.");
tokenStream.onPartialResponse(System.out::println)
    .onError(Throwable::printStackTrace)
    .start();
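Note that IAiService above is not a LangChain4j class but an application-defined AI Service interface; LangChain4j generates its implementation via AiServices. A minimal sketch of such an interface, assuming the method names used in these examples:

import dev.langchain4j.service.TokenStream;

interface IAiService {

    // Blocking call: returns the complete answer as a String (used in the RAG example below)
    String chat(String userMessage);

    // Streaming call: returns a TokenStream for consuming partial responses
    TokenStream chatInTokenStream(String userMessage);
}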

QianfanRAG

The program automatically assembles the retrieved content and the user's question into a prompt, sends it to the large language model, and the model returns an answer.

LangChain4j has an "Easy RAG" feature that makes getting started with RAG as simple as possible. You don't have to learn about embeddings, choose a vector store, find the right embedding model, or figure out how to parse and split documents. Just point LangChain4j at your documents and it will do its magic.

  • Import the dependency: langchain4j-easy-rag

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-easy-rag</artifactId>
    <version>1.0.0-beta3</version>
</dependency>

  • Usage:

QianfanChatModel chatLanguageModel = QianfanChatModel.builder()
    .apiKey(API_KEY)
    .secretKey(SECRET_KEY)
    .modelName("Yi-34B-Chat")
    .build();

// Load all files in the directory (plain txt files seem to be processed faster)
List<Document> documents = FileSystemDocumentLoader.loadDocuments("/home/langchain4j/documentation");

// For simplicity, we will use an in-memory embedding store:
InMemoryEmbeddingStore<TextSegment> embeddingStore = new InMemoryEmbeddingStore<>();
EmbeddingStoreIngestor.ingest(documents, embeddingStore);

IAiService assistant = AiServices.builder(IAiService.class)
    .chatLanguageModel(chatLanguageModel)
    .chatMemory(MessageWindowChatMemory.withMaxMessages(10))
    .contentRetriever(EmbeddingStoreContentRetriever.from(embeddingStore))
    .build();

String answer = assistant.chat("question");
System.out.println(answer);

Examples