2.1 开发环境准备
2.1.1 系统要求
基础要求:
- Java 17 或更高版本
- Maven 3.6+ 或 Gradle 7.0+
- IDE:IntelliJ IDEA、Eclipse 或 VS Code
- Git 版本控制
推荐配置:
- 内存:8GB 以上
- 存储:SSD硬盘,至少50GB可用空间
- 网络:稳定的互联网连接(用于AI API调用)
2.1.2 Java环境配置
# 检查Java版本
java -version
# 设置JAVA_HOME环境变量(Windows)
set JAVA_HOME=C:\Program Files\Java\jdk-17
set PATH=%JAVA_HOME%\bin;%PATH%
# 设置JAVA_HOME环境变量(Linux/Mac)
export JAVA_HOME=/usr/lib/jvm/java-17-openjdk
export PATH=$JAVA_HOME/bin:$PATH
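环境变量配置完成后,可以用下面的小脚本一次性确认各基础工具是否就位(示意脚本,工具列表可按需增减):

```shell
#!/bin/bash
# check_env.sh —— 检查基础开发工具是否已安装(示意脚本)
check_tool() {
  # command -v 在工具存在于 PATH 时返回 0
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: OK"
  else
    echo "$1: missing"
  fi
}
check_tool java
check_tool mvn
check_tool git
```

若某一行输出 `missing`,请先按前文步骤安装对应工具再继续。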
2.1.3 Maven配置
<!-- settings.xml 配置镜像源 -->
<settings>
<mirrors>
<mirror>
<id>aliyun</id>
<name>Aliyun Maven</name>
<url>https://maven.aliyun.com/repository/public</url>
<mirrorOf>central</mirrorOf>
</mirror>
</mirrors>
<profiles>
<profile>
<id>jdk-17</id>
<activation>
<jdk>17</jdk>
</activation>
<properties>
<maven.compiler.source>17</maven.compiler.source>
<maven.compiler.target>17</maven.compiler.target>
<maven.compiler.compilerVersion>17</maven.compiler.compilerVersion>
</properties>
</profile>
</profiles>
</settings>
2.2 Spring AI项目创建
2.2.1 使用Spring Initializr创建项目
方式一:Web界面创建
1. 访问 https://start.spring.io/
2. 选择以下配置:
- Project: Maven
- Language: Java
- Spring Boot: 3.2.0+
- Java: 17
- Packaging: Jar
方式二:命令行创建
curl https://start.spring.io/starter.zip \
-d type=maven-project \
-d language=java \
-d bootVersion=3.2.0 \
-d baseDir=spring-ai-demo \
-d groupId=com.example \
-d artifactId=spring-ai-demo \
-d name=spring-ai-demo \
-d description="Spring AI Demo Project" \
-d packageName=com.example.springai \
-d packaging=jar \
-d javaVersion=17 \
-d dependencies=web,actuator \
-o spring-ai-demo.zip
unzip spring-ai-demo.zip
cd spring-ai-demo
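解压后建议先做一次结构检查,确认 Initializr 生成的关键文件都在。下面是一个示意脚本,文件清单(如 pom.xml、src/main/java)基于 Initializr Maven 项目的默认布局:

```shell
#!/bin/bash
# 检查项目骨架中的关键文件是否存在(示意脚本)
require_file() {
  if [ -e "$1" ]; then
    echo "found: $1"
    return 0
  else
    echo "MISSING: $1" >&2
    return 1
  fi
}
# 在 spring-ai-demo 目录下执行,例如:
# require_file pom.xml
# require_file src/main/java
# require_file src/main/resources
```

任何一项缺失都说明下载或解压不完整,建议重新生成项目。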
2.2.2 添加Spring AI依赖
<!-- pom.xml -->
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>3.2.0</version>
<relativePath/>
</parent>
<groupId>com.example</groupId>
<artifactId>spring-ai-demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>spring-ai-demo</name>
<description>Spring AI Demo Project</description>
<properties>
<java.version>17</java.version>
<!-- 注意:0.8.x 中聊天接口类名为 ChatClient,本章代码采用 1.0.0-M1 引入的 ChatModel API -->
<spring-ai.version>1.0.0-M1</spring-ai.version>
</properties>
<dependencies>
<!-- Spring Boot Starters -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<!-- Spring AI Core -->
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-core</artifactId>
<version>${spring-ai.version}</version>
</dependency>
<!-- OpenAI Integration -->
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-openai-spring-boot-starter</artifactId>
<version>${spring-ai.version}</version>
</dependency>
<!-- Azure OpenAI Integration -->
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-azure-openai-spring-boot-starter</artifactId>
<version>${spring-ai.version}</version>
</dependency>
<!-- Ollama Integration -->
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
<version>${spring-ai.version}</version>
</dependency>
<!-- Vector Store Dependencies -->
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-chroma-store</artifactId>
<version>${spring-ai.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-pinecone-store</artifactId>
<version>${spring-ai.version}</version>
</dependency>
<!-- Test Dependencies -->
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.ai</groupId>
<artifactId>spring-ai-test</artifactId>
<version>${spring-ai.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
<!-- Spring AI Repository -->
<repositories>
<repository>
<id>spring-milestones</id>
<name>Spring Milestones</name>
<url>https://repo.spring.io/milestone</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
<repository>
<id>spring-snapshots</id>
<name>Spring Snapshots</name>
<url>https://repo.spring.io/snapshot</url>
<releases>
<enabled>false</enabled>
</releases>
</repository>
</repositories>
</project>
2.2.3 Gradle配置(可选)
// build.gradle
plugins {
id 'java'
id 'org.springframework.boot' version '3.2.0'
id 'io.spring.dependency-management' version '1.1.4'
}
group = 'com.example'
version = '0.0.1-SNAPSHOT'
java.sourceCompatibility = JavaVersion.VERSION_17
configurations {
compileOnly {
extendsFrom annotationProcessor
}
}
repositories {
mavenCentral()
maven { url 'https://repo.spring.io/milestone' }
maven { url 'https://repo.spring.io/snapshot' }
}
ext {
set('springAiVersion', '1.0.0-M1')
}
dependencies {
implementation 'org.springframework.boot:spring-boot-starter-web'
implementation 'org.springframework.boot:spring-boot-starter-actuator'
implementation "org.springframework.ai:spring-ai-core:${springAiVersion}"
implementation "org.springframework.ai:spring-ai-openai-spring-boot-starter:${springAiVersion}"
implementation "org.springframework.ai:spring-ai-azure-openai-spring-boot-starter:${springAiVersion}"
implementation "org.springframework.ai:spring-ai-ollama-spring-boot-starter:${springAiVersion}"
testImplementation 'org.springframework.boot:spring-boot-starter-test'
testImplementation "org.springframework.ai:spring-ai-test:${springAiVersion}"
}
tasks.named('test') {
useJUnitPlatform()
}
2.3 配置文件设置
2.3.1 基础配置
# application.yml
spring:
  application:
    name: spring-ai-demo

  # AI配置
  ai:
    openai:
      api-key: ${OPENAI_API_KEY:your-openai-api-key}
      base-url: ${OPENAI_BASE_URL:https://api.openai.com}
      chat:
        options:
          model: gpt-3.5-turbo
          temperature: 0.7
          max-tokens: 1000
    azure:
      openai:
        api-key: ${AZURE_OPENAI_API_KEY:your-azure-api-key}
        endpoint: ${AZURE_OPENAI_ENDPOINT:https://your-resource.openai.azure.com/}
        chat:
          options:
            deployment-name: gpt-35-turbo
            temperature: 0.7
    ollama:
      base-url: ${OLLAMA_BASE_URL:http://localhost:11434}
      chat:
        options:
          model: llama2
          temperature: 0.8

# 服务器配置
server:
  port: 8080
  servlet:
    context-path: /api

# 监控配置
management:
  endpoints:
    web:
      exposure:
        include: health,info,metrics,prometheus
  endpoint:
    health:
      show-details: always
  # Spring Boot 3.x 中 Prometheus 导出属性已从 metrics.export.prometheus 迁移至此
  prometheus:
    metrics:
      export:
        enabled: true

# 日志配置
logging:
  level:
    org.springframework.ai: DEBUG
    com.example.springai: DEBUG
  pattern:
    console: "%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n"
2.3.2 环境特定配置
# application-dev.yml
spring:
  ai:
    openai:
      api-key: ${OPENAI_API_KEY_DEV}

logging:
  level:
    root: INFO
    org.springframework.ai: DEBUG
---
# application-prod.yml
spring:
  ai:
    openai:
      api-key: ${OPENAI_API_KEY_PROD}
      chat:
        options:
          temperature: 0.5 # 生产环境使用更保守的参数
          max-tokens: 500

logging:
  level:
    root: WARN
    org.springframework.ai: INFO

management:
  endpoints:
    web:
      exposure:
        include: health,metrics
2.3.3 环境变量配置
# .env 文件
OPENAI_API_KEY=sk-your-openai-api-key
AZURE_OPENAI_API_KEY=your-azure-api-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
OLLAMA_BASE_URL=http://localhost:11434
# 数据库配置
SPRING_DATASOURCE_URL=jdbc:postgresql://localhost:5432/springai
SPRING_DATASOURCE_USERNAME=springai
SPRING_DATASOURCE_PASSWORD=password
# Redis配置
SPRING_REDIS_HOST=localhost
SPRING_REDIS_PORT=6379
SPRING_REDIS_PASSWORD=
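需要注意:Spring Boot 本身并不会自动读取 .env 文件。本地开发时可以先把它导入当前 shell 再启动应用。下面是一个示意写法,假设 .env 中均为简单的 KEY=VALUE 行(值中不含空格或特殊字符):

```shell
#!/bin/bash
# load_env.sh —— 将 .env 中的变量导出到当前 shell(示意脚本)
load_env() {
  local env_file="${1:-.env}"
  # set -a 使 source 期间的所有赋值自动 export
  set -a
  # shellcheck disable=SC1090
  . "$env_file"
  set +a
}
# 用法示例:
# load_env .env && mvn spring-boot:run
```

生产环境则应通过容器编排或密钥管理服务注入这些变量,而不是提交 .env 文件。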
2.4 项目结构设计
2.4.1 标准项目结构
spring-ai-demo/
├── src/
│ ├── main/
│ │ ├── java/
│ │ │ └── com/
│ │ │ └── example/
│ │ │ └── springai/
│ │ │ ├── SpringAiDemoApplication.java
│ │ │ ├── config/
│ │ │ │ ├── AIConfiguration.java
│ │ │ │ ├── VectorStoreConfiguration.java
│ │ │ │ └── SecurityConfiguration.java
│ │ │ ├── controller/
│ │ │ │ ├── ChatController.java
│ │ │ │ ├── EmbeddingController.java
│ │ │ │ └── ImageController.java
│ │ │ ├── service/
│ │ │ │ ├── ChatService.java
│ │ │ │ ├── EmbeddingService.java
│ │ │ │ ├── DocumentService.java
│ │ │ │ └── FunctionService.java
│ │ │ ├── model/
│ │ │ │ ├── dto/
│ │ │ │ ├── entity/
│ │ │ │ └── request/
│ │ │ ├── repository/
│ │ │ ├── function/
│ │ │ │ ├── WeatherFunction.java
│ │ │ │ └── CalculatorFunction.java
│ │ │ └── util/
│ │ │ ├── PromptUtils.java
│ │ │ └── ResponseUtils.java
│ │ └── resources/
│ │ ├── application.yml
│ │ ├── application-dev.yml
│ │ ├── application-prod.yml
│ │ ├── prompts/
│ │ │ ├── chat-template.st
│ │ │ └── system-message.txt
│ │ └── static/
│ │ └── index.html
│ └── test/
│ └── java/
│ └── com/
│ └── example/
│ └── springai/
│ ├── SpringAiDemoApplicationTests.java
│ ├── service/
│ └── controller/
├── docker/
│ ├── Dockerfile
│ └── docker-compose.yml
├── docs/
│ ├── API.md
│ └── DEPLOYMENT.md
├── scripts/
│ ├── start.sh
│ └── deploy.sh
├── .env.example
├── .gitignore
├── README.md
└── pom.xml
2.4.2 主应用类
// SpringAiDemoApplication.java
package com.example.springai;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.scheduling.annotation.EnableAsync;
@SpringBootApplication
@EnableCaching
@EnableAsync
public class SpringAiDemoApplication {
public static void main(String[] args) {
SpringApplication.run(SpringAiDemoApplication.class, args);
}
}
2.4.3 基础配置类
// AIConfiguration.java
package com.example.springai.config;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.embedding.EmbeddingModel;
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiEmbeddingModel;
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.retry.annotation.EnableRetry;
import org.springframework.retry.support.RetryTemplate;
@Configuration
@EnableRetry
public class AIConfiguration {
@Value("${spring.ai.openai.api-key}")
private String openAiApiKey;
@Value("${spring.ai.openai.base-url:https://api.openai.com}")
private String openAiBaseUrl;
@Bean
@Primary
@ConditionalOnProperty(name = "spring.ai.openai.api-key")
public ChatModel openAiChatModel() {
// 说明:引入 starter 后通常由自动配置创建该 Bean,此处手动配置仅作演示
return new OpenAiChatModel(new OpenAiApi(openAiBaseUrl, openAiApiKey));
}
@Bean
@Primary
@ConditionalOnProperty(name = "spring.ai.openai.api-key")
public EmbeddingModel openAiEmbeddingModel() {
return new OpenAiEmbeddingModel(new OpenAiApi(openAiBaseUrl, openAiApiKey));
}
@Bean
public RetryTemplate retryTemplate() {
return RetryTemplate.builder()
.maxAttempts(3)
.exponentialBackoff(1000, 2, 10000)
.retryOn(RuntimeException.class)
.build();
}
}
2.5 第一个AI应用
2.5.1 简单聊天服务
// ChatService.java
package com.example.springai.service;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Flux;
@Service
public class ChatService {
private final ChatModel chatModel;
public ChatService(ChatModel chatModel) {
this.chatModel = chatModel;
}
public String chat(String message) {
Prompt prompt = new Prompt(message);
ChatResponse response = chatModel.call(prompt);
return response.getResult().getOutput().getContent();
}
public Flux<String> chatStream(String message) {
Prompt prompt = new Prompt(message);
return chatModel.stream(prompt)
.map(response -> response.getResult().getOutput().getContent());
}
}
2.5.2 REST控制器
// ChatController.java
package com.example.springai.controller;
import com.example.springai.service.ChatService;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
@RestController
@RequestMapping("/chat")
@CrossOrigin(origins = "*")
public class ChatController {
private final ChatService chatService;
public ChatController(ChatService chatService) {
this.chatService = chatService;
}
@PostMapping
public String chat(@RequestBody String message) {
return chatService.chat(message);
}
@PostMapping(value = "/stream", produces = MediaType.TEXT_PLAIN_VALUE)
public Flux<String> chatStream(@RequestBody String message) {
return chatService.chatStream(message);
}
@GetMapping("/health")
public String health() {
return "Chat service is running!";
}
}
2.5.3 请求响应模型
// ChatRequest.java
package com.example.springai.model.request;
import com.fasterxml.jackson.annotation.JsonProperty;
public class ChatRequest {
@JsonProperty("message")
private String message;
@JsonProperty("temperature")
private Double temperature;
@JsonProperty("max_tokens")
private Integer maxTokens;
// 构造函数
public ChatRequest() {}
public ChatRequest(String message) {
this.message = message;
}
// Getters and Setters
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
public Double getTemperature() {
return temperature;
}
public void setTemperature(Double temperature) {
this.temperature = temperature;
}
public Integer getMaxTokens() {
return maxTokens;
}
public void setMaxTokens(Integer maxTokens) {
this.maxTokens = maxTokens;
}
}
// ChatResponse.java
package com.example.springai.model.dto;
import com.fasterxml.jackson.annotation.JsonProperty;
import java.time.LocalDateTime;
public class ChatResponse {
@JsonProperty("content")
private String content;
@JsonProperty("timestamp")
private LocalDateTime timestamp;
@JsonProperty("model")
private String model;
@JsonProperty("tokens_used")
private Integer tokensUsed;
// 构造函数
public ChatResponse() {
this.timestamp = LocalDateTime.now();
}
public ChatResponse(String content) {
this();
this.content = content;
}
// Getters and Setters
public String getContent() {
return content;
}
public void setContent(String content) {
this.content = content;
}
public LocalDateTime getTimestamp() {
return timestamp;
}
public void setTimestamp(LocalDateTime timestamp) {
this.timestamp = timestamp;
}
public String getModel() {
return model;
}
public void setModel(String model) {
this.model = model;
}
public Integer getTokensUsed() {
return tokensUsed;
}
public void setTokensUsed(Integer tokensUsed) {
this.tokensUsed = tokensUsed;
}
}
2.6 测试和验证
2.6.1 单元测试
// ChatServiceTest.java
package com.example.springai.service;
import org.junit.jupiter.api.Test;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.model.Generation;
import org.springframework.ai.chat.prompt.Prompt;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;
import java.util.List;
import static org.junit.jupiter.api.Assertions.*;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.when;
// 纯Mockito单元测试,无需启动Spring上下文
@ExtendWith(MockitoExtension.class)
class ChatServiceTest {
@Mock
private ChatModel chatModel;
@Test
void testChat() {
// 准备测试数据
String inputMessage = "Hello, AI!";
String expectedResponse = "Hello! How can I help you today?";
// 模拟ChatModel响应
ChatResponse mockResponse = new ChatResponse(
List.of(new Generation(expectedResponse))
);
when(chatModel.call(any(Prompt.class))).thenReturn(mockResponse);
// 创建服务实例
ChatService chatService = new ChatService(chatModel);
// 执行测试
String actualResponse = chatService.chat(inputMessage);
// 验证结果
assertEquals(expectedResponse, actualResponse);
}
}
2.6.2 集成测试
// ChatControllerIntegrationTest.java
package com.example.springai.controller;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.web.servlet.MockMvc;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.*;
@SpringBootTest
@AutoConfigureMockMvc
@ActiveProfiles("test")
class ChatControllerIntegrationTest {
@Autowired
private MockMvc mockMvc;
@Test
void testChatEndpoint() throws Exception {
String message = "Hello, Spring AI!";
// 注意:MockMvc 不会套用 servlet context-path(/api),直接请求 /chat
mockMvc.perform(post("/chat")
.contentType(MediaType.APPLICATION_JSON)
.content(message))
.andExpect(status().isOk())
.andExpect(content().contentTypeCompatibleWith(MediaType.TEXT_PLAIN))
.andExpect(content().string(org.hamcrest.Matchers.not(org.hamcrest.Matchers.emptyString())));
}
}
2.6.3 应用启动验证
# 启动应用
mvn spring-boot:run
# 或者
java -jar target/spring-ai-demo-0.0.1-SNAPSHOT.jar
# 测试健康检查
curl http://localhost:8080/api/actuator/health
# 测试聊天接口
curl -X POST http://localhost:8080/api/chat \
-H "Content-Type: application/json" \
-d "Hello, Spring AI!"
# 测试流式聊天
curl -X POST http://localhost:8080/api/chat/stream \
-H "Content-Type: application/json" \
-d "Tell me a story" \
--no-buffer
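应用启动通常需要数秒,脚本化验证时直接 curl 可能因服务尚未就绪而失败。可以用一个简单的轮询函数等待健康检查通过(示意脚本,URL 与上文 curl 示例一致):

```shell
#!/bin/bash
# retry —— 反复执行命令直到成功或达到最大次数(示意脚本)
retry() {
  local max=$1; shift
  local i
  for i in $(seq 1 "$max"); do
    if "$@"; then
      return 0
    fi
    sleep 1
  done
  return 1
}
# 用法示例:最多等待30秒,直到健康检查返回成功
# retry 30 curl -sf http://localhost:8080/api/actuator/health
```

该函数成功时返回 0,超时返回 1,便于在 CI 或部署脚本中作为启动门槛使用。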
2.7 Docker化部署
2.7.1 Dockerfile
# Dockerfile
# openjdk 官方镜像已停止更新,这里使用社区维护的 eclipse-temurin
FROM eclipse-temurin:17-jre
# 设置工作目录
WORKDIR /app
# 复制Maven构建结果
COPY target/spring-ai-demo-*.jar app.jar
# 安装健康检查所需的 curl(精简基础镜像默认未附带)
RUN apt-get update && apt-get install -y --no-install-recommends curl \
&& rm -rf /var/lib/apt/lists/*
# 创建非root用户
RUN addgroup --system spring && adduser --system spring --ingroup spring
USER spring:spring
# 暴露端口
EXPOSE 8080
# 健康检查
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
CMD curl -f http://localhost:8080/api/actuator/health || exit 1
# 启动应用
ENTRYPOINT ["java", "-jar", "app.jar"]
2.7.2 Docker Compose
# docker-compose.yml
version: '3.8'

services:
  spring-ai-app:
    build: .
    ports:
      - "8080:8080"
    environment:
      - SPRING_PROFILES_ACTIVE=docker
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - AZURE_OPENAI_API_KEY=${AZURE_OPENAI_API_KEY}
      - AZURE_OPENAI_ENDPOINT=${AZURE_OPENAI_ENDPOINT}
    depends_on:
      - redis
      - postgres
    networks:
      - spring-ai-network
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    networks:
      - spring-ai-network
    restart: unless-stopped

  postgres:
    image: postgres:15-alpine
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_DB=springai
      - POSTGRES_USER=springai
      - POSTGRES_PASSWORD=password
    volumes:
      - postgres_data:/var/lib/postgresql/data
    networks:
      - spring-ai-network
    restart: unless-stopped

  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
    networks:
      - spring-ai-network
    restart: unless-stopped

volumes:
  redis_data:
  postgres_data:
  ollama_data:

networks:
  spring-ai-network:
    driver: bridge
2.7.3 构建和运行脚本
#!/bin/bash
# build.sh
echo "Building Spring AI Demo..."
# 清理和构建
mvn clean package -DskipTests
# 构建Docker镜像
docker build -t spring-ai-demo:latest .
echo "Build completed!"
#!/bin/bash
# start.sh
echo "Starting Spring AI Demo..."
# 检查环境变量
if [ -z "$OPENAI_API_KEY" ]; then
echo "Warning: OPENAI_API_KEY not set"
fi
# 启动服务
docker-compose up -d
echo "Services started!"
echo "Application: http://localhost:8080"
echo "Health Check: http://localhost:8080/api/actuator/health"
2.8 开发工具配置
2.8.1 IntelliJ IDEA配置
<!-- .idea/runConfigurations/SpringAiDemoApplication.xml -->
<component name="ProjectRunConfigurationManager">
<configuration default="false" name="SpringAiDemoApplication" type="SpringBootApplicationConfigurationType">
<option name="ACTIVE_PROFILES" value="dev" />
<option name="MAIN_CLASS_NAME" value="com.example.springai.SpringAiDemoApplication" />
<option name="MODULE_NAME" value="spring-ai-demo" />
<option name="VM_PARAMETERS" value="-Dspring.profiles.active=dev" />
<envs>
<env name="OPENAI_API_KEY" value="your-api-key" />
</envs>
<method v="2">
<option name="Make" enabled="true" />
</method>
</configuration>
</component>
2.8.2 VS Code配置
// .vscode/launch.json
{
"version": "0.2.0",
"configurations": [
{
"type": "java",
"name": "Spring AI Demo",
"request": "launch",
"mainClass": "com.example.springai.SpringAiDemoApplication",
"projectName": "spring-ai-demo",
"args": "--spring.profiles.active=dev",
"env": {
"OPENAI_API_KEY": "your-api-key"
}
}
]
}
// .vscode/settings.json
{
"java.configuration.updateBuildConfiguration": "interactive",
"java.compile.nullAnalysis.mode": "automatic",
"spring-boot.ls.problem.application-properties.enabled": true
}
2.9 本章总结
2.9.1 核心要点
- 环境准备:Java 17+、Maven/Gradle、IDE配置
- 项目创建:使用Spring Initializr快速创建项目
- 依赖管理:添加Spring AI相关依赖
- 配置管理:多环境配置和环境变量
- 项目结构:标准的Spring Boot项目结构
- 基础应用:创建第一个AI聊天应用
2.9.2 最佳实践
- 配置安全:使用环境变量管理API密钥
- 多环境支持:dev、test、prod环境配置
- 容器化:使用Docker进行部署
- 测试覆盖:单元测试和集成测试
- 监控健康:集成Actuator监控
2.9.3 练习题
基础练习
- 创建一个Spring AI项目,集成OpenAI ChatGPT
- 实现一个简单的问答接口
- 添加健康检查和监控端点
进阶练习
- 配置多个AI提供商(OpenAI、Azure、Ollama)
- 实现流式响应接口
- 添加请求日志和错误处理
高级练习
- 使用Docker Compose部署完整的AI应用栈
- 实现AI服务的负载均衡
- 添加API限流和缓存机制
下一章我们将深入学习AI模型集成的基础知识和高级技巧。