This article demonstrates how to integrate the .NET AI Chat Template into an ABP Framework application, enabling powerful AI chat capabilities in your ABP-based solution.
First, let's create a new single-layer Blazor Server project named `AbpAiChat` using ABP Studio. Alternatively, you can create the project with the following ABP CLI command:
```bash
abp new AbpAiChat -t app-nolayers --ui-framework blazor-server --use-open-source-template
```
The integration process involves copying and adapting the .NET AI Chat Template code into our ABP project. The template code is already included in our sample project, so you don't need to install it separately.
Copy the following items from the template into the ABP project:

- The `Components` folder
- The `Services` folder
- The entity classes (`IngestedDocument`, `IngestedRecord`); register them in the `AbpAiChatDbContext` and add a new migration
- The `wwwroot` folder

Then add the following packages to `AbpAiChat.csproj`:
```xml
<PackageReference Include="Microsoft.Extensions.AI.OpenAI" Version="9.4.3-preview.1.25230.7" />
<PackageReference Include="Microsoft.EntityFrameworkCore.Sqlite" Version="9.0.4" />
<PackageReference Include="Microsoft.Extensions.AI" Version="9.4.3-preview.1.25230.7" />
<PackageReference Include="Microsoft.SemanticKernel.Core" Version="1.47.0" />
<PackageReference Include="PdfPig" Version="0.1.9" />
<PackageReference Include="System.Linq.Async" Version="6.0.1" />
```
Add the following configuration method to `AbpAiChatModule.cs`:
```csharp
private void ConfigureAi(ServiceConfigurationContext context)
{
    // Authenticate against GitHub Models using a personal access token
    var credential = new ApiKeyCredential(
        context.Services.GetConfiguration()["GitHubToken"]
        ?? throw new InvalidOperationException("Missing configuration: GitHubToken. See the README for details."));

    var openAiOptions = new OpenAIClientOptions
    {
        Endpoint = new Uri("https://models.inference.ai.azure.com")
    };

    var ghModelsClient = new OpenAIClient(credential, openAiOptions);
    var chatClient = ghModelsClient.GetChatClient("gpt-4o-mini").AsIChatClient();
    var embeddingGenerator = ghModelsClient.GetEmbeddingClient("text-embedding-3-small").AsIEmbeddingGenerator();

    // Store embeddings in a simple JSON-file-backed vector store
    var vectorStore = new JsonVectorStore(Path.Combine(AppContext.BaseDirectory, "vector-store"));
    context.Services.AddSingleton<IVectorStore>(vectorStore);

    context.Services.AddScoped<DataIngestor>();
    context.Services.AddSingleton<SemanticSearch>();

    context.Services.AddChatClient(chatClient).UseFunctionInvocation().UseLogging();
    context.Services.AddEmbeddingGenerator(embeddingGenerator);

    // Serve the .mjs files copied from the template's wwwroot folder
    context.Services.Configure<AbpAspNetCoreContentOptions>(options =>
    {
        options.ContentTypeMaps.Add(".mjs", "application/javascript");
    });
}
```
The ConfigureAi method is called in the ConfigureServices method of AbpAiChatModule. It sets up the AI services, including the OpenAI client, chat client, embedding generator, and vector store.
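For reference, the call site can look like this (a minimal sketch; your module will contain other configuration as well):

```csharp
public override void ConfigureServices(ServiceConfigurationContext context)
{
    // ...other ABP module configuration...

    ConfigureAi(context);
}
```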
Add your GitHub Personal Access Token to appsettings.json:
```json
{
  "GitHubToken": "your-github-token"
}
```
You can create a token on GitHub under Settings → Developer settings → Personal Access Tokens.
Let's add a custom AI function to retrieve the current user's information. Update the Chat.razor component:
```csharp
// Register the custom function alongside the template's built-in tools
chatOptions.Tools =
[
    AIFunctionFactory.Create(SearchAsync),
    AIFunctionFactory.Create(GetWeather),
    AIFunctionFactory.Create(GetCurrentUserInfo)
];

// Exposes the current ABP user's identity to the model as a callable tool
[Description("Get current user information")]
private Task<string> GetCurrentUserInfo()
{
    return Task.FromResult(CurrentUser.IsAuthenticated
        ? $"UserId: {CurrentUser.Id}, Name: {CurrentUser.UserName}, Email: {CurrentUser.Email}, Roles: {string.Join(", ", CurrentUser.Roles)}"
        : "No user information available.");
}
```
Finally, add a Chat menu item in `AbpAiChatMenuContributor` to navigate to the AI Chat component.
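A minimal sketch of that menu entry, using ABP's standard menu contributor pattern (the menu item name, the `/chat` route, and the icon are assumptions; adjust them to match your component's routing):

```csharp
private Task ConfigureMainMenuAsync(MenuConfigurationContext context)
{
    // "/chat" is assumed to be the route of the copied Chat.razor component
    context.Menu.AddItem(
        new ApplicationMenuItem(
            "AbpAiChat.Chat",   // unique menu item name (assumed)
            "Chat",             // display text
            url: "/chat",
            icon: "fas fa-comments"
        )
    );

    return Task.CompletedTask;
}
```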
After completing the integration, you can run the application and open the Chat page. The chat interface allows you to:

- Chat with the configured model (gpt-4o-mini)
- Ask questions about the ingested documents via semantic search
- Invoke the custom tools, such as asking for the current user's information
This integration demonstrates how to leverage the power of AI in your ABP Framework applications. The .NET AI Chat Template provides a solid foundation for building intelligent chat interfaces, while ABP Framework supplies the surrounding application infrastructure, such as authentication, data access, and the UI.