# Security Considerations: AI Secret Management
Never hardcode AI provider access keys, credentials, or API endpoints directly in your source code. Hardcoded secrets are a serious security vulnerability: source code can be exposed, leading to data breaches and unauthorized service usage.
To mitigate these risks, adopt a secret management strategy based on your application hosting model.
In Blazor Server applications, all application logic executes on the server and secrets never reach the client's browser. You can manage secrets as follows:
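For local development, for example, you can keep keys out of the project tree with the .NET Secret Manager. The sketch below is illustrative; the endpoint, key, and deployment values are placeholders, and the key names match the `AzureOpenAISettings` configuration section used later in this article:

```shell
# Enable user secrets for the project (values are stored in a secrets.json
# file outside the project folder, so they are never committed)
dotnet user-secrets init

# Store the Azure OpenAI settings read through builder.Configuration
dotnet user-secrets set "AzureOpenAISettings:Endpoint" "https://my-resource.openai.azure.com/"
dotnet user-secrets set "AzureOpenAISettings:Key" "<your-api-key>"
dotnet user-secrets set "AzureOpenAISettings:DeploymentName" "<your-deployment-name>"
```

In production, prefer environment variables or a managed secret store such as Azure Key Vault; both surface through the same `builder.Configuration` API, so no code changes are required.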
During development, use the .NET Secret Manager tool. It stores secrets in a secrets.json file outside of your project folder and ensures secrets are not committed to source control.

Blazor WebAssembly (WASM) apps run on the client side. Once the browser downloads the application to a device, users can decompile it and access sensitive information. To mitigate this risk, route all requests to the AI provider through a backend API proxy:
```csharp
using System.Text;

/* ... */

// Fetch Azure OpenAI secrets from the appsettings.json file
var aiUri = builder.Configuration.GetSection("AzureOpenAISettings")["Endpoint"];
var aiKey = builder.Configuration.GetSection("AzureOpenAISettings")["Key"];
var aiModel = builder.Configuration.GetSection("AzureOpenAISettings")["DeploymentName"];
if (string.IsNullOrEmpty(aiUri) || string.IsNullOrEmpty(aiKey) || string.IsNullOrEmpty(aiModel))
    throw new InvalidOperationException("Cannot fetch secrets from 'appsettings.json'");

// Registers the app's chat service
builder.Services.AddChatClient(aiUri, aiKey, aiModel);

// Enables API controllers and MVC endpoints alongside Razor components
builder.Services.AddMvc();

/* ... */

// Declare a server-side proxy so Blazor WASM can call Azure OpenAI
// without exposing the key in the browser
app.MapPost("/api/chat/{*path}", async (string path, HttpContext context, CancellationToken ct) =>
{
    var httpClientFactory = context.RequestServices.GetRequiredService<IHttpClientFactory>();
    var client = httpClientFactory.CreateClient();
    client.DefaultRequestHeaders.Authorization = new("Bearer", aiKey);

    // Replace the public "proxychat" segment with the actual deployment name
    var newPath = path.Replace("proxychat", aiModel);
    var endpointUri = new Uri(aiUri);
    var uriBuilder = new UriBuilder(endpointUri)
    {
        Path = $"{endpointUri.AbsolutePath}/{newPath}",
        Query = context.Request.QueryString.Value
    };

    // Forward the request body to Azure OpenAI and relay the response
    var body = await new StreamReader(context.Request.Body).ReadToEndAsync(ct);
    var response = await client.PostAsync(uriBuilder.Uri,
        new StringContent(body, Encoding.UTF8, "application/json"), ct);

    context.Response.StatusCode = (int)response.StatusCode;
    await response.Content.CopyToAsync(context.Response.Body, ct);
});
```
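To see how the proxy maps a client request to the provider's endpoint, the following standalone sketch reproduces its path rewrite and URI composition. The endpoint, deployment name, and api-version below are placeholder values, not part of the sample above:

```csharp
// Standalone sketch of the proxy's path rewrite and URI composition.
// All values here are placeholders for illustration.
var aiUri = "https://my-resource.openai.azure.com/openai";
var aiModel = "gpt-4o-mini";

// Catch-all route value as received by POST /api/chat/proxychat/chat/completions
var path = "proxychat/chat/completions";

// Replace the public "proxychat" segment with the deployment name
var newPath = path.Replace("proxychat", aiModel);

var endpointUri = new Uri(aiUri);
var uriBuilder = new UriBuilder(endpointUri)
{
    Path = $"{endpointUri.AbsolutePath}/{newPath}",
    Query = "api-version=2024-06-01"
};

Console.WriteLine(uriBuilder.Uri);
// https://my-resource.openai.azure.com/openai/gpt-4o-mini/chat/completions?api-version=2024-06-01
```

Because the rewrite happens on the server, the browser only ever sees the relative `/api/chat/...` route; the real endpoint, deployment name, and key never leave the backend.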
Tip
The DevExpress Template Kit generates this AI proxy automatically when you select the corresponding options for an ASP.NET Core Blazor application (such as the AI Chat view).
You can also create this setup with the .NET CLI:
```shell
dotnet new dx.blazor --name MyBlazorServerApp --interactivity Auto --add-views AIChat
```