docs/documentation/platform/audit-log-streams/audit-log-streams.mdx
If you're using Infisical Cloud, audit log streaming is available under the **Enterprise Tier**. If you're self-hosting Infisical, contact [email protected] to purchase an enterprise license to use it.
Infisical Audit Log Streaming enables you to transmit your organization's audit logs to external logging providers for monitoring and analysis.
If you're using audit log streams as your primary log destination and don't need audit logs stored in PostgreSQL, you can disable PostgreSQL audit log storage by setting the following environment variable:
`DISABLE_POSTGRES_AUDIT_LOG_STORAGE=true`
This prevents audit logs from being written to PostgreSQL while still streaming them to all configured log stream destinations. If you're also using ClickHouse, logs will continue to be inserted into ClickHouse as well.
<Info> See the [environment variables reference](/self-hosting/configuration/envars) for all available audit log configuration options. </Info> 
</Step>
<Step title="Select Provider">
If your log provider is included in the list of supported providers, select it. Otherwise, click **Custom** to input your own endpoint URL and headers.

</Step>
<Step title="Input Credentials">
Depending on your chosen provider, you'll be asked to input different credentials.
For **Custom**, you need to input an endpoint URL and headers.

Once you're finished, click **Create Log Stream**.
</Step>
<Step title="Log Stream Created">
Your audit logs are now ready to be streamed.

</Step>
<Warning>
After setting up all Azure resources, it may take 10-20 minutes for logs to begin streaming.
</Warning>
<Steps>
<Step title="Create a Data Collection Endpoint">
Navigate to [Data Collection Endpoints](https://portal.azure.com/#view/HubsExtension/BrowseResource.ReactView/resourceType/microsoft.insights%2Fdatacollectionendpoints) and click **Create**.

Configure your Data Collection Endpoint by providing an **Endpoint Name**, **Subscription**, and a **Resource group**. Then click **Review + Create**.

After creation, it may take a few minutes for the Data Collection Endpoint to appear. Once visible, click on it and copy the **Logs Ingestion** URL. You will need this URL in later steps.

</Step>
<Step title="Create a Log Analytics Workspace">
<Info>
If you already have a Log Analytics Workspace, you may skip this step.
</Info>
Navigate to [Log Analytics Workspaces](https://portal.azure.com/#browse/Microsoft.OperationalInsights%2Fworkspaces) and click **Create**.

Configure your Log Analytics Workspace by providing a **Subscription**, **Resource group**, and a **Name**. Then click **Review + Create**.

Once the workspace is deployed, click **Go to resource** to access it.

</Step>
<Step title="Create a Custom Log Table">
Within your Log Analytics Workspace, navigate to **Tables** and click **Create**. Select **New custom log (DCR-based)** from the dropdown.

Configure the Custom Log Table: provide a **Table name** (e.g., `InfisicalLogs`), select the **Data collection endpoint** created in Step 1, and create a new **Data collection rule**. Then, click **Next**.

On the **Schema and transformation** page, you'll be prompted to upload a **Log Sample**. Create a `.json` file with the following content and upload it:
```json
{
  "id": "00000000-0000-0000-0000-000000000000",
  "actor": "user",
  "actorMetadata": {
    "email": "[email protected]",
    "userId": "00000000-0000-0000-0000-000000000000",
    "username": "[email protected]"
  },
  "ipAddress": "0.0.0.0",
  "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36",
  "userAgentType": "web",
  "eventType": "get-secrets",
  "eventMetadata": {},
  "orgId": "00000000-0000-0000-0000-000000000000",
  "projectId": "00000000-0000-0000-0000-000000000000",
  "TimeGenerated": "2025-01-01T00:00:00.000Z"
}
```
Optionally, you can add **Transformations** to extract additional fields from the data. For example, to extract the actor's email and user ID:
```kusto
source
| extend
    ActorEmail = tostring(actorMetadata.email),
    ActorUserId = tostring(actorMetadata.userId)
```
On the final step, click **Create**.
<Warning>
It may take a few minutes for your Custom Log Table to be created and appear under Tables.
</Warning>
</Step>
<Step title="Obtain Data Collection Rule Immutable ID">
After creating your Data Collection Rule, you'll need its **Immutable ID**.
Navigate to [Data collection rules](https://portal.azure.com/#view/HubsExtension/BrowseResource.ReactView/resourceType/microsoft.insights%2Fdatacollectionrules). Click on your newly created DCR and copy its **Immutable ID** for the next step.

</Step>
<Step title="Create Audit Log Stream on Infisical">
In Infisical, create a new audit log stream and select the **Azure** provider. Input the following details:
- **Tenant ID**: Your Tenant ID
- **Client ID**: The Client ID of an App Registration
- **Client Secret**: The Client Secret of an App Registration
- **Data Collection Endpoint URL**: Obtained from Step 1
- **Data Collection Rule Immutable ID**: Obtained from Step 4
- **Custom Log Table Name**: Defined in Step 3

<Warning>
The App Registration used for authentication must have the **Monitoring Metrics Publisher** role assigned on the **Data Collection Rule** created in Step 3. [See Microsoft Guide](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal#assign-permissions-to-the-dcr).
</Warning>
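As a sketch of what happens under the hood, the values above combine into Azure's Logs Ingestion API URL. This is an illustrative Python snippet, not Infisical's actual implementation; the DCE URL, immutable ID, and table name below are placeholders, and the `Custom-`/`_CL` naming follows Azure's convention for DCR-based custom tables.

```python
# Illustrative sketch: how the Logs Ingestion API endpoint is composed from
# the Data Collection Endpoint URL, the DCR Immutable ID, and the table name.
# The "Custom-" prefix and "_CL" suffix follow Azure's DCR-based custom table
# naming convention.

def build_ingestion_url(dce_url: str, dcr_immutable_id: str, table_name: str) -> str:
    """Compose the Logs Ingestion API URL for a DCR-based custom table."""
    stream = f"Custom-{table_name}_CL"
    return (
        f"{dce_url.rstrip('/')}/dataCollectionRules/{dcr_immutable_id}"
        f"/streams/{stream}?api-version=2023-01-01"
    )

# Placeholder values for illustration only
url = build_ingestion_url(
    "https://my-dce-abcd.eastus-1.ingest.monitor.azure.com",
    "dcr-00000000000000000000000000000000",
    "InfisicalLogs",
)
print(url)
```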
</Step>
</Steps>
</Accordion>
<Accordion title="Better Stack">
You can stream to Better Stack using a **Custom** log stream.
<Steps>
<Step title="Connect Source">
On Better Stack, select **Connect Source** and click **Create source** after providing a name.

Once your source is created, take note of the **endpoint** and **Source token** for the next step.

</Step>
<Step title="Create Audit Log Stream on Infisical">
On Infisical, create a new audit log stream and select the **Custom** option.

1. Fill in the endpoint URL with your Better Stack source endpoint
2. Create a new header with the key `Authorization` and the value `Bearer <betterstack-src-token>`
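As a rough sketch of the request this stream produces, the snippet below builds (but does not send) a log delivery request with the expected `Authorization` header. The endpoint and token are placeholders, and this is not Infisical's actual implementation.

```python
import json
import urllib.request

# Placeholders: substitute your Better Stack source endpoint and Source token.
ENDPOINT = "https://<your-betterstack-endpoint>"
TOKEN = "<betterstack-src-token>"

# Build the request the custom log stream would make (abbreviated sample event).
event = {"eventType": "get-secrets", "actor": "user"}
req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(event).encode(),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_header("Authorization"))
```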

Once you're finished, click **Create Log Stream**.
</Step>
</Steps>
</Accordion>
<Accordion title="Cribl">
Stream Infisical audit logs to Cribl Stream for centralized processing and routing. Infisical supports Cribl as a provider for seamless integration.
<Steps>
<Step title="Create Infisical Data Source">
In Cribl Stream, navigate to **Worker Groups** and select your Worker Group. Take note of the **Ingress Address** for later steps.

Within your Worker Group, navigate to **Data > Sources > HTTP** and click **Add Source**.

Configure the **Input ID**, **Port**, and **Cribl HTTP event API** path (e.g., `/infisical`). Then, generate an **Auth Token**.
You can optionally configure TLS in the **TLS Settings** tab and add a pipeline in the **Pre-Processing** tab.
<Warning>
Ensure that you're using a port that's open on your instance.
</Warning>

Once you've configured the Data Source, click **Save** and deploy your changes.
</Step>
<Step title="Create Audit Log Stream on Infisical">
On Infisical, create a new audit log stream and select the **Cribl** provider option.
Input the following credentials:
- **Cribl Stream URL**: Your HTTP source endpoint, in the format `http://<ingress-address>:<port>/<http-event-api-path>/_bulk`
- **Cribl Stream Token**: The authentication token from Step 1
<Info>
If you configured TLS for your Data Source, use the `https://` protocol.
</Info>
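For clarity, the Cribl Stream URL is assembled from the values configured in Step 1. A minimal sketch with placeholder values:

```python
# Placeholder values; substitute the values from your own Worker Group.
ingress_address = "cribl.example.com"  # the Worker Group's ingress address
port = 10080                           # the port configured on the HTTP source
event_api_path = "/infisical"          # the Cribl HTTP event API path

# Compose the bulk ingestion endpoint used as the Cribl Stream URL.
cribl_stream_url = f"http://{ingress_address}:{port}{event_api_path}/_bulk"
print(cribl_stream_url)
```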

Once you're finished, click **Create Log Stream**.
</Step>
</Steps>
</Accordion>
<Accordion title="Datadog">
You can stream to Datadog using the **Datadog** provider log stream.
<Steps>
<Step title="Navigate to API Keys section">
In Datadog, navigate to **Organization Settings** > **API Keys**.
</Step>
<Step title="Select New Key and provide a key name">
Click **New Key**, enter a key name, and click **Create Key**. Copy the generated key value for the next step.
</Step>
<Step title="Create Audit Log Stream on Infisical">
On Infisical, create a new audit log stream and select the **Datadog** provider option.
Input your **Datadog Region** and the **Token** obtained from Step 2.
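For reference, each Datadog region corresponds to a documented site domain, which determines the logs intake endpoint. A hedged sketch of that mapping (based on Datadog's public site list, not Infisical internals):

```python
# Datadog region -> site domain, per Datadog's documented site URLs.
DD_SITES = {
    "US1": "datadoghq.com",
    "US3": "us3.datadoghq.com",
    "US5": "us5.datadoghq.com",
    "EU1": "datadoghq.eu",
    "AP1": "ap1.datadoghq.com",
}

def logs_intake_url(region: str) -> str:
    """Return the HTTP logs intake endpoint for a Datadog region."""
    return f"https://http-intake.logs.{DD_SITES[region]}/api/v2/logs"

print(logs_intake_url("EU1"))
```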

Once you're finished, click **Create Log Stream**.
</Step>
</Steps>
</Accordion>
<Accordion title="Splunk">
You can stream to Splunk using the **Splunk** provider log stream.
<Steps>
<Step title="Obtain Splunk Token">
Navigate to **Settings** > **Data Inputs**.

Click on **HTTP Event Collector**.

Click on **New Token** in the top right.

Provide a name and click **Next**.

On the next page, click **Review** and then **Submit** at the top. The final page will display your token.
Copy the **Token Value**, and note your Splunk hostname from the URL; you'll need both in the next step.
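As a sketch, the hostname and token plug into Splunk's HTTP Event Collector endpoint as shown below. The hostname is a placeholder, `8088` is HEC's default port (yours may differ), and the request is built but never sent.

```python
import json
import urllib.request

# Placeholders: substitute your Splunk hostname and HEC Token Value.
hostname = "<your-splunk-hostname>"
token = "<splunk-hec-token>"

# Build (but don't send) an HTTP Event Collector request; HEC expects a
# "Splunk <token>" Authorization header and an {"event": ...} payload.
req = urllib.request.Request(
    f"https://{hostname}:8088/services/collector/event",
    data=json.dumps({"event": {"eventType": "get-secrets"}}).encode(),
    headers={"Authorization": f"Splunk {token}"},
    method="POST",
)
print(req.full_url)
```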

</Step>
<Step title="Create Audit Log Stream on Infisical">
On Infisical, create a new audit log stream and select the **Splunk** provider option.
Input your **Splunk Hostname** and the **Token** obtained from Step 1.

Once you're finished, click **Create Log Stream**.
</Step>
</Steps>
</Accordion>
```json
{
  "id": "7dc1713b-d787-4147-9e21-770be01cc992",
  "actor": "user",
  "actorMetadata": {
    "email": "[email protected]",
    "userId": "7383b701-d83f-45c0-acb4-04e138b987ab",
    "username": "[email protected]"
  },
  "ipAddress": "127.0.0.1",
  "eventType": "create-secret",
  "eventMetadata": {
    "secretId": "3e5c796e-6599-4181-8dca-51133bb3acd0",
    "secretKey": "TEST-SECRET",
    "secretPath": "/",
    "environment": "dev",
    "secretVersion": 1
  },
  "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36",
  "userAgentType": "web",
  "expiresAt": "2025-01-18T01:11:25.552Z",
  "createdAt": "2025-01-15T01:11:25.552Z",
  "updatedAt": "2025-01-15T01:11:25.552Z",
  "orgId": "785649f1-ff4b-4ef9-a40a-9b9878e46e57",
  "projectId": "09bfcc01-0917-4bea-9c7a-2d320584d5b1"
}
```
<AccordionGroup>
<Accordion title="User Metadata">
This metadata is present when the `actor` field is set to `user`.
<ParamField path="userId" type="string" required>
The unique identifier for the actor.
</ParamField>
<ParamField path="email" type="string" required>
The email address of the actor.
</ParamField>
<ParamField path="username" type="string" required>
The username of the actor.
</ParamField>
</Accordion>
<Accordion title="Identity Metadata">
This metadata is present when the `actor` field is set to `identity`.
<ParamField path="identityId" type="string" required>
The unique identifier for the identity.
</ParamField>
<ParamField path="name" type="string" required>
The name of the identity.
</ParamField>
</Accordion>
<Accordion title="Service Token Metadata">
This metadata is present when the `actor` field is set to `service`.
<ParamField path="serviceId" type="string" required>
The unique identifier for the service.
</ParamField>
<ParamField path="name" type="string" required>
The name of the service.
</ParamField>
</Accordion>
</AccordionGroup>
<Note>
If the `actor` field is set to `platform`, `scimClient`, or `unknownUser`, the `actorMetadata` field will be an empty object.
</Note>
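When consuming streamed events, you can dispatch on the `actor` field to pick the matching metadata shape described above. A minimal sketch (the `describe_actor` helper is hypothetical, and the sample event is abbreviated):

```python
import json

# Abbreviated sample event, following the structure documented above.
event = json.loads("""
{
  "actor": "user",
  "actorMetadata": {
    "email": "[email protected]",
    "userId": "7383b701-d83f-45c0-acb4-04e138b987ab",
    "username": "[email protected]"
  }
}
""")

def describe_actor(event: dict) -> str:
    """Summarize the actor using the metadata shape for its actor type."""
    meta = event.get("actorMetadata") or {}
    actor = event["actor"]
    if actor == "user":
        return f"user {meta['username']} ({meta['userId']})"
    if actor == "identity":
        return f"identity {meta['name']} ({meta['identityId']})"
    if actor == "service":
        return f"service token {meta['name']} ({meta['serviceId']})"
    # platform, scimClient, unknownUser: actorMetadata is an empty object
    return actor

print(describe_actor(event))
```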
`get-secrets`, `delete-secrets`, `get-secret`, `create-secret`, `update-secret`, `delete-secret`, `get-workspace-key`, `authorize-integration`, `update-integration-auth`, `unauthorize-integration`, `create-integration`, `delete-integration`, `add-trusted-ip`, `update-trusted-ip`, `delete-trusted-ip`, `create-service-token`, `delete-service-token`, `create-identity`, `update-identity`, `delete-identity`, `login-identity-universal-auth`, `add-identity-universal-auth`, `update-identity-universal-auth`, `get-identity-universal-auth`, `create-identity-universal-auth-client-secret`, `revoke-identity-universal-auth-client-secret`, `get-identity-universal-auth-client-secret`, `create-environment`, `update-environment`, `delete-environment`, `add-workspace-member`, `remove-workspace-member`, `create-folder`, `update-folder`, `delete-folder`, `create-webhook`, `update-webhook-status`, `delete-webhook`, `webhook-triggered`, `get-secret-imports`, `create-secret-import`, `update-secret-import`, `delete-secret-import`, `update-user-workspace-role`, `update-user-workspace-denied-permissions`, `create-certificate-authority`, `get-certificate-authority`, `update-certificate-authority`, `delete-certificate-authority`, `get-certificate-authority-csr`, `get-certificate-authority-cert`, `sign-intermediate`, `import-certificate-authority-cert`, `get-certificate-authority-crl`, `issue-cert`, `get-cert`, `delete-cert`, `revoke-cert`, `get-cert-body`, `create-pki-alert`, `get-pki-alert`, `update-pki-alert`, `delete-pki-alert`, `create-pki-collection`, `get-pki-collection`, `update-pki-collection`, `delete-pki-collection`, `get-pki-collection-items`, `add-pki-collection-item`, `delete-pki-collection-item`, `org-admin-accessed-project`, `create-certificate-template`, `update-certificate-template`, `delete-certificate-template`, `get-certificate-template`, `create-certificate-template-est-config`, `update-certificate-template-est-config`, `get-certificate-template-est-config`, 
`update-project-slack-config`, `get-project-slack-config`, `integration-synced`, `create-shared-secret`, `delete-shared-secret`, `read-shared-secret`.
The `projectId` field will only be present if the event occurred at the project level, not the organization level.