
# Audit Log Streams

<Info> Audit log streams are a paid feature.
If you're using Infisical Cloud, then it is available under the **Enterprise Tier**. If you're self-hosting Infisical, then you should contact [email protected] to purchase an enterprise license to use it.
</Info>

Infisical Audit Log Streaming enables you to transmit your organization's audit logs to external logging providers for monitoring and analysis.

## Disabling PostgreSQL Audit Log Storage

If you're using audit log streams as your primary log destination and don't need audit logs stored in PostgreSQL, you can disable PostgreSQL audit log storage by setting the following environment variable:

```bash
DISABLE_POSTGRES_AUDIT_LOG_STORAGE=true
```

This prevents audit logs from being written to PostgreSQL while still streaming them to all configured log stream destinations. If you're also using ClickHouse, logs will continue to be inserted into ClickHouse as well.

<Info> See the [environment variables reference](/self-hosting/configuration/envars) for all available audit log configuration options. </Info>
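For example, in a shell-based self-hosted deployment, stream-only mode can be enabled by exporting the variable before starting the server (how you inject environment variables depends on your deployment method):

```shell
# Enable stream-only mode: audit logs go to all configured log stream
# destinations (and ClickHouse, if enabled) but are not written to PostgreSQL.
export DISABLE_POSTGRES_AUDIT_LOG_STORAGE=true
```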

## Overview

<Steps>
	<Step title="Create Stream">
	    1. Navigate to **Organization Settings**
	    2. Select the **Audit Log Streams** tab
	    3. Click **Add Log Stream**
    ![stream create](/images/platform/audit-log-streams/stream-create.png)
</Step>
	<Step title="Select Provider">
	    If your log provider is included in this list, select it. Otherwise click on **Custom** to input your own Endpoint URL and headers.

	    ![select provider](/images/platform/audit-log-streams/select-provider.png)
	</Step>
	<Step title="Input Credentials">
	    Depending on your chosen provider, you'll be asked to input different credentials.

			For **Custom**, you need to input an endpoint URL and headers.

	    ![custom provider](/images/platform/audit-log-streams/custom-provider.png)

			Once you're finished, click **Create Log Stream**.
	</Step>
	<Step title="Log Stream Created">
	    Your audit logs are now ready to be streamed.

    ![stream list](/images/platform/audit-log-streams/stream-list.png)
	</Step>
</Steps>

## Example Providers

<AccordionGroup>
<Accordion title="Azure">
    Infisical offers a dedicated **Azure** provider to stream your audit logs, enabling seamless integration with services like Microsoft Sentinel.
    <Warning>
        After setting up all Azure resources, it may take 10-20 minutes for logs to begin streaming.
    </Warning>

    <Steps>
        <Step title="Create a Data Collection Endpoint">
            Navigate to [Data Collection Endpoints](https://portal.azure.com/#view/HubsExtension/BrowseResource.ReactView/resourceType/microsoft.insights%2Fdatacollectionendpoints) and click **Create**.

            ![azure create dce](/images/platform/audit-log-streams/azure-create-dce.png)

            Configure your Data Collection Endpoint by providing an **Endpoint Name**, **Subscription**, and a **Resource group**. Then click **Review + Create**.

            ![azure configure dce](/images/platform/audit-log-streams/azure-configure-dce.png)

            After creation, it may take a few minutes for the Data Collection Endpoint to appear. Once visible, click on it and copy the **Logs Ingestion** URL. You will need this URL in later steps.

            ![azure dce url](/images/platform/audit-log-streams/azure-dce-url.png)
        </Step>
        <Step title="Create a Log Analytics Workspace">
            <Info>
                If you already have a Log Analytics Workspace, you may skip this step.
            </Info>

            Navigate to [Log Analytics Workspaces](https://portal.azure.com/#browse/Microsoft.OperationalInsights%2Fworkspaces) and click **Create**.

            ![azure create law](/images/platform/audit-log-streams/azure-create-law.png)

            Configure your Log Analytics Workspace by providing a **Subscription**, **Resource group**, and a **Name**. Then click **Review + Create**.

            ![azure configure law](/images/platform/audit-log-streams/azure-configure-law.png)

            Once the workspace is deployed, click **Go to resource** to access it.

            ![azure go to resource](/images/platform/audit-log-streams/azure-go-to-resource.png)
        </Step>
        <Step title="Create a Custom Log Table">
            Within your Log Analytics Workspace, navigate to **Tables** and click **Create**. Select **New custom log (DCR-based)** from the dropdown.

            ![azure new table](/images/platform/audit-log-streams/azure-new-table.png)

            Configure the Custom Log Table: Provide a **Table name** (e.g., `InfisicalLogs`), select the **Data collection endpoint** created in Step 1, and create a new **Data collection rule** as illustrated in the image below. Then, click **Next**.

            ![azure configure table](/images/platform/audit-log-streams/azure-configure-table.png)

            On the **Schema and transformation** page, you'll be prompted to upload a **Log Sample**. Create a `.json` file with the following content and upload it:

            ```json
            {
                "id": "00000000-0000-0000-0000-000000000000",
                "actor": "user",
                "actorMetadata": {
                    "email": "[email protected]",
                    "userId": "00000000-0000-0000-0000-000000000000",
                    "username": "[email protected]"
                },
                "ipAddress": "0.0.0.0",
                "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36",
                "userAgentType": "web",
                "eventType": "get-secrets",
                "eventMetadata": {},
                "orgId": "00000000-0000-0000-0000-000000000000",
                "projectId": "00000000-0000-0000-0000-000000000000",
                "TimeGenerated": "2025-01-01T00:00:00.000Z"
            }
            ```

            Optionally, you can add **Transformations** to further destructure the data. For example, to extract actor email and userId:

            ```kusto
            source
            | extend
                ActorEmail = tostring(actorMetadata.email),
                ActorUserId = tostring(actorMetadata.userId)
            ```

            On the final step, click **Create**.

            <Warning>
                It may take a few minutes for your Custom Log Table to be created and appear under Tables.
            </Warning>
        </Step>
        <Step title="Obtain Data Collection Rule Immutable ID">
            After creating your Data Collection Rule, you'll need its **Immutable ID**.

            Navigate to [Data collection rules](https://portal.azure.com/#view/HubsExtension/BrowseResource.ReactView/resourceType/microsoft.insights%2Fdatacollectionrules). Click on your newly created DCR and copy its **Immutable ID** for the next step.

            ![azure dcr](/images/platform/audit-log-streams/azure-dcr.png)
        </Step>
        <Step title="Create Audit Log Stream on Infisical">
            In Infisical, create a new audit log stream and select the **Azure** provider. Input the following details:

            - **Tenant ID**: Your Tenant ID
            - **Client ID**: The Client ID of an App Registration
            - **Client Secret**: The Client Secret of an App Registration
            - **Data Collection Endpoint URL**: Obtained from Step 1
            - **Data Collection Rule Immutable ID**: Obtained from Step 4
            - **Custom Log Table Name**: Defined in Step 3

            ![azure create als](/images/platform/audit-log-streams/azure-create-als.png)

            <Warning>
                The App Registration used for authentication must have the **Monitoring Metrics Publisher** role assigned on the **Data Collection Rule** created in Step 3. [See Microsoft Guide](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal#assign-permissions-to-the-dcr).
            </Warning>
        </Step>
    </Steps>
</Accordion>
<Accordion title="Better Stack">
    You can stream to Better Stack using a **Custom** log stream.

    <Steps>
        <Step title="Connect Source">
            On Better Stack, select **Connect Source**, provide a name for your source, and click **Create source**.

            ![better stack connect source](/images/platform/audit-log-streams/betterstack-create-source.png)

            Once your source is created, take note of the **endpoint** and **Source token** for the next step.

    				![better stack connect](/images/platform/audit-log-streams/betterstack-source-details.png)
        </Step>
        <Step title="Create Audit Log Stream on Infisical">
    				On Infisical, create a new audit log stream and select the **Custom** option.

    				![select custom](/images/platform/audit-log-streams/select-custom.png)

            1. Fill in the endpoint URL with your Better Stack source endpoint
            2. Create a new header with key `Authorization` and set the value as `Bearer <betterstack-src-token>`

            ![custom provider](/images/platform/audit-log-streams/custom-provider.png)

            Once you're finished, click **Create Log Stream**.
  		  </Step>
    </Steps>
</Accordion>
<Accordion title="Cribl">
    Stream Infisical audit logs to Cribl Stream for centralized processing and routing. Infisical supports Cribl as a provider for seamless integration.

    <Steps>
        <Step title="Create Infisical Data Source">
            In Cribl Stream, navigate to **Worker Groups** and select your Worker Group. Take note of the **Ingress Address** for later steps.

    				![cribl ingress address](/images/platform/audit-log-streams/cribl-ingress-address.png)

            Within your Worker Group, navigate to **Data > Sources > HTTP** and click **Add Source**.

    				![cribl add source](/images/platform/audit-log-streams/cribl-add-source.png)

            Configure the **Input ID**, **Port**, and **Cribl HTTP event API** path (e.g., `/infisical`). Then, generate an **Auth Token**.

            You can optionally configure TLS in the **TLS Settings** tab and add a pipeline in the **Pre-Processing** tab.

            <Warning>
                Ensure that you're using a port that's open on your instance.
            </Warning>

    				![cribl general settings](/images/platform/audit-log-streams/cribl-general-settings.png)

            Once you've configured the Data Source, click **Save** and deploy your changes.
    		</Step>
        <Step title="Create Audit Log Stream on Infisical">
            On Infisical, create a new audit log stream and select the **Cribl** provider option.

            Input the following credentials:
            - **Cribl Stream URL**: Your HTTP source endpoint, in the form `http://<ingress-address>:<port>/<http-event-api-path>/_bulk`
            - **Cribl Stream Token**: The authentication token from Step 1

            <Info>
                If you configured TLS for your Data Source, use the `https://` protocol.
            </Info>
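            As a concrete sketch, the stream URL is assembled from the ingress address, port, and event API path configured in Step 1 (the values below are hypothetical placeholders):

            ```shell
            # Hypothetical values; substitute the ones from your Worker Group and source.
            INGRESS_ADDRESS="main.example.cribl.cloud"
            PORT="10080"
            EVENT_API_PATH="/infisical"   # Cribl HTTP event API path from Step 1

            # Note the trailing /_bulk segment; use https:// if you enabled TLS.
            CRIBL_STREAM_URL="http://${INGRESS_ADDRESS}:${PORT}${EVENT_API_PATH}/_bulk"
            echo "${CRIBL_STREAM_URL}"
            ```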

    				![cribl details](/images/platform/audit-log-streams/cribl-details.png)

            Once you're finished, click **Create Log Stream**.
    		</Step>
    </Steps>
</Accordion>
<Accordion title="Datadog">
    You can stream to Datadog using the **Datadog** provider log stream.

    <Steps>
        <Step title="Navigate to API Keys section">
    				![api key create](/images/platform/audit-log-streams/datadog-api-sidebar.png)
    		</Step>
        <Step title="Select New Key and provide a key name">
    				![api key form](/images/platform/audit-log-streams/data-create-api-key.png)
    				![api key form](/images/platform/audit-log-streams/data-dog-api-key.png)
    		</Step>
        <Step title="Create Audit Log Stream on Infisical">
            On Infisical, create a new audit log stream and select the **Datadog** provider option.

            Input your **Datadog Region** and the **Token** obtained from Step 2.

    				![datadog details](/images/platform/audit-log-streams/datadog-details.png)

            Once you're finished, click **Create Log Stream**.
    		</Step>
    </Steps>
</Accordion>
<Accordion title="Splunk">
    You can stream to Splunk using the **Splunk** provider log stream.

    <Steps>
        <Step title="Obtain Splunk Token">
            Navigate to **Settings** > **Data Inputs**.

    				![splunk data inputs](/images/platform/audit-log-streams/splunk-data-inputs.png)

            Click on **HTTP Event Collector**.

    				![splunk http collector](/images/platform/audit-log-streams/splunk-http-collector.png)

            Click on **New Token** in the top right.

    				![splunk new token](/images/platform/audit-log-streams/splunk-new-token.png)

            Provide a name and click **Next**.

    				![splunk name](/images/platform/audit-log-streams/splunk-name.png)

            On the next page, click **Review** and then **Submit** at the top. On the final page you'll see your token.

            Copy the **Token Value** and your Splunk hostname from the URL to be used for later.

    				![splunk credentials](/images/platform/audit-log-streams/splunk-credentials.png)
    		</Step>
        <Step title="Create Audit Log Stream on Infisical">
            On Infisical, create a new audit log stream and select the **Splunk** provider option.

            Input your **Splunk Hostname** and the **Token** obtained from Step 1.

    				![splunk details](/images/platform/audit-log-streams/splunk-details.png)

            Once you're finished, click **Create Log Stream**.
        </Step>
    </Steps>
</Accordion>
</AccordionGroup>

## Example Log Entry

```json created-secret.json
{
  "id": "7dc1713b-d787-4147-9e21-770be01cc992",
  "actor": "user",
  "actorMetadata": {
    "email": "[email protected]",
    "userId": "7383b701-d83f-45c0-acb4-04e138b987ab",
    "username": "[email protected]"
  },
  "ipAddress": "127.0.0.1",
  "eventType": "create-secret",
  "eventMetadata": {
    "secretId": "3e5c796e-6599-4181-8dca-51133bb3acd0",
    "secretKey": "TEST-SECRET",
    "secretPath": "/",
    "environment": "dev",
    "secretVersion": 1
  },
  "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36",
  "userAgentType": "web",
  "expiresAt": "2025-01-18T01:11:25.552Z",
  "createdAt": "2025-01-15T01:11:25.552Z",
  "updatedAt": "2025-01-15T01:11:25.552Z",
  "orgId": "785649f1-ff4b-4ef9-a40a-9b9878e46e57",
  "projectId": "09bfcc01-0917-4bea-9c7a-2d320584d5b1"
}
```
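If you point a **Custom** stream at a collector that stores entries as newline-delimited JSON, plain shell tools are enough for quick spot checks. The file name and entries below are illustrative, not output from a real deployment:

```shell
# Two illustrative entries, one JSON object per line (NDJSON).
printf '%s\n' \
  '{"eventType":"create-secret","actor":"user"}' \
  '{"eventType":"get-secrets","actor":"identity"}' > audit-logs.ndjson

# Count entries for a given event type.
grep -c '"eventType":"create-secret"' audit-logs.ndjson
```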

## Audit Logs Structure

<Warning>
    Streamed audit log structure **varies based on provider**, but they all share the audit log fields shown below.
</Warning>

<ParamField path="id" type="string" required>
    The unique identifier for the log entry.
</ParamField>
<ParamField path="actor" type="platform | user | service | identity | scimClient | unknownUser" required>
    The entity responsible for performing or causing the event; this can be a user or service.
</ParamField>
<ParamField path="actorMetadata" type="object" required>
    The metadata associated with the actor. This varies based on the actor type.
<AccordionGroup>
    <Accordion title="User Metadata">
        This metadata is present when the `actor` field is set to `user`.

        <ParamField path="userId" type="string" required>
            The unique identifier for the actor.
        </ParamField>
        <ParamField path="email" type="string" required>
            The email address of the actor.
        </ParamField>
        <ParamField path="username" type="string" required>
            The username of the actor.
        </ParamField>
    </Accordion>
    <Accordion title="Identity Metadata">
        This metadata is present when the `actor` field is set to `identity`.

        <ParamField path="identityId" type="string" required>
            The unique identifier for the identity.
        </ParamField>
        <ParamField path="name" type="string" required>
            The name of the identity.
        </ParamField>
    </Accordion>
    <Accordion title="Service Token Metadata">
        This metadata is present when the `actor` field is set to `service`.

        <ParamField path="serviceId" type="string" required>
            The unique identifier for the service.
        </ParamField>
        <ParamField path="name" type="string" required>
            The name of the service.
        </ParamField>
    </Accordion>
</AccordionGroup>

<Note>
    If the `actor` field is set to `platform`, `scimClient`, or `unknownUser`, the `actorMetadata` field will be an empty object.
</Note>
</ParamField>
<ParamField path="ipAddress" type="string" required>
    The IP address of the actor.
</ParamField>
<ParamField path="eventType" type="string" required>
    The type of event that occurred. Below you can see a list of possible event types. More event types will be added in the future as we expand our audit logs further.
`get-secrets`, `delete-secrets`, `get-secret`, `create-secret`, `update-secret`, `delete-secret`, `get-workspace-key`, `authorize-integration`, `update-integration-auth`, `unauthorize-integration`, `create-integration`, `delete-integration`, `add-trusted-ip`, `update-trusted-ip`, `delete-trusted-ip`, `create-service-token`, `delete-service-token`, `create-identity`, `update-identity`, `delete-identity`, `login-identity-universal-auth`, `add-identity-universal-auth`, `update-identity-universal-auth`, `get-identity-universal-auth`, `create-identity-universal-auth-client-secret`, `revoke-identity-universal-auth-client-secret`, `get-identity-universal-auth-client-secret`, `create-environment`, `update-environment`, `delete-environment`, `add-workspace-member`, `remove-workspace-member`, `create-folder`, `update-folder`, `delete-folder`, `create-webhook`, `update-webhook-status`, `delete-webhook`, `webhook-triggered`, `get-secret-imports`, `create-secret-import`, `update-secret-import`, `delete-secret-import`, `update-user-workspace-role`, `update-user-workspace-denied-permissions`, `create-certificate-authority`, `get-certificate-authority`, `update-certificate-authority`, `delete-certificate-authority`, `get-certificate-authority-csr`, `get-certificate-authority-cert`, `sign-intermediate`, `import-certificate-authority-cert`, `get-certificate-authority-crl`, `issue-cert`, `get-cert`, `delete-cert`, `revoke-cert`, `get-cert-body`, `create-pki-alert`, `get-pki-alert`, `update-pki-alert`, `delete-pki-alert`, `create-pki-collection`, `get-pki-collection`, `update-pki-collection`, `delete-pki-collection`, `get-pki-collection-items`, `add-pki-collection-item`, `delete-pki-collection-item`, `org-admin-accessed-project`, `create-certificate-template`, `update-certificate-template`, `delete-certificate-template`, `get-certificate-template`, `create-certificate-template-est-config`, `update-certificate-template-est-config`, `get-certificate-template-est-config`, 
`update-project-slack-config`, `get-project-slack-config`, `integration-synced`, `create-shared-secret`, `delete-shared-secret`, `read-shared-secret`.
</ParamField>
<ParamField path="eventMetadata" type="object" required>
    The metadata associated with the event. This varies based on the event type.
</ParamField>
<ParamField path="userAgent" type="string">
    The user agent of the actor, if applicable.
</ParamField>
<ParamField path="userAgentType" type="web | cli | k8-operator | terraform | other | InfisicalPythonSDK | InfisicalNodeSDK">
    The type of user agent.
</ParamField>
<ParamField path="expiresAt" type="string" required>
    The expiration date of the log entry. When this date is reached, the log entry will be deleted from Infisical.
</ParamField>
<ParamField path="createdAt" type="string" required>
    The creation date of the log entry.
</ParamField>
<ParamField path="updatedAt" type="string" required>
    The last update date of the log entry. This is unlikely to be out of sync with the `createdAt` field, as we do not update log entries after they've been created.
</ParamField>
<ParamField path="orgId" type="string" required>
    The unique identifier for the organization where the event occurred.
</ParamField>
<ParamField path="projectId" type="string">
    The unique identifier for the project where the event occurred.

    The `projectId` field will only be present if the event occurred at the project level, not the organization level.
</ParamField>