---
title: 'Log streaming: Microsoft Azure Blob Storage'
summary: null
url: >-
  https://www.fastly.com/documentation/guides/integrations/logging-endpoints/object-and-cloud-storage/log-streaming-azure-blob-storage
---

Fastly's [Real-Time Log Streaming](https://www.fastly.com/documentation/guides/integrations/streaming-logs/about-fastlys-realtime-log-streaming-features) feature can send log files to [Microsoft Azure Blob Storage](https://azure.microsoft.com/en-us/services/storage/blobs/) (Blob Storage). Blob Storage is a static file storage service used to store arbitrarily large amounts of unstructured data and serve it to users over HTTP and HTTPS.

> **NOTE:** 
>
> Fastly does not provide direct support for third-party services. Read [Fastly's Terms of Service](https://www.fastly.com/terms) for more information.

## Prerequisites

Before adding Blob Storage as a logging endpoint for Fastly services, create an Azure storage account in the [Azure portal](https://portal.azure.com/#create/Microsoft.StorageAccount-ARM). For help creating the account, check out Microsoft's [account creation](https://learn.microsoft.com/en-us/azure/storage/common/storage-account-create?tabs=azure-portal) documentation.

We recommend creating a Shared Access Signature (SAS) user specifically for Fastly. For more information, check out Microsoft's [shared access signatures (SAS)](https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview) documentation. Fastly supports the following types of shared access signatures:

- [User delegation SAS](https://learn.microsoft.com/en-us/rest/api/storageservices/create-user-delegation-sas)
- [Service SAS](https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas)
- [Account SAS](https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas)

Here is an example of a SAS token that provides write permissions to a blob:

`sv=2018-04-05&ss=b&st=2018-04-29T22%3A18%3A26Z&sr=b&se=2020-04-30T02%3A23%3A26Z&sp=w&sig=Z%2FRHIX5Xcg0Mq2rqI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D`

Refer to the Microsoft documentation above for details on the SAS token elements to use when configuring your SAS token.
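Because a SAS token is just a URL query string, you can inspect its parameters with any query-string parser. A quick sketch using the example token above (the field meanings come from Microsoft's SAS documentation linked earlier):

```python
from urllib.parse import parse_qs

# The example SAS token from above, in query-string form (no leading "?").
sas_token = (
    "sv=2018-04-05&ss=b&st=2018-04-29T22%3A18%3A26Z&sr=b"
    "&se=2020-04-30T02%3A23%3A26Z&sp=w"
    "&sig=Z%2FRHIX5Xcg0Mq2rqI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D"
)

# parse_qs decodes the percent-encoded values for us.
fields = {k: v[0] for k, v in parse_qs(sas_token).items()}

print(fields["sv"])  # storage service version: 2018-04-05
print(fields["sp"])  # permissions: "w" (write), which log delivery needs
print(fields["se"])  # expiry time: 2020-04-30T02:23:26Z
```

The `se` (expiry) field is the one to watch: once that time passes, Fastly can no longer write logs with this token.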

## Adding Blob Storage as a logging endpoint

After you've registered for an Azure account and created a SAS token, follow these instructions to add Blob Storage as a logging endpoint:

### CDN Services

1. Review the information in our guide to [setting up remote log streaming](/guides/integrations/streaming-logs/setting-up-remote-log-streaming).

2. In the Azure Blob Storage area, click **Create endpoint**.
3. Fill out the **Create a Microsoft Azure Blob Storage endpoint** fields as follows:

   - In the **Name** field, enter a human-readable name for the endpoint.
   - In the **Placement** area, select where the logging call should be placed in the generated VCL. Valid values are **Format Version Default** and **None**. Read our guide on [changing log placement](/guides/integrations/streaming-logs/changing-log-placement) for more information.
   - In the **Log format** field, enter a comma-separated values (CSV) string to use for log formatting. See [Ingesting data for Azure Data Explorer](https://www.fastly.com/documentation/guides/integrations/logging-endpoints/object-and-cloud-storage/log-streaming-azure-blob-storage#ingesting-data-for-azure-data-explorer) for more information.
   - In the **Storage account name** field, enter the unique Azure namespace in which your data objects will be stored.
   - In the **Container** field, enter the name of the Blob Storage container to store logs in. Check out Microsoft's [Blob storage page](https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction) for more information.
   - In the **SAS token** field, enter the token associated with the container.

   > **IMPORTANT:** Be sure to update your token before it expires; otherwise, log delivery will stop working.

   - *(Optional)* In the **Maximum bytes** field, enter the maximum file size in bytes.
   - *(Optional)* In the **Period** field, enter an interval (in seconds) to control how frequently your log files are rotated. Rotation finalizes the current file object and starts a new one; previously created file objects are never removed. This value defaults to `3600` seconds.
   - *(Optional)* In the **Timestamp format** field, enter an `strftime`-compatible timestamp format for log files. Our guide on [changing where log files are written](/guides/integrations/streaming-logs/changing-where-log-files-are-written) provides more information.
   - *(Optional)* From the **Processing region** menu, select a geographic region where logs are processed before being sent to the logging endpoint. Our guide on [regional log aggregation](/guides/integrations/streaming-logs/setting-up-regional-log-aggregation) provides more information.

4. Click **Advanced options** and fill out the fields as follows:
   - *(Optional)* In the **Path** field, enter the path within the bucket to store the files. The path must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path. Our guide on [changing where log files are written](/guides/integrations/streaming-logs/changing-where-log-files-are-written) provides more information.
   - *(Optional)* In the **PGP public key** field, enter a PGP public key that Fastly will use to encrypt your log files before writing them to disk. You will only be able to read the contents by decrypting them with your private key. The PGP key should be in [PEM (Privacy-Enhanced Mail) format](https://en.wikipedia.org/wiki/Privacy-enhanced_Electronic_Mail). Read our guide on [log encryption](/guides/integrations/streaming-logs/encrypting-logs) for more information.
   - In the **Select a log line format** area, select the log line format for your log messages. Our guide on [changing log line formats](/guides/integrations/streaming-logs/changing-log-line-formats) provides more information.
   - *(Optional)* In the **Compression** field, select the compression format you want applied to the log files. Our guide on [changing log compression options](/guides/integrations/streaming-logs/changing-log-compression-options) provides more information.

5. Click **Create** to create the new logging endpoint.

6. From the **Activate** menu, select **Activate on Production** to deploy your configuration changes.

> **NOTE:** Although Fastly continuously streams logs into Azure Blob Storage, the storage portal and API do not make files available for access until after their upload is complete.
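If you prefer to automate endpoint creation rather than use the web interface, the same configuration can be sent through the Fastly API. A minimal sketch, assuming placeholder credentials and a placeholder service ID; the field names follow the Azure Blob log streaming API reference listed under Related content, so verify them there before relying on this:

```python
import json
import urllib.request

# Placeholders: substitute your own API token, service ID, and version.
api_token = "FASTLY_API_TOKEN"
service_id = "SU1Z0isxPaozGVKXdv0eY"
version = 1

# Assumed field names per the Azure Blob log streaming API reference.
payload = {
    "name": "azure-blob-logs",
    "account_name": "mystorageaccount",
    "container": "fastly-logs",
    "sas_token": "sv=...",  # elided; supply your real SAS token
    "period": 3600,
}

# Build the request; send it with urllib.request.urlopen(req) when ready.
req = urllib.request.Request(
    url=f"https://api.fastly.com/service/{service_id}/version/{version}/logging/azureblob",
    data=json.dumps(payload).encode(),
    headers={"Fastly-Key": api_token, "Content-Type": "application/json"},
    method="POST",
)

print(req.get_method(), req.full_url)
```

Remember that configuration changes made through the API still need to be activated on a service version, just as in step 6 above.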

#### Ingesting data for Azure Data Explorer

[Azure Data Explorer](https://azure.microsoft.com/en-gb/services/data-explorer/) is a data exploration service for log and telemetry data. To ingest your data correctly, Data Explorer requires your logs to be formatted as comma-separated values (CSVs). When creating your logging endpoint:

- Set the **Log format** to a CSV string (`%H,%{time.start.sec}V,%{regsub(req.http.User-Agent, \{"""\}, \{""""\})}V`).
- Select **Blank** in the **Select a log line format** area under **Advanced options**.

Our guide on [changing log line formats](https://www.fastly.com/documentation/guides/integrations/streaming-logs/changing-log-line-formats) provides more information.
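The `regsub()` in the format above turns each double quote inside the User-Agent header into two double quotes, which is how CSV escapes an embedded quote. A quick sketch of the same transformation (the User-Agent value here is invented, and the field is quoted so a CSV reader can round-trip it):

```python
import csv
import io

# A User-Agent containing an embedded double quote.
user_agent = 'Mozilla/5.0 "TestBot"'

# Same substitution as the regsub in the log format: `"` becomes `""`.
escaped = user_agent.replace('"', '""')

# A log line shaped like the format string: protocol, start time, User-Agent.
line = f'HTTP/1.1,1713200000,"{escaped}"'

# Azure Data Explorer (or any CSV reader) recovers the original value.
row = next(csv.reader(io.StringIO(line)))
print(row[2])  # Mozilla/5.0 "TestBot"
```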

### Compute Services

1. Review the information in our guide to [setting up remote log streaming for Compute](/guides/integrations/streaming-logs/setting-up-remote-log-streaming-for-compute). Additionally, our developer documentation provides more [information about logging](/guides/integrations/non-fastly-services/developer-guide-logging/) with Compute code written in our [supported languages](/reference/compute/sdks/).

2. In the Azure Blob Storage area, click **Create endpoint**.
3. Fill out the **Create a Microsoft Azure Blob Storage endpoint** fields as follows:

   - In the **Name** field, enter the endpoint name you specified in your Compute code. For example, in our [Rust code example](/guides/compute/developer-guides/rust/#logging), the name is `my_endpoint_name`.
   - In the **Storage account name** field, enter the unique Azure namespace in which your data objects will be stored.
   - In the **Container** field, enter the name of the Blob Storage container to store logs in. See Microsoft's [Blob storage page](https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction) for more information.
   - In the **SAS token** field, enter the token associated with the container.

   > **IMPORTANT:** Be sure to update your token before it expires; otherwise, log delivery will stop working.

   - *(Optional)* In the **Maximum bytes** field, enter the maximum file size in bytes.
   - *(Optional)* In the **Period** field, enter an interval (in seconds) to control how frequently your log files are rotated. Rotation finalizes the current file object and starts a new one; previously created file objects are never removed. This value defaults to `3600` seconds.
   - *(Optional)* In the **Timestamp format** field, enter an `strftime`-compatible timestamp format for log files. Our guide on [changing where log files are written](/guides/integrations/streaming-logs/changing-where-log-files-are-written) provides more information.

4. Click **Advanced options** and fill out the fields as follows:
   - *(Optional)* In the **Path** field, enter the path within the bucket to store the files. The path must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path. Our guide on [changing where log files are written](/guides/integrations/streaming-logs/changing-where-log-files-are-written) provides more information.
   - *(Optional)* In the **PGP public key** field, enter a PGP public key that Fastly will use to encrypt your log files before writing them to disk. You will only be able to read the contents by decrypting them with your private key. The PGP key should be in [PEM (Privacy-Enhanced Mail) format](https://en.wikipedia.org/wiki/Privacy-enhanced_Electronic_Mail). Read our guide on [log encryption](/guides/integrations/streaming-logs/encrypting-logs) for more information.
   - In the **Select a log line format** area, select the log line format for your log messages. Our guide on [changing log line formats](/guides/integrations/streaming-logs/changing-log-line-formats) provides more information.
   - *(Optional)* In the **Compression** field, select the compression format you want applied to the log files. Our guide on [changing log compression options](/guides/integrations/streaming-logs/changing-log-compression-options) provides more information.

5. Click **Create** to create the new logging endpoint.

6. From the **Activate** menu, select **Activate on Production** to deploy your configuration changes.

> **NOTE:** Although Fastly continuously streams logs into Azure Blob Storage, the storage portal and API do not make files available for access until after their upload is complete.
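The **Timestamp format** field in step 3 accepts an `strftime`-compatible pattern. As a quick illustration of how such a pattern renders (the pattern and rotation time below are examples, not Fastly defaults):

```python
from datetime import datetime, timezone

# Hypothetical rotation time; Fastly substitutes the actual time of each
# file rotation when it writes the log file.
rotated_at = datetime(2024, 1, 15, 9, 30, 0, tzinfo=timezone.utc)

# An example strftime-compatible pattern for the Timestamp format field.
pattern = "%Y-%m-%dT%H:%M:%S"

print(rotated_at.strftime(pattern))  # 2024-01-15T09:30:00
```

Patterns with coarser granularity than your rotation period (for example, omitting seconds) risk producing colliding timestamps for consecutive files, so keep the pattern at least as precise as the **Period** value.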

### Recommended log format

Log messages can take any format you choose, as long as they can be read and processed from Azure Blob Storage.

#### Ingesting data for Azure Data Explorer

[Azure Data Explorer](https://azure.microsoft.com/en-gb/services/data-explorer/) is a data exploration service for log and telemetry data. To ingest your data correctly, Data Explorer requires your logs to be formatted as comma-separated values (CSVs). When creating your logging endpoint:

- Set the log format to a CSV string (`${time_start_sec},"${req_http_user_agent}","${log_message}"`). In this example, `${time_start_sec}`, `${req_http_user_agent}`, and `${log_message}` are placeholders for variables in your Compute service code.
- Select **Blank** in the **Select a log line format** area under **Advanced options**.

Our guide on [changing log line formats](https://www.fastly.com/documentation/guides/integrations/streaming-logs/changing-log-line-formats) provides more information.
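Putting the pieces together, here is a language-agnostic sketch (shown in Python for brevity; your actual Compute program would use one of the supported SDK languages). The variable names mirror the `${...}` placeholders above and their values are invented:

```python
# Stand-ins for the ${...} placeholders in the format above.
time_start_sec = 1713200000
req_http_user_agent = 'curl/8.5.0'
log_message = 'cache miss for "/index.html"'

def csv_quote(value: str) -> str:
    """Wrap a field in double quotes, doubling any embedded quotes."""
    return '"' + value.replace('"', '""') + '"'

# Assemble one CSV log line the way your Compute code would before
# sending it to the logging endpoint.
line = f"{time_start_sec},{csv_quote(req_http_user_agent)},{csv_quote(log_message)}"
print(line)  # 1713200000,"curl/8.5.0","cache miss for ""/index.html"""
```

Quoting every free-text field this way keeps the line parseable even when a value contains commas or quotes.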

## Related content

- [Using Azure Blob Storage as an origin](https://www.fastly.com/documentation/guides/integrations/non-fastly-services/microsoft-azure-blob-storage)
- [API reference: Azure Blob log streaming](https://www.fastly.com/documentation/reference/api/logging/azureblob/)
- [CLI reference: Azure Blob log streaming](https://www.fastly.com/documentation/reference/cli/logging/azureblob/)
