Azure Blob Storage: SAS Token

Photo by Max DeRoin on Pexels.com

A shared access signature (SAS) provides secure delegated access to resources in your storage account without compromising the security of your data. With a SAS, you have granular control over how a client can access your data. You can control what resources the client may access, what permissions they have on those resources, and how long the SAS is valid, among other parameters. - Official Microsoft Docs.

In this blog, we discuss the SAS token for Azure Blob Storage: why this form of token is useful, the different kinds of tokens, and code snippets to generate them. We also briefly cover generating the token via the Azure Portal UI.

Why and when do we generate a SAS token?

In some cases, we need a subject (a client, an agent, or a person) to perform tasks on Azure Blob Storage on our behalf. The task can be reading a blob in a blob container, or listing the blob containers at the root of the storage account. A good example is delegating Azure Batch compute worker nodes to perform read/write operations on a blob storage account.

This is a delegation model. That is, the token is generated from a credential that has the permission to perform the delegated task. The token is also time-bounded: it has a start time and an expiration time. We can generate tokens at the following levels of access:

  1. Account Access Token - The access level is the entire storage account, and we can limit it to (the following information is pulled from pydoc):
    1. Service: Access to service-level APIs (e.g., Get/Set Service Properties, Get Service Stats, List Containers/Queues/Shares)
    2. Container: Access to container-level APIs (e.g., Create/Delete Container, Create/Delete Queue, Create/Delete Share, List Blobs/Files and Directories)
    3. Object: Access to object-level APIs for blobs, queue messages, and files (e.g., Put Blob, Query Entity, Get Messages, Create File, etc.)
  2. Container Access Token - This targets access at the container level. We control which operations are allowed on the container.
  3. Blob Access Token - This targets access at an individual blob.

From the above, we have access tokens for the storage account (coarse-grained), the container, and the blob (fine-grained). On top of these scopes, we define the permissions (read, write, update, etc.) that are allowed. See the azure.storage.blob.AccountSasPermissions class.

How to generate a SAS Token?

Here are the two ways to generate SAS tokens.

1. Blob Storage Client (Python)

import datetime
import os
import sys

import azure.storage.blob as azureblob

config = {
    "STORAGE_ACCOUNT_URL": os.getenv("STORAGE_ACCOUNT_URL"),
    "STORAGE_ACCOUNT_KEY": os.getenv("STORAGE_ACCOUNT_KEY"),
    "STORAGE_CONTAINER_NAME": os.getenv("STORAGE_CONTAINER_NAME"),
}

if __name__ == "__main__":
    if (
        config["STORAGE_ACCOUNT_URL"] is None
        or config["STORAGE_ACCOUNT_KEY"] is None
        or config["STORAGE_CONTAINER_NAME"] is None
    ):
        print("STORAGE_ACCOUNT_URL, STORAGE_ACCOUNT_KEY, STORAGE_CONTAINER_NAME are required")
        sys.exit(1)

    blob_client = azureblob.BlobServiceClient(
        config["STORAGE_ACCOUNT_URL"], credential=config["STORAGE_ACCOUNT_KEY"]
    )

    # generate account based token
    account_sas_token = azureblob.generate_account_sas(
        account_name=blob_client.account_name,
        account_key=blob_client.credential.account_key,
        resource_types=azureblob.ResourceTypes(container=True, service=True, object=True),
        permission=azureblob.AccountSasPermissions(read=True, list=True),
        expiry=datetime.datetime.utcnow() + datetime.timedelta(hours=2),
    )

    # generate container-scoped token (note: ContainerSasPermissions, not AccountSasPermissions)
    container_sas_token = azureblob.generate_container_sas(
        account_name=blob_client.account_name,
        container_name=config["STORAGE_CONTAINER_NAME"],
        account_key=blob_client.credential.account_key,
        permission=azureblob.ContainerSasPermissions(read=True, list=True),
        expiry=datetime.datetime.utcnow() + datetime.timedelta(hours=2),
    )

    # generate blob-scoped token (note: BlobSasPermissions, not AccountSasPermissions)
    blob_sas_token = azureblob.generate_blob_sas(
        account_name=blob_client.account_name,
        account_key=blob_client.credential.account_key,
        container_name=config["STORAGE_CONTAINER_NAME"],
        blob_name="example",
        permission=azureblob.BlobSasPermissions(read=True),
        expiry=datetime.datetime.utcnow() + datetime.timedelta(hours=2),
    )
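Once generated, a SAS token is just a URL query string that gets appended to the resource URL. The sketch below shows the composition; the account URL, container, blob name, and token are all placeholder values standing in for the outputs of the script above:

```python
# Placeholder values; in practice these come from the script above.
account_url = "https://myaccount.blob.core.windows.net"  # hypothetical account
container_name = "mycontainer"
blob_name = "example"
blob_sas_token = "sv=2021-08-06&sig=abc123"  # placeholder token string

# A SAS token is a query string, so it is appended after "?".
blob_url = f"{account_url}/{container_name}/{blob_name}?{blob_sas_token}"
print(blob_url)
```

Any HTTP client can then issue a plain GET against blob_url without further authentication headers, because the signature travels in the query string.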

2. Azure Portal

2.1 Account based token

Account Access Token

2.2 Container based

Container Access Token

Please note that after the token is generated, the URI provided in the UI is incomplete. To list the container contents, you need to add two extra query parameters to it:

    ?restype=container&comp=list
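Since the portal-provided URI already carries the SAS token as a query string, the extra parameters should be joined with & rather than a second ?. A minimal sketch, using a placeholder URI in place of a real portal output:

```python
# Placeholder for the SAS URL the portal returns (token values are fake).
portal_uri = "https://myaccount.blob.core.windows.net/mycontainer?sv=2021-08-06&sig=abc123"

# "?" is already present in the URI, so join the listing parameters with "&".
separator = "&" if "?" in portal_uri else "?"
list_uri = portal_uri + separator + "restype=container&comp=list"
print(list_uri)
```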

2.3 Blob Access Token

Blob Access Token


How to use the token?

There is a list of best practices that we should be aware of. Here are some of the commonly used HTTPS GET URL formats.

1. List blob containers under the root

GET https://<account-name>.blob.core.windows.net/?comp=list&<account-access-token>

2. List blobs under container

GET https://<account-name>.blob.core.windows.net/<container-name>?restype=container&comp=list&<container-access-token> (account-access-token may work too)

3. Get blob content under container

GET https://<account-name>.blob.core.windows.net/<container-name>/<blob-name>?<blob-access-token> (container-access-token or account-access-token may work too)
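The three URL patterns above can be sketched in Python as simple string composition. The account, container, blob names, and the token strings below are placeholders, not real credentials:

```python
# Placeholder values; real tokens come from the generation script earlier.
account = "myaccount"
container = "mycontainer"
blob = "example"
account_token = "sv=2021-08-06&sig=acc"    # placeholder account SAS
container_token = "sv=2021-08-06&sig=cont" # placeholder container SAS
blob_token = "sv=2021-08-06&sig=blob"      # placeholder blob SAS

base = f"https://{account}.blob.core.windows.net"

# 1. List blob containers under the root (account-scoped token)
list_containers_url = f"{base}/?comp=list&{account_token}"

# 2. List blobs under a container (container-scoped token)
list_blobs_url = f"{base}/{container}?restype=container&comp=list&{container_token}"

# 3. Get blob content under a container (blob-scoped token)
get_blob_url = f"{base}/{container}/{blob}?{blob_token}"
```

Each URL can be fetched with an ordinary HTTPS GET; no Authorization header is needed because the signature is in the query string.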

Conclusion

We have briefly explained the usage of SAS tokens and illustrated how to generate them. We also highlighted the incomplete URL in the Azure Portal UI. Finally, we provided a list of commonly used HTTPS GET URL formats.
