
powershell - Size of Azure Storage Account inconsistent in CLI and Azure Portal - Stack Overflow


Initially I wanted to find out how big (in GB) a Storage Account in our Azure subscription is. The problem is that I get a different size depending on whether I look in the Azure Portal or use the Azure CLI (by a factor of about 200).

  1. In the Azure Portal, in the Storage Account under Monitoring -> Insights -> Capacity, I see ~100 GB.
  2. With the Azure CLI I get 600 MB (I have tried multiple versions, all with the same result; the most promising one is included below).

This difference is of course quite big and confusing. When I tried to download the Storage Account (with az storage blob download-batch), I stopped the program after it had fetched over 1 GB while still working on the first of about 10 folders. I therefore assume that the 100 GB figure is the correct one (which also fits far better with what I was expecting). Still, the question remains: why doesn't my Azure CLI command work?
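For reference, the download attempt looked roughly like this (a minimal sketch; $container, $account and the destination folder are placeholders, not the actual values):

# Download every blob of one container into a local folder
az storage blob download-batch `
    --source $container `
    --destination ./download `
    --account-name $account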

# List all blobs, extract contentLength, filter for valid numbers, and sum the sizes
$blobSizes = az storage blob list --container-name $container --account-name $account --query "[].properties.contentLength" --output json | ConvertFrom-Json
# Filter to ensure only numeric values are included in the sum
$validSizes = $blobSizes | Where-Object { $_ -match '^\d+$' }
# Calculate total size in bytes
$sizeBytes = ($validSizes | Measure-Object -Sum).Sum
# Convert to GB
$sizeGB = [math]::Round($sizeBytes / 1GB, 2)
Write-Output "Total size of container '$container': $sizeGB GB ($sizeBytes bytes)"

I have also tried some variants with the --include v parameter added, as my first guess was that versions, metadata or snapshots could be the reason, but no matter what, the size stayed the same. (I would also not expect any versioning or snapshots in the data.) One such variant is sketched below.
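For completeness, the listing variant with snapshots and versions included looks roughly like this (a sketch; --include takes letters such as s for snapshots and v for versions):

# Same listing as above, but also counting snapshots (s) and versions (v)
$blobSizes = az storage blob list --container-name $container --account-name $account --include sv --query "[].properties.contentLength" --output json | ConvertFrom-Json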

One other reason I could think of is that the files in the Storage Account are .gz files and therefore compressed. Also, the paths are really long. Maybe this is causing an issue? I have also tested the code on a smaller container, and there the size was consistent with the size of the download.

I am aware that the Portal gives an estimate for the entire Storage Account while the script measures a single container. But as I have only two containers in the account, one of which holds only a few KB, this should not matter in my context. (In the Portal I also do not see any File Shares, Queues or Tables, so they should not be responsible either.)

Does anyone know how this could happen? I am aware that the 100 GB figure is most likely the correct one, but I would still like to be able to use the CLI and trust its results.

  • Ensure you are including all blob versions, snapshots, and all containers in your CLI commands; by default az storage blob list only retrieves current blobs from a single container. – Venkatesan

1 Answer


Azure Storage Account Size is inconsistent in CLI and Azure Portal

The CLI command was only checking a single container and did not include hidden data such as versions or snapshots. I updated the script to loop through all containers and calculate the total size correctly.

After executing it in my environment, the size now matches what the Azure Portal shows under Insights > Capacity, so the size is consistent between the Azure CLI and the Azure Portal.

PowerShell script using Azure CLI commands:

param(
    [string]$accountName,
    [string]$resourceGroup
)

# Retrieve the first access key of the storage account
Write-Output "Fetching storage key for '$accountName' in resource group '$resourceGroup'..."
$storageKey = az storage account keys list `
    --account-name $accountName `
    --resource-group $resourceGroup `
    --query "[0].value" `
    --output tsv

# Enumerate the names of all containers in the account
Write-Output "Listing containers."
$containerList = az storage container list `
    --account-name $accountName `
    --account-key $storageKey `
    --query "[].name" `
    --output tsv

# Sum the contentLength of every blob across all containers
$totalSizeBytes = 0
foreach ($container in $containerList) {
    Write-Output "Scanning container: $container"
    $blobSizes = az storage blob list `
        --container-name $container `
        --account-name $accountName `
        --account-key $storageKey `
        --query "[].properties.contentLength" `
        --output json | ConvertFrom-Json
    # Keep only numeric values in case the CLI output contains nulls
    $validSizes = $blobSizes | Where-Object { $_ -match '^\d+$' }
    $containerSize = ($validSizes | Measure-Object -Sum).Sum
    Write-Output " -> Container size: $([math]::Round($containerSize / 1MB, 2)) MB"
    $totalSizeBytes += $containerSize
}
$totalSizeGB = [math]::Round($totalSizeBytes / 1GB, 2)
Write-Output "`n============================"
Write-Output " Total Size: $totalSizeGB GB"

Output: (screenshot not preserved; the total reported by the script matched the Portal's Capacity value)
